Whether you sell a service or a product, what you really do is create value for your customers — and every interaction with your business provides measurable data. But the smartest companies don’t just passively store and analyze that data; they make it actionable by running experiments.

The secret to getting value from data is testing, and if you’re looking to grow your business online, implementing well-executed, consistent A/B testing is a necessity. Here are some helpful tips that can set you on your way to creating growth through experimentation.

Teamwork by Monkik

1. Keep the team small. All you need to perform tests is a 3-4 person team made up of an engineer, a designer/front-end developer, and a business analyst (or product owner). Make sure your designer can think iteratively and conduct tests, and that your product person has the skills to analyze those tests as they happen, to avoid going to another department for results.

2. Metrics: choose one, view many. The metric you tried to move probably won’t tell the whole story. Plenty of very smart people have been puzzled by A/B test results. You’ll need to look at many different metrics to figure out what actually changed, and your first explanation will often be wrong. When a test fails, don’t give up. Instead, learn what happened (figure out, for example, which metric did move), and use that to inform future iterations.
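One lightweight way to "view many" is to run the same significance check across every metric you track, not just the primary one. Below is a minimal sketch with made-up conversion counts (the metric names and numbers are purely illustrative) using a standard two-proportion z-test built from the Python standard library:

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (absolute lift, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Hypothetical results: (conversions_A, visitors_A, conversions_B, visitors_B)
metrics = {
    "signup":         (500, 10_000, 510, 10_000),  # primary metric: barely moved
    "checkout_start": (300, 10_000, 360, 10_000),  # secondary metric: did move
}
for name, (ca, na, cb, nb) in metrics.items():
    lift, p = z_test(ca, na, cb, nb)
    print(f"{name}: lift={lift:+.4f}, p={p:.3f}")
```

In a scan like this, a "failed" test on the primary metric can still reveal a clear movement on a secondary one, which is exactly the learning you feed into the next iteration.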

3. Dig in with de-averaging. Often a test doesn’t perform better on average, but does for particular customer segments, such as new vs. existing customers. The test may also be performing better for a particular geo, language, or even user persona. You won’t find these insights without looking beyond averages by digging into different segments.
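De-averaging just means recomputing the conversion rate per (segment, variant) pair instead of per variant alone. A toy sketch, assuming per-user rows of the form (segment, variant, converted) with entirely made-up data:

```python
from collections import defaultdict

# Hypothetical per-user results: (segment, variant, converted 0/1)
results = [
    ("new", "A", 0), ("new", "A", 1), ("new", "B", 1), ("new", "B", 1),
    ("existing", "A", 1), ("existing", "A", 1),
    ("existing", "B", 0), ("existing", "B", 1),
]

def conversion_by_segment(rows):
    """Map each (segment, variant) pair to its conversion rate."""
    totals = defaultdict(lambda: [0, 0])  # (segment, variant) -> [conversions, visitors]
    for segment, variant, converted in rows:
        totals[(segment, variant)][0] += converted
        totals[(segment, variant)][1] += 1
    return {key: conv / n for key, (conv, n) in totals.items()}

rates = conversion_by_segment(results)
```

Here the overall rates for A and B are identical, yet B wins for new users and loses for existing ones, which is precisely the kind of insight a blended average hides.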

Laboratory by whanwhan.ai

4. Test small changes. Don’t spend months building a test just to throw it away when it doesn’t work. If you have to spend a long time creating it, then you’re not working smartly. Find the smallest amount of development you can do to create a test based on your hypothesis; one variable at a time is best.

5. Re-test ideas. Tests can often fail because of seasonality, or because you missed one tiny nuance. That doesn’t mean you should discard them. Keep a backlog of previously run tests, and try re-running a few later. You might be surprised what you find.

6. Don’t forget performance. Performance testing should be considered part of your design and optimization. Even a small increase in page-load time can sink what would otherwise be a winning design. Page weight and load time should be measured alongside your other tests.
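A simple guardrail is to compare median load times between control and variant and flag any regression beyond a budget. A minimal sketch with invented timings and an assumed 100 ms budget:

```python
from statistics import median

def load_time_regression(control_ms, variant_ms, budget_ms=100):
    """Return (median delta in ms, True if the variant exceeds the budget)."""
    delta = median(variant_ms) - median(control_ms)
    return delta, delta > budget_ms

# Hypothetical page-load samples in milliseconds
control = [820, 790, 810, 805, 830]
variant = [960, 990, 940, 1010, 975]

delta, regressed = load_time_regression(control, variant)
print(f"median delta: {delta:.0f} ms, regressed: {regressed}")
```

Running a check like this alongside the conversion analysis ensures a visually "winning" variant doesn't quietly lose on speed.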

Top image: Infographic Elements by Filip Roberts

A longer version of this article originally appeared on the Harvard Business Review blog.