Athletes in every sport monitor and capture data to help them win. They use cameras, sensors, and wearables to optimize their caloric intake, training regimens, and athletic performance, using data and exploratory thinking to refine every advantage possible. It may not be an Olympic event (yet!), but A/B testing can be dominated the same way.
I talked to a website owner recently who loves the “always be testing” philosophy. He explained that he instructs his teams to always test something—the message, the design, the layout, the offer, the CTA.
I asked, “But how do they know what to pick?” He thought about it and responded, “They don’t.”
Relying on intuition, however experienced your team may be, will only get you so far. To “always test something” can be a great philosophy, but testing for the sake of testing is often a massive waste of resources, as is A/B testing without significant thought and preparation.
Where standard A/B testing can answer questions like “Which version converts better?”, A/B testing combined with advanced analyses gives you something more important: a framework for answering questions like “Why did the winning version convert better?”
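The “which version converts better?” question is, at bottom, a statistical one. As a rough illustration only (not the algorithm of any particular testing tool, and with made-up visitor and conversion counts), a two-proportion z-test can be sketched in a few lines of Python:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B convert better than A?

    conv_* are conversion counts, n_* are visitor counts (illustrative
    numbers, not real data). Returns (z, p), where p is the one-sided
    p-value for "B converts better than A".
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF
    p = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return z, p

# Hypothetical example: 120/2400 conversions (5.0%) vs. 156/2400 (6.5%)
z, p = ab_test_z(120, 2400, 156, 2400)
```

A low p-value tells you the winner is probably not a fluke; it still tells you nothing about *why* it won, which is where the behavioral data below comes in.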
Changing athletes, or a waste of resources?
Typical A/B testing is based on algorithms that are powered by data during the test, but we started trying a different model on our projects here at Clicktale, putting heavy emphasis on data before, during, and after the test. The results have been more interesting and strategic, not just tactical.
Let’s imagine that Wheaties.org wants to reduce bounce rate and increase Buy Now clicks. Time for an A/B test, right?
The site’s UX lead gets an idea to split test their current site, comparing versions with current athletes to versions featuring former Olympians.
But what if your team monitored in-page visitor behavior and saw that an overwhelming majority of site visitors never scroll below the fold to notice the athletes featured there?
Now the idea of testing the different athlete variants sounds like a waste of time and resources, right?
But something happens when you take a different vantage point. What if your team watched session replays and noticed that those who do visit the athlete profiles tend to stay on the site longer and click “Buy Now” at a dramatically higher rate? That may be a subset of site visitors, but it’s a subset that’s behaving exactly how you want.
If the desired outcome is to leverage the great experiences built into the pages, perhaps it would be wise to bring the athlete profiles higher. Or to A/B test elements that should encourage users to scroll down.
In our experience, both with A/B testing our own web properties and in aggregating the data of the 100 billion in-screen behaviors we’ve tracked, we know this to be true: testing should be powerful, focused, and actionable. In making business decisions, it helps when you’re able to see visual and conclusive evidence.
Imagine a marathon runner who doesn’t pay attention to other competitors once the race begins. Now, think about one who paces herself, watches the other racers, and modifies her cadence accordingly.
By doing something similar, your team can be agile in making changes and fixing bugs. Each time your team makes an adjustment, you can start another A/B test … which lets you improve the customer experience faster than if you had to wait days for the first A/B test to be completed.
The race is on
Once an A/B test is underway, the machines use data-based algorithms to determine a winner. Based on traffic, conversion rate, number of variations, and the minimum improvement you want to detect, the finish line may be days or weeks away. What is an ambitious A/B tester to do?
Watch session replays of each variation as soon as you’ve received a representative number of visitors. Use them to validate funnels and quickly spot any customer experience issues that may cause your funnels to leak.
Focus on the experience. Understanding which user behavior dominates each page is powerful; internalizing why users are behaving as they are enables you to take corrective action mid-course and position yourself properly.
The next test
In your post-test assessments, again use data to understand why the winning variation succeeded with its target audience. Understanding the reason can help you prioritize future elements to test.
For example, when testing a control with a promotional banner (intended to increase conversions) against a variation without the promotion, a PM may conclude that the promotion is ineffective when that variation loses.
Studying a heatmap of the test can reveal new insights. In this example, conversions dropped because the banner pushed the “Buy Now” CTA out of sight.
In this case, as a next test, the PM may decide not to remove the banner, but rather to test it in a way that keeps the more important “Buy Now” CTA visible. There is a good chance such a combination will yield even better results.
There are plenty of other examples of this, too. For instance, the web insights manager at a credit card company told me that having the aggregate data, in the form of heatmaps, helps him continually make more informed decisions about his A/B testing. In their case, they were able to rely on data that indicated they could remove a content panel without hurting their KPIs.
Another one of our customers, GoDaddy, was able to increase conversions on its checkout page by four percent after running an A/B test. “With our volume, that was a huge, huge increase…. We also tripled our donations to Round Up for Charity,” said Ana Grace, GoDaddy’s director of ecommerce, global product management. But the optimization doesn’t stop once a test is complete; GoDaddy continues to monitor new pages after changes, and sometimes finds additional hypotheses that require testing.
What it takes to go for the gold
I was not blessed with the natural athletic ability of an Olympian, but when it comes to A/B testing web assets and mobile apps, I have what I need to determine which version will be the winner. The powerful combination of behavioral analytics and big data gives athletes the knowledge they need to make the most of their assets, and it can do the same for you.