A/B Testing is the process of comparing two variations of the same thing to see which yields better results. A/B testing is often used in marketing to determine which message, offer, or other element is most effective at improving response rates. On the Web, A/B testing is used in WEBSITE OPTIMIZATION to determine which variations of a page element improve conversion rates the most.
A/B testing is commonly used by online companies to improve the performance of their websites and marketing campaigns, because it is relatively easy to create and run tests by updating a site's code or design. This makes A/B testing much easier on the Web than testing things like billboards or magazine ads.
A/B testing isn’t just for marketing, however: product teams can A/B test different product variations, customer service teams can A/B test various responses, and so on.
Marketers use A/B testing on their websites to improve conversion rates. For example, you may want to determine which call to action on your landing page results in more people clicking through to the next page or step in your funnel. To find out, you can set up an A/B test with two different variations of the button. In this hypothetical example, the “A” variant of the button might say “Learn More” while the “B” variant might say “See How”.
In an A/B test, both variants of the button run at the same time, each shown to a set percentage of website visitors. In the simplest case, 50% of visitors see variant “A” and 50% see variant “B”. Using A/B testing software like Optimizely, you can watch over time to see which button variant gets the most clicks.
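A minimal sketch of how a 50/50 split can work under the hood (tools like Optimizely handle this for you; the hashing approach and function names here are illustrative, not any particular tool's API). Hashing the visitor ID keeps the split roughly even while guaranteeing that a returning visitor always sees the same variant:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.

    The hash spreads visitors ~50/50, and the same visitor_id
    always maps to the same variant across visits.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A given visitor gets a stable assignment:
print(assign_variant("visitor-123"))  # same result on every call
```

Deterministic assignment matters because showing a returning visitor a different button on each visit would muddy the test results.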
Suppose that at the end of your test, you find that one button gets people to click more frequently than the other. In this example, button variant B performs twice as well as variant A. The better-performing variant is called the “winning variant” or “winner”.
Assuming that the conversion rate holds, by using “See How” for every visitor after the test concludes, you have successfully doubled your conversion rate. After this test concludes, you may run another test to see if you can improve upon the “See How” button variation, or test some other element of the page. This is the essence of CONVERSION RATE OPTIMIZATION.
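Before declaring a winner, it's worth checking that the difference between variants isn't just noise. A sketch of that check using illustrative visitor and click counts (this is a standard two-proportion z-test, which A/B testing tools typically run for you behind the scenes):

```python
from statistics import NormalDist

def two_proportion_z_test(clicks_a: int, n_a: int,
                          clicks_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative counts where B converts at twice A's rate (2% vs 4%):
p_value = two_proportion_z_test(clicks_a=100, n_a=5000,
                                clicks_b=200, n_b=5000)
print(p_value)  # far below 0.05, so the difference is unlikely to be chance
```

A small p-value (commonly below 0.05) suggests the winner is real rather than a fluke; with very few visitors, even a 2x difference can be statistical noise.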
In reality, most conversion rate optimization efforts don’t A/B test entire websites at once. Doing so would make it very difficult to identify which changes were responsible for which results, and such tests would be time-consuming and expensive.
Instead, most A/B testing is done as part of an ongoing WEBSITE OPTIMIZATION PROCESS, where marketers try to improve conversion rates by focusing on improving one or more elements on a page at a time.
In the conversion optimization process, A/B testing is just one step in the overall optimization effort.
The first step in the process is coming up with the hypotheses that you will A/B test on your site. Every business has different needs and challenges, so what to test first will differ from business to business. Begin by reviewing your website’s current performance against your business objectives and looking for areas of opportunity for improvement.
Pages with exceptionally high bounce and exit rates, large drop-off points in your checkout funnel, landing pages that serve as destinations for paid advertising campaigns, and the most popular pages on your site are all great starting points for investigation.
Headlines and Value Propositions – Website copy is one of the easiest things to test and potentially the most impactful. Most website visitors read only the headline of a page before deciding whether to stay or leave. If you have landing pages with short visit durations and high bounce rates, A/B testing your headline can be a way to improve the page’s conversion rate.
Top Landing Pages – Thanks to search, most websites have many different entry points, as visitors arrive through pages they find in Google, not just the homepage. By reviewing the top landing pages in your analytics reporting, you can evaluate how your most popular landing pages perform. Are some pages significantly better or worse than others? Which get the most traffic? By A/B testing the copy, offer, and calls to action on your top landing pages, you can improve the conversion of visitors to the next step in your funnel.
Lead Pages – Does your business rely on collecting leads or email addresses? If it does, you can A/B test your lead conversion pages and email collection forms. Tests for these pages can cover everything from the headline, to the call to action, to the number of form fields or the types of data you’re asking for. You may find that an A/B test of your form results in big changes in new leads.
Checkout Pages – If you’re selling something online, shopping cart abandonment is the bane of your existence. A/B testing can help improve your shopping cart conversion rate. Test variations of your checkout pages to find those with the highest conversion rate. You may find that simple changes, such as adding a customer service number or a return guarantee, improve conversion rates. Alternatively, you may A/B test entirely different checkout page designs to see which converts better.
Advertising Campaign Landing Pages – If you’re spending money to drive traffic to your website through ad campaigns such as Google AdWords and Facebook ads, improving the performance of those campaigns’ landing pages is an important way to improve your return on investment. If you improve the conversion rate of your landing page by 25%, the same ad spend yields 25% more customers, which cuts your customer acquisition cost by 20% and increases your return on investment.
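The arithmetic behind that claim, as a quick check (spend, traffic, and conversion figures below are hypothetical):

```python
ad_spend = 10_000        # monthly campaign spend (hypothetical)
visitors = 5_000         # visitors the spend drives to the landing page
conversion_rate = 0.02   # baseline: 2% of visitors become customers

customers = visitors * conversion_rate          # 100 customers
cac = ad_spend / customers                      # $100 per customer

# Lift the landing page conversion rate by 25% (2.0% -> 2.5%):
improved_customers = visitors * conversion_rate * 1.25   # 125 customers
improved_cac = ad_spend / improved_customers             # $80 per customer

print(cac, improved_cac)  # 100.0 80.0 -> acquisition cost drops 20%
```

Note the asymmetry: a 25% conversion lift reduces acquisition cost by 20%, not 25%, because the same spend is divided over 1.25x as many customers.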
The harsh reality of A/B testing is that most tests either fail to improve on the original variant or lose outright. It can be very hard to find a winning test, which makes running repeated tests challenging, and it’s easy to get demoralized. While failed tests are part of any A/B testing process, there are a few ways to improve your odds of success.
Don’t guess at which A/B tests will improve the performance of your website. Rather, look at your analytics data and combine it with qualitative insights from Qualaroo WEBSITE SURVEYS to find the real issues getting in the way of conversions.
For example, you might think that your conversion rate will improve by changing the color of the call-to-action button, when in reality your website visitors might be put off by the lack of a visible return policy. Wasting time on tests that are just guesses is a surefire way to have more losing tests than winners. Instead, analyze user behavior and feedback to determine what to test.
Not all A/B tests are created equal. If you run an A/B test on a page that gets only a small fraction of your website traffic, the overall impact on your business will be minimal no matter how well the test performs. Finding a winning test on your highest-value pages, however, can result in meaningful gains.
Evaluate each of your potential tests by three criteria: Impact, Confidence, and Ease. Impact: how likely is the test to have a meaningful impact on your business? Confidence: based on the data, how confident is your team that this test will be effective? Ease: how easy is the hypothesis to test? Is it fast, or will it take a lot of resources to execute?
When you rank all of your test ideas by these three criteria, those that rise to the top have the greatest impact, the highest likelihood of success, and the lowest cost to run. Start with the ideas that score highest, especially those that score well in all three areas.
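One lightweight way to do this ranking is to score each idea on the three criteria and sort by the total. The ideas and scores below are hypothetical; many teams call this ICE scoring:

```python
# Hypothetical test ideas, each scored 1-10 on the three criteria.
ideas = [
    {"name": "Rewrite landing page headline", "impact": 8, "confidence": 7, "ease": 9},
    {"name": "Redesign checkout flow",        "impact": 9, "confidence": 5, "ease": 2},
    {"name": "Change button color",           "impact": 2, "confidence": 3, "ease": 10},
]

def ice_score(idea: dict) -> int:
    """Total Impact + Confidence + Ease; an average works equally well."""
    return idea["impact"] + idea["confidence"] + idea["ease"]

# Highest-scoring ideas first: these are the tests to run next.
for idea in sorted(ideas, key=ice_score, reverse=True):
    print(f'{idea["name"]}: {ice_score(idea)}')
```

The exact scale matters less than scoring every idea the same way, so competing test ideas can be compared on equal footing.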
Don’t test blindly. Capture the learning from previous tests by documenting which assumptions and tests worked and which didn’t. By making every A/B test a learning experience (even the failed ones), you improve your odds in subsequent tests by not repeating the same mistakes or flawed assumptions.
To run effective A/B tests, you need a few tools: to determine what to test, to run your tests, and to track the results. While there are dozens of A/B testing tools out there, we’ve found these to be the most effective.
Determine What A/B Tests to Run – Qualitative and quantitative data collection and analysis tools help you understand where opportunities and challenges lie on your site. Google Analytics, KISSmetrics, Qualaroo, CrazyEgg and UserTesting.com are all great tools to get this insight.
Running A/B Tests – Optimizely and Visual Website Optimizer are the best tools for running A/B tests quickly and effectively without needing much, if any, help from your development teams. Once the Optimizely snippet is on your site, marketers can make changes and set up experiments by themselves.
Track A/B Tests – No single tool does this exceptionally well. Most people use a combination of Microsoft Excel, Google Spreadsheets, PowerPoint, or internal company wikis and intranets to keep track of historical A/B test results.
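If a spreadsheet feels too manual, even a plain CSV log kept alongside your other reporting works. A sketch of one, with hypothetical field names and an entry for the button test from earlier in this article:

```python
import csv
from pathlib import Path

LOG = Path("ab_test_log.csv")
FIELDS = ["test_name", "hypothesis", "variant_a", "variant_b", "winner", "lift"]

def log_result(row: dict) -> None:
    """Append one finished A/B test to the shared results log."""
    is_new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new_file:
            writer.writeheader()  # write the header once, on first use
        writer.writerow(row)

log_result({
    "test_name": "landing-cta-01",
    "hypothesis": "'See How' outperforms 'Learn More'",
    "variant_a": "Learn More",
    "variant_b": "See How",
    "winner": "B",
    "lift": "2x",
})
```

The point is less the format than the habit: every test, winner or loser, gets a durable record the whole team can search before proposing the next hypothesis.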