What Is A/B Testing (And Other Popular Questions You Will Never Again Have to Ask)

On this site, I tend to focus on making your branding consistent and visually pleasant. This is important for the overall brand image and crucial if you want anyone to take your company seriously.

But that’s hard to measure.

It’s hard to know if that press release you sent out ended up inspiring an article on Forbes because of the font you picked. Don’t laugh – there seems to be evidence that some fonts are more believable than others.

Since the impact of pretty graphic design is hard to measure, it can be a hard sell to co-founders, investors, and employers.

But fortunately, most design can be and is measured, and quite effectively so. On the web, we can see exactly how many people clicked through each of the banner ad variations to check out the offer. We can easily find out which of the two headlines performs better on a landing page, which subject line enticed more opens, or even which colour works best on buttons!

The simplest way of testing different designs is with click-through display ads like the ones on Facebook. There we can simply upload two different banners and compare them by how much traction each generated.

Using some advanced tools, we can mix-and-match all the elements of a landing page until we have the combination that makes the most money possible (or whatever your endgame is).

Stop having debates with your cofounders about who likes what. Test every design decision. Forget your pride, and let the numbers decide.

I think I should start a career in rap.

Intro to A/B testing

A/B testing software splits your traffic into two groups and shows a different design to each. By tracking how many visitors complete the primary action (for example, click the buy button or complete the purchase), we’re able to determine which of the designs is better.
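
Under the hood, the mechanics are simpler than they sound. Here’s a rough sketch in TypeScript of how a tool might bucket visitors and count conversions; the variant names, hash helper, and tracking functions are made up for illustration, not any particular tool’s API.

```typescript
// Minimal sketch of an A/B split: each visitor is deterministically
// bucketed into variant "A" or "B" based on their visitor ID, so they
// see the same design on every visit.

type Variant = "A" | "B";

// Simple string hash (illustrative only, not cryptographic).
function hashCode(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0;
  }
  return h;
}

// 50/50 split: even hashes see A, odd hashes see B.
function assignVariant(visitorId: string): Variant {
  return hashCode(visitorId) % 2 === 0 ? "A" : "B";
}

// Per-variant tallies of visitors and completed primary actions
// (e.g. clicks on the buy button).
const results: Record<Variant, { visitors: number; conversions: number }> = {
  A: { visitors: 0, conversions: 0 },
  B: { visitors: 0, conversions: 0 },
};

function trackVisit(visitorId: string): Variant {
  const variant = assignVariant(visitorId);
  results[variant].visitors++;
  return variant;
}

function trackConversion(variant: Variant): void {
  results[variant].conversions++;
}
```

The important detail is that the assignment is sticky: the same visitor always lands in the same bucket, so the two conversion counts stay comparable.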

We can test different headlines, button colours, layouts, fonts, and even pricing. We’re able to compare two elements on a page (serving one or the other as the page loads) or two completely different pages (redirecting visitors to one or another).

Testing results are oftentimes unintuitive and unexpected. For example, a test on Performable’s website, carried out by HubSpot, revealed that a red button on an otherwise green website performed 21% better than its control, a green button.
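
How do you know a lift like that 21% isn’t just random noise? Your testing tool will normally answer that for you, but as a rough sketch (with made-up visitor and conversion counts), a two-proportion z-test is one common way to check:

```typescript
// Rough sketch of a two-proportion z-test on hypothetical numbers.
// Real testing tools compute this (or something more sophisticated) for you.

function zTest(
  convA: number, visitorsA: number,
  convB: number, visitorsB: number
): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se; // |z| > 1.96 roughly means significant at the 95% level
}

// Hypothetical example: green control vs. red variation, each seen by 10,000 visitors.
const z = zTest(200, 10_000, 242, 10_000);
console.log(z.toFixed(2)); // ≈ 2.02, so the difference is unlikely to be pure chance
```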

A/B testing is a great way to establish which of the two versions is better. It’s the backbone of data-driven design. But what if you wanted to compare more than two different headlines?

Multivariate Testing

To figure out which combination of all the elements on a web page produces the best possible results, we use a more advanced testing method called multivariate testing. It mixes and matches variations of several elements at once and tells you which combination performs best.

For example, say you have a landing page for one of your products. You want to test your headline, but also the pricing.

You could carry out two separate A/B tests, one for each of those elements. But the winners of those tests may not work well together, because separate tests can’t capture how the two elements interact, so the combined result could actually be inaccurate.

Multivariate testing has a couple of downsides, too. For one, it requires a lot more traffic than A/B testing. The Optimizely blog explains why: “Because of the fully factorial nature of these tests, the number of variations in a test can add up quickly. The result of a many-variation test is that the allocated traffic to each variation is lower. In A/B testing, traffic for an experiment is split in half, with 50 percent of traffic visiting each variation. In a multivariate test, traffic will be split into quarters, sixths, eighths, or even smaller segments, with variations receiving a much smaller portion of traffic than in a simple A/B test.”
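
To make that arithmetic concrete, here’s a small sketch with hypothetical headline, price, and button-colour variations, showing how quickly a full-factorial test fragments your traffic:

```typescript
// Sketch of a full-factorial multivariate test: every combination of
// every element becomes its own variation, and traffic is split evenly
// across all of them. The element values below are hypothetical.

const headlines = ["Design, but simple", "Simple design for founders"];
const prices = ["$29", "$39", "$49"];
const buttonColours = ["green", "red"];

// Build every combination: 2 × 3 × 2 = 12 variations.
const variations = headlines.flatMap((headline) =>
  prices.flatMap((price) =>
    buttonColours.map((colour) => ({ headline, price, colour }))
  )
);

console.log(variations.length); // 12
console.log((100 / variations.length).toFixed(1) + "% of traffic each"); // "8.3% of traffic each"
// Compare with a simple A/B test, where each variation gets 50%.
```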

Another challenge arises when one of the tested elements turns out to have little impact on the end result; in that case, a simple A/B test of the elements that do matter would have been far more informative.

Segmentation

Sometimes, your data won’t make sense. There will be inconclusive tests and tied variations. A variation that won decisively will suddenly stop performing so well.

The problem is that people come to your website for a number of reasons. It can be to purchase your product, but the reason might also be to see if there are any job vacancies at your firm; to evaluate your website’s design; to read your blog; to check the pricing, etc. These are very different levels of engagement in your sales funnel, and they simply can’t be treated equally.

You’ll want to pay more attention to the results coming from your paying users than from those just kicking the tires. And this is where segmentation comes in.

Your overall conversion rate might be 2%. This piece of data alone doesn’t tell us much, even on a very focused landing page. But when we start analysing different traffic subcategories, we’ll find out that paid search produces a much higher conversion rate (9%) than organic traffic (1%). This can help us focus on the important results and bring our costs down.
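
In code terms, segmentation is nothing more than grouping your visits before computing the rate. Here’s a minimal sketch, assuming hypothetical visit records tagged with a traffic source:

```typescript
// Sketch: computing the conversion rate per traffic source instead of
// one overall number. The visit records and source names are hypothetical.

interface Visit {
  source: "paid-search" | "organic" | "email";
  converted: boolean;
}

function conversionRateBySource(visits: Visit[]): Record<string, string> {
  const totals: Record<string, { visits: number; conversions: number }> = {};
  for (const v of visits) {
    totals[v.source] ??= { visits: 0, conversions: 0 };
    totals[v.source].visits++;
    if (v.converted) totals[v.source].conversions++;
  }
  const rates: Record<string, string> = {};
  for (const [source, t] of Object.entries(totals)) {
    rates[source] = ((t.conversions / t.visits) * 100).toFixed(1) + "%";
  }
  return rates;
}

// e.g. { "paid-search": "9.0%", "organic": "1.0%", "email": "4.2%" }
```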

There are a few different ways to segment your traffic. One is by source, as in the example above. Another is by behaviour, which can be much more useful. You can segment the people who visited more than a single page on your website (they’re invested in your content and want to see more) to analyse their behaviour and find the leaks in your funnel. Similarly, you could segment people who searched your site (which shows a certain level of purchase intent) to see if they end up converting. Another useful segmentation method is by outcome: for example, people who are return buyers but never bought your latest product.

The takeaway is: when evaluating conversion data, always look at the big picture. Your conversion rate may be low, but this could be because the traffic to your site is highly unqualified and doesn’t come from your target audience at all.

Don’t jump to conclusions before digging into the data!

Qualitative Data

Online marketing has its challenges, and one of them is the lack of real-life feedback from visitors.

When you run a brick-and-mortar business, you get feedback in real time. People walk out of your store when you don’t have a certain product in stock. Or they constantly need help locating the aisle with yoghurts, which would point to UX issues.

You don’t have that online. People visit the site and bounce after one second without an explanation. They fill out the entire order form only to close the site at the last step.

You have no idea why that happened.

Talking to people (or, in fancy terms, acquiring qualitative data) is an indispensable tool in your marketing arsenal. Questions such as “How did you feel after completing the order?” and “Did you find all the features of our app?” simply cannot be answered with testing alone – even though the answers can have the largest impact on your bottom line.

You need to get out there and start asking questions.

Well, you can stay indoors, anyway. Here are a couple of ideas to start collecting feedback:

Record Behaviour on Your Site

Install a website-recording app like Inspectlet on your site, and watch users navigate your website, unaware that they’re being recorded. I personally used this method on this very website and discovered a few important issues.

This is actual footage of someone using this site, captured by Inspectlet.

Quick Usability Testing

Submit your website to one of those 5-minute video user-testing sites. For this method, it’s crucial to segment the testers meticulously and put real effort into reaching your actual target audience.

Don’t expect to get very specific feedback from a general audience, but this method can work great for discovering the most glaring conversion problems.

You can also do a couple of actual usability testing sessions.

On-Site Survey

Add an exit-intent popup with a survey. To actually entice people to take the survey, make it super-short (one question) and offer a reward (like a free ebook).
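
If you want to see how little is involved, a bare-bones exit-intent trigger is just a listener for the cursor leaving the top of the viewport. Here’s a minimal sketch (the showSurveyPopup function is a hypothetical stand-in for whatever popup or survey widget you actually use):

```typescript
// Bare-bones exit-intent detection: when the cursor leaves the viewport
// through the top edge (usually heading for the close button or the
// address bar), show the survey popup once per page view.

let surveyShown = false;

function showSurveyPopup(): void {
  // Hypothetical stand-in: render your one-question survey here.
  console.log("Quick question before you go…");
}

document.addEventListener("mouseout", (event: MouseEvent) => {
  const leavingThroughTop =
    event.clientY <= 0 && event.relatedTarget === null;
  if (leavingThroughTop && !surveyShown) {
    surveyShown = true;
    showSurveyPopup();
  }
});
```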

Finding a chat box or a survey in the bottom-right corner of a website is almost expected these days. Direct written feedback from your visitors can help tremendously in uncovering issues.

Thank You Page Survey

Survey people on your thank-you page to find out why your customers bought from you. This method obviously only covers the segment of users who are buyers, so it may not be as helpful.

Email Survey

Send out a survey email to your list of customers. You can simply add a single yes/no question and measure the clicks. The newsletter All The Small Things cares a lot about their subscribers’ feedback.

Qualitative data can be a great roadmap to the things you could run A/B tests on. I encourage you to give it a shot.

In the science of increasing conversions, the most important rule is that there are no rules. Every assumption, every rule of thumb and best practice must be tested in your own environment, with your own customers. What works for someone else may not work for you.
