We live in the age of data-driven marketing. Gone are the days when marketing meant exclusively throwing money at things and hoping for a favorable outcome. The modern marketer takes a scientific approach and relies on data. And the best way to remove uncertainty and guesswork from marketing or design decisions for websites, ads, anything online really, is A/B testing.
Embrace the experimentation mindset.
By basing your strategy on data and tests, you'll be agile, but most importantly you'll have cold, hard feedback on what works and what doesn't. You'll make sounder business decisions and invest time and money in what your visitors actually want.
75% of websites with more than 1 million monthly visitors already do A/B Testing. Maybe it’s time for you too to remove risks from your marketing decisions.
But A/B testing takes work to get into, and more work to practice. You'll have to create a process, put a framework in place, learn a bit about statistics, set up and learn a new tool, make sure you're actually getting accurate results, and more. But with great potential comes great (time) investment, or so Uncle Ben didn't say.
But fear not: you can get everything you need here, with the best content on the topic from the best blogs and experts.
Definition: A/B testing is an online experiment conducted on a website, mobile application or ads (among other things) to test potential improvements against a control (or original) version. Put simply, it allows you to see which version works better for your audience, based on statistical analysis.
A/B testing is also known as split testing, which can either mean exactly the same thing as A/B testing OR mean split URL testing. In a classic A/B test, the 2 variations are on the same URL, whereas in split URL testing, your changed variation is on a different URL (your visitor doesn't see the difference, of course).
Sometimes, you want to test several changes on a page, let’s say the banner, header, description and video. To test all of these at the same time, you would do what is called multivariate testing (or MVT).
You would have variations generated to test all the different combinations of these changes to determine the best one.
The big downside of multivariate testing is that it requires a gigantic amount of traffic. If you want more details, check HubSpot's article on the differences between MVT and A/B tests.
Example with a banner and a picture
(4 variations, control version not pictured)
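To see why the number of combinations balloons, here's a minimal sketch (with hypothetical element names) that enumerates every combination of two elements under test:

```python
from itertools import product

# Hypothetical variants: two banners and two pictures under test
banners = ["banner_control", "banner_new"]
pictures = ["picture_control", "picture_new"]

# A multivariate test must serve every combination of the changed elements
variations = list(product(banners, pictures))
print(len(variations))  # 2 x 2 = 4 combinations
```

Add a third element with two options and you're already at 8 combinations, each needing enough traffic to reach a reliable result.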
Multi-armed bandit testing is when an algorithm automatically and gradually redirects your audience toward the winning variation.
To learn more about multi-armed bandit testing, the best article by far is by Alex Birkett from ConversionXL.
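As a rough illustration (not any vendor's actual algorithm), an epsilon-greedy bandit mostly sends traffic to the current best performer while keeping a small share aside for exploration:

```python
import random

def epsilon_greedy(variants, epsilon=0.1):
    """Explore a random variant with probability epsilon,
    otherwise exploit the best observed conversion rate."""
    if random.random() < epsilon:
        return random.choice(list(variants))
    return max(
        variants,
        key=lambda v: variants[v]["conversions"] / max(variants[v]["visitors"], 1),
    )

# Hypothetical running tallies for two variants
stats = {
    "A": {"visitors": 1000, "conversions": 30},  # 3.0% conversion rate
    "B": {"visitors": 1000, "conversions": 45},  # 4.5% conversion rate
}
print(epsilon_greedy(stats, epsilon=0.0))  # pure exploitation: prints "B"
```

Real bandit engines use more sophisticated strategies (Thompson sampling, UCB), but the principle is the same: the winning variation gradually absorbs more of the traffic.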
A/B/n testing is when you test more than 2 variations of an element or page. You could test 6 versions of a page and run an A/B/C/D/E/F test.
Why should YOU do A/B testing? Or a better question: are you satisfied with the way you're exploiting your hard-earned traffic? Once you have traffic, increasing your conversions is much less expensive, with great potential ROI. And with A/B testing, it's even greater. But that's not all; here are a couple of other benefits:
Learn deeply about your audience with every test: what they like, how they react, their needs and habits.
Remove gut decisions from your marketing strategy by adopting an experimentation culture and testing everything.
Focus your time and money on what your visitors respond to best, thanks to the learnings of your A/B Tests.
And to give you more operational examples of questions you’ll be able to answer with A/B Testing:
Should you have long or short forms?
Should you implement this new feature?
Which title for your article generates more shares?
Which steps of your conversion funnels are underperforming?
You compare the current version (control) of a page or element against one (or more) variations carrying the changes you want to test (a website page, an element in a page, a CTA, a picture, …).
You divide your traffic into equal portions, which are randomly exposed to one variation or the other during a given period of time. Then their performances (conversions, sales, …) are compared and analyzed to determine whether the change(s) are worth implementing.
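In code, the mechanics look roughly like this sketch (invented conversion rates, random 50/50 assignment):

```python
import random

random.seed(7)  # reproducible simulation

# Hypothetical true conversion rates that, in real life, we wouldn't know
TRUE_RATES = {"A": 0.10, "B": 0.12}

visits = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

for _ in range(10_000):
    variant = random.choice(["A", "B"])  # random 50/50 traffic split
    visits[variant] += 1
    if random.random() < TRUE_RATES[variant]:  # did this visitor convert?
        conversions[variant] += 1

for v in ("A", "B"):
    print(f"Variant {v}: {visits[v]} visitors, "
          f"{conversions[v] / visits[v]:.1%} conversion rate")
```

The observed rates land close to the true ones, and the statistical analysis (covered later) tells you whether the gap is real or just noise.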
What’s Conversion Rate Optimization (or CRO)?
It’s the process of improving your website to increase the rate at which your visitors perform the action(s) you want them to accomplish.
That action is usually either buying something or "converting" by giving you their contact info. In other words, getting your visitors through the buyer's journey.
Having lots of traffic is cool, admittedly, but you need to do something with it. That's what makes CRO so valuable.
The success of great A/B testing is the process. It’s a science experiment, so your process has to be rigorous, with strong prioritization to focus on the most valuable tests.
Each company has its unique process but it usually resembles something like this:
Measure, study and analyze your website data. Identify the problems and opportunities.
Formulate a hypothesis (Craig Sullivan has a great method for formulating hypotheses).
Prioritize your test ideas: one of the most used prioritization frameworks is PIE, first coined by WiderFunnel.
With this framework, you rank your test ideas on 3 criteria to determine which ones you should run first:
Potential (/10): How much room for improvement is there on this page (or pages)?
Impact (/10): How valuable is the traffic on this page (or pages)?
Ease of implementation (/10): How easy will this test be to implement on your site?
You then average the 3 scores, and you'll know which tests to run first. There are of course several other frameworks; try them out and make them your own.
Test your highest priority hypothesis.
Analyze your test results and learn from them.
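The PIE prioritization described above boils down to a simple average; here's a sketch with made-up test ideas and scores:

```python
# Made-up test ideas scored out of 10 on the three PIE criteria
ideas = {
    "Shorten signup form": {"potential": 8, "impact": 7, "ease": 9},
    "Rewrite homepage hero": {"potential": 9, "impact": 8, "ease": 4},
    "Change CTA color": {"potential": 3, "impact": 5, "ease": 10},
}

def pie_score(scores):
    """Average the Potential, Impact and Ease scores."""
    return (scores["potential"] + scores["impact"] + scores["ease"]) / 3

# Rank ideas from highest to lowest PIE score
ranked = sorted(ideas, key=lambda name: pie_score(ideas[name]), reverse=True)
for name in ranked:
    print(f"{name}: {pie_score(ideas[name]):.1f}")
# Shorten signup form: 8.0
# Rewrite homepage hero: 7.0
# Change CTA color: 6.0
```

A spreadsheet does the job just as well; the point is to score consistently so the ranking reflects value, not enthusiasm.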
You can basically test everything on your website:
Call to actions
But sometimes, you need inspiration. So here are dozens of A/B testing ideas for you.
Note: Before we let you dive into these listicles, a small disclaimer: what worked for others might not work for you. Don't blindly apply what you read there; make sure to analyze thoroughly whether it's relevant for you and how (or if) you can adapt it for your business.
A/B testing can be hard … and easy to mess up at the same time. So it's good to be aware of what could go wrong and have safeguards in place. Thankfully, people have written at length about both these topics.
Make sure you set yourself up for success by studying best practices and possible mistakes. But as with test ideas, don't take everything at face value. Put things to the test and see if they apply to your business.
Best practices to help you win
Note: We have a monthly newsletter with in-depth content on A/B testing and CRO; you're welcome to subscribe here.
A/B testing mistakes
(careful: false data will make you lose money)
There aren’t many books (or ebooks) on A/B testing, but here are a couple you can quench your thirst for knowledge with.
A/B testing is about making data-driven decisions AND learning. That means your reporting and results are of the utmost importance, whether to extract lessons, communicate with your colleagues or get ideas for your next tests.
A/B testing is based on statistical methods. You don't need to know all the math behind it, but a little brush-up on statistics won't hurt and will certainly improve your chances of success.
There are 2 main statistical methods behind A/B testing solutions. Neither is better than the other; they just have different uses. Here is how we handle it with Kameleoon's statistical engine.
The frequentist method allows a simple read on result reliability thanks to a confidence level: with a level of 95% or more, you have a 95% chance of obtaining the same result should you reproduce the experiment in the same conditions. But this method has a downside: it has a "fixed horizon", meaning the confidence level has no value until the end of the test.
The Bayesian method provides a result probability as soon as the test starts. There's no need to wait until the end of the test to spot a trend and interpret the data. But this method also has prerequisites: you need to know how to read the confidence interval given with the estimations during the test. With every additional conversion, confidence in the probability of a reliable winning variant improves.
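To make the two approaches concrete, here's a rough, standalone sketch of each: a classic two-proportion z-test for the frequentist read, and Beta-posterior sampling for the Bayesian one (this is a textbook illustration, not Kameleoon's actual engine):

```python
import random
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Frequentist two-proportion z-test: returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, samples=50_000):
    """Bayesian read: P(B's true rate > A's), sampling Beta posteriors
    built from a uniform Beta(1, 1) prior."""
    wins = 0
    for _ in range(samples):
        a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += b > a
    return wins / samples

random.seed(0)
z, p = z_test(100, 1_000, 150, 1_000)  # 10% vs 15% conversion
print(f"z = {z:.2f}, p-value = {p:.4f}")
print(f"P(B beats A) = {prob_b_beats_a(100, 1_000, 150, 1_000):.3f}")
```

With this made-up data, both methods agree that B is very likely the real winner; they differ in how and when you're allowed to read the result.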
To put the odds in your favor, there are a number of skills and management practices you can polish: web analytics, UX design and communicating results, to name a few.
Be it project management, sample size or duration calculators, or toolkits for your process, there are many tools to help you win.
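For instance, a back-of-the-envelope sample size calculator can be written in a few lines using the standard two-proportion formula (defaults assume 95% confidence and 80% power; dedicated calculators may use slightly different conventions):

```python
from math import ceil, sqrt

def sample_size_per_variant(baseline_rate, min_detectable_effect,
                            z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per variant to detect an absolute
    lift of `min_detectable_effect` over `baseline_rate`.
    Defaults: 95% confidence (two-sided), 80% power."""
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_effect
    p_bar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
           + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / min_detectable_effect ** 2)

# e.g. detecting a lift from a 5% to a 7% conversion rate
print(sample_size_per_variant(0.05, 0.02))
```

Halve the effect you want to detect and the required traffic roughly quadruples, which is why small sites should test bold changes rather than tiny tweaks.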
You might want to keep yourself in the loop, as the CRO world moves quite fast. The best way to do that is to follow the most prominent experts. Here are their handles on Twitter, and a list so you can follow all of them (and us, of course: @kameleoonrocks).
Carlos del Rio
Tiffany Da Silva
Theo van der Zee
Outsourcing your A/B testing can be a great way to do it without having the necessary resources in-house. Here are some of the best out there.
Note: we sell an A/B testing tool, but we can also handle your entire testing; check out our customer success page.