What is A/B testing
We live in times of data-driven marketing. Gone are the days when marketing meant throwing money at things and hoping for a favorable outcome. The modern-day marketer has a scientific approach and relies on data. And the best way to remove uncertainty and gut feeling from marketing or design decisions for websites, ads, anything online really, is A/B testing.
Embrace the experimentation mindset.
By basing your strategy on data and tests, you'll be agile, but most importantly you'll have cold, hard feedback on what works and what doesn't. You'll be better at making sound business decisions and at investing time and money in what your visitors actually want.
A/B Testing as a weapon of mass(ive) marketing success
75% of websites with more than 1 million monthly visitors already do A/B testing. Maybe it's time for you, too, to remove risk from your marketing decisions.
But A/B testing takes work to get into, and more work to practice. You'll have to create a process, put a framework in place, learn a bit about statistics, set up and learn a new tool, make sure you're actually getting accurate results, … But with great potential comes a great (time) investment, or so Uncle Ben didn't say.
But fear not: you can get everything you need here, with the best content on the topic, from the best blogs and experts.
What is A/B Testing?
Definition: A/B testing is an online experiment conducted on a website, mobile application or ads (among other things) to test potential improvements against a control (or original) version. Put simply, it allows you to see which version works better for your audience, based on a statistical analysis.
A/B testing is also known as split testing, which can either mean exactly the same thing as A/B testing OR mean split URL testing. In a classic A/B test, the 2 variations are served on the same URL, whereas in split URL testing your changed variation lives on a different URL (your visitors don't see the difference, of course).
What about multivariate (MVT)?
Sometimes, you want to test several changes on a page, let’s say the banner, header, description and video. To test all of these at the same time, you would do what is called multivariate testing (or MVT).
You would have variations generated to test all the different combinations of these changes to determine the best one.
The big downside of multivariate testing is that it requires a gigantic amount of traffic. If you want more details, check Hubspot's article on the differences between MVT and A/B tests.
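To see why the traffic requirement explodes, here is a small Python sketch (the element names are hypothetical): four elements with two variants each already produce 2^4 = 16 variations, and each variation needs enough visitors on its own to reach significance.

```python
from itertools import product

# Hypothetical page elements and the variants we want to test for each.
elements = {
    "banner": ["control", "new_banner"],
    "header": ["control", "new_header"],
    "description": ["control", "new_description"],
    "video": ["control", "new_video"],
}

# An MVT generates every combination of variants: 2 x 2 x 2 x 2 = 16 here.
combinations = list(product(*elements.values()))
print(len(combinations))  # each combination needs its own share of traffic
```

Adding a single two-variant element doubles the number of combinations again, which is why MVT is usually reserved for very high-traffic pages.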
Bandit testing or multi-armed bandit testing
Multi-armed bandit testing is when an algorithm automatically and gradually redirects your audience toward the winning variation.
To learn more about multi-armed bandit testing, the best article by far is by Alex Birkett from ConversionXL.
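To make the idea concrete, here is a minimal epsilon-greedy sketch in Python, one of the simplest bandit strategies (not necessarily the one any particular tool uses). The two "true" conversion rates are simulated assumptions for the demo.

```python
import random

def epsilon_greedy(conversions, visits, epsilon=0.1):
    """Pick a variation index: explore at random with probability epsilon,
    otherwise exploit the variation with the best observed conversion rate."""
    if random.random() < epsilon:
        return random.randrange(len(visits))
    rates = [c / v if v else 0.0 for c, v in zip(conversions, visits)]
    return rates.index(max(rates))

# Simulated true conversion rates for two variations (assumption for the demo).
true_rates = [0.05, 0.08]
conversions = [0, 0]
visits = [0, 0]

random.seed(42)
for _ in range(10_000):
    arm = epsilon_greedy(conversions, visits)
    visits[arm] += 1
    if random.random() < true_rates[arm]:
        conversions[arm] += 1

# Traffic gradually shifts toward the better-performing variation.
print(visits)
```

The point of the bandit approach is visible in the output: instead of splitting traffic 50/50 until the end, most visitors end up on the stronger variation while the test is still running.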
A/B/n testing is when you test more than 2 variations of an element or page. You could test 6 versions of a page and run an A/B/C/D/E/F test.
Introductory articles on A/B testing to help you get started
What are the benefits of A/B Testing?
Why should YOU do A/B testing? Or a better question: are you satisfied with the way you're exploiting your hard-earned traffic? Once you have traffic, increasing your conversions is much less expensive and has great potential ROI. And with A/B testing, it's even greater. But that's not all; here are a couple of other benefits:
- Learn deeply about your audience with every test: what they like, how they react, their needs and habits.
- Remove gut decisions from your marketing strategy by adopting an experimentation culture and testing everything.
- Focus your time and money on what your visitors respond to best, thanks to the learnings of your A/B Tests.
And to give you more operational examples of questions you’ll be able to answer with A/B Testing:
- Which elements drive sales and conversions, or impact user behavior?
- Should you have long or short forms?
- Should you implement this new feature?
- Which title for your article generates more shares?
- Which steps of your conversion funnel are underperforming?
Examples of A/B Testing from leading websites
How does A/B Testing work
You compare the current version (control) of a page or element against one or more variations with the changes you want to test (a website page, an element in a page, a CTA, a picture, …).
You divide your traffic into equal portions that are randomly exposed to one variation or the other during a given period of time. Then their performances (conversions, sales, …) are compared and analyzed to determine whether the change(s) are worth implementing.
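The mechanics above can be sketched in a few lines of Python. This is a minimal frequentist comparison (a two-proportion z-test); the visitor and conversion counts are made up for the illustration, and real testing tools handle much more (sample pollution, peeking, segmentation, …).

```python
import math
import random

def assign_variation(rng=random):
    """Randomly assign a visitor to A or B with equal probability (the split)."""
    return "B" if rng.random() < 0.5 else "A"

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare two conversion rates; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: control converted 200/10,000, variation 260/10,000.
z, p = two_proportion_z_test(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value comes out well under 0.05, so the variation's lift would be considered statistically significant at the usual 95% level.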
Getting deeper in the inner workings of A/B testing
Conversion Rate Optimization is key for your success with A/B Testing
Having lots of traffic is great, admittedly, but you need to do something with it. That's where CRO is so valuable.
Best Conversion Rate Optimization Guides out there
- The Beginner’s Guide to Conversion Rate Optimization (Qualaroo)
- The Definitive Guide to Conversion Rate Optimization (Quicksprout)
- Conversion Rate Optimization Techniques (100+ Techniques and Free PDF) (Note: Lots of ideas to test in there)
- The Beginner’s Guide to Conversion Rate Optimization (ConversionXL)
- The A-Z Guide to Conversion Rate Optimization
- The Definitive How-To Guide For Conversion Rate Optimization
- What you have to know about conversion optimization
- The Conversion Optimization Rulebook
- Widerfunnel’s case studies
How to do A/B Testing: frameworks and methodology
Measure, study and analyze your website data. Identify problems and opportunities.
Formulate hypotheses (Craig Sullivan describes a great way to formulate a hypothesis).
Prioritize your test ideas: one of the most used prioritization frameworks is PIE, first coined by Widerfunnel.
With this framework, you rank your test ideas on 3 criteria to determine which ones you should run first:
Potential (/10): How much room for improvement is there on this page (or these pages)?
Impact (/10): How valuable is the traffic on this page (or these pages)?
Ease of implementation (/10): How easy will this test be to implement on your site?
You then average the 3 scores and you'll know which tests to run first. There are of course several other frameworks; try them out and make them your own.
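As a quick illustration, here is a small Python sketch of PIE scoring; the test ideas and their scores are invented for the example.

```python
# Each test idea gets three /10 scores; the average decides the running order.
ideas = [
    {"name": "Shorten checkout form", "potential": 8, "impact": 9, "ease": 6},
    {"name": "New homepage headline", "potential": 6, "impact": 7, "ease": 9},
    {"name": "Redesign pricing page", "potential": 9, "impact": 8, "ease": 3},
]

for idea in ideas:
    idea["pie"] = (idea["potential"] + idea["impact"] + idea["ease"]) / 3

# Highest PIE score runs first.
ranked = sorted(ideas, key=lambda i: i["pie"], reverse=True)
for idea in ranked:
    print(f'{idea["pie"]:.1f}  {idea["name"]}')
```

Note how the pricing-page idea scores well on potential and impact but sinks on ease; that trade-off is exactly what the averaging is meant to surface.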
Test your highest priority hypothesis.
Analyze your test results and learn from them
Great A/B testing processes and frameworks
- PXL: A Better Way to Prioritize Your A/B Tests
- The A/B Testing Framework So Good It Got A Codename
- Widerfunnel's infinity optimization process
- How to Build a Strong A/B Testing Plan That Gets Results
- Iterative A/B Testing – A Must If You Lack a Crystal Ball
- Start A/B Testing Today with 5 Simple Steps
What to A/B test: ideas by the dozen
You can test basically everything on your website. But sometimes you need inspiration, so here are dozens of A/B testing ideas for you.
Note: Before we let you dive into these listicles, a small disclaimer: what worked for others might not work for you. Don't blindly apply what you read there; analyze thoroughly whether it's relevant for you and how (or if) you can adapt it for your business.
Dozens of A/B tests ideas for you to get inspired
A/B best practices and … mistakes
A/B testing can be hard … and easy to mess up at the same time. So it's good to be aware of what could go wrong and to have safeguards in place. Thankfully, people have gone to great lengths on both of these topics.
Make sure you set yourself up for success by studying best practices and possible mistakes. But as with test ideas, don't take everything at face value. Put things to the test and see if it applies to your business.
Best practices to help you win
- 36 essential A/B testing best practices to boost your conversions
- 55 A/B Testing Best Practices Every Marketer Should Know
- 8 Rules of A/B Testing – The Art in Marketing Science
- 8 Best Practices for Starting Your A/B Testing
- What are some best practices with A/B testing?
- A/B Testing Best Practices Can Save You Time, Money and Effort – Here's How
- The Endless Suck of Best Practice and Optimisation Experts
Note: We have a monthly newsletter with in-depth content on A/B testing and CRO; you're welcome to subscribe here.
A/B testing mistakes
(careful: false data will make you lose money)
- 12 A/B Split Testing Mistakes I See Businesses Make All The Time
- [INFOGRAPHIC] 19 Ways A/B Testing Is Ruining Your Site (And How To Fix It)
- Should You Run an A/A test?
- Why Your Brain Is Your Worst Enemy When A/B Testing
- Are You Misinterpreting Your A/B Tests Results?
- Warning! Is the world sabotaging your A/B Tests?
- Are You Stopping Your A/B Tests Too Early?
- 7 Mistakes Most Beginners Make When A/B Testing
- How to Minimize A/B Test Validity Threats
- Sample Pollution: The A/B Testing Problem You Don’t Know You Have
- 11 ways to stop FOOC’ing up your A/B tests
Books to go even deeper
There aren’t many books (or ebooks) on A/B testing, but here are a couple you can quench your thirst for knowledge with.
A/B testing reporting & results
A/B testing is about making data-driven decisions AND learning. That means your reporting and results are of the utmost importance, be it to extract lessons, communicate with your colleagues or get ideas for your next tests.
How to handle A/B testing results and reporting
Dive deep in A/B Testing statistics (or don’t, it’s scary down there)
A/B testing is based on statistical methods. You don't need to know all the maths behind it, but a little brush-up on statistics won't hurt and will certainly improve your chances of success.
There are 2 main statistical methods behind A/B testing solutions. Neither is better than the other; they just have different uses. Here is how we handle it with Kameleoon's statistical engine.
The frequentist method allows a simple read on the reliability of results thanks to a confidence level: with a level of 95% or more, you have a 95% chance of making the right decision. But this method has a downside: it has a "fixed horizon", meaning the confidence level has no value until the end of the test.
The Bayesian method provides a result probability as soon as the test starts. No need to wait until the end of the test to spot a trend and interpret the data. But this method also has prerequisites: you need to know how to read the confidence interval given with the estimations during the test. With every additional conversion, the trust in the probability of a reliable winning variant improves.
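For the curious, here is a sketch of the Bayesian side in Python: with Beta(1, 1) priors, the probability that variation B beats A can be estimated by sampling the two posteriors at any point during the test. The numbers are hypothetical, and this is the generic textbook model for conversion rates, not any vendor's actual engine.

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=100_000, seed=0):
    """Monte Carlo estimate of P(rate_B > rate_A) under Beta(1, 1) priors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Posterior for a conversion rate is Beta(1 + conversions, 1 + misses).
        rate_a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / draws

# Hypothetical running test: 40/1,000 conversions on A vs 60/1,000 on B.
# Unlike a fixed-horizon test, this probability can be read at any point.
print(prob_b_beats_a(40, 1_000, 60, 1_000))
```

As more conversions come in, the two Beta posteriors tighten and this probability stabilizes, which is the intuition behind "the trust in the probability improves with every additional conversion."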
A/B testing statistics decrypted
- Ignorant No More: Crash Course on A/B testing Statistics
- A/B Testing Tech Note: determining sample size
- Speed vs. Certainty in A/B Testing
- How Not To Run An A/B Test
- Statistical Significance Does Not Equal Validity (or Why You Get Imaginary Lifts)
- What is the difference between Bayesian and frequentist statisticians?
A/B testing skills & management
To put the odds in your favor, there are a number of skills and management practices you can polish: web analytics, UX design and communicating results are some examples.
Sharpen your skills for better A/B tests
- Five skills you need to make AB testing work
- 10 Things Every Marketer Should Know About A/B Testing
- Free A/B Testing course by Google
- Beginner's Guide To Web Data Analysis: Ten Steps To Love & Success
- The Absolute Beginner's Guide to Google Analytics
- The ultimate guide to user experience
- Beginner’s guide to UX
- 9 strategies for becoming the marketing optimization champion your company can’t live without
86 A/B Testing experts to follow
Lance Jones > @userhue
Jason Kincaid > @jasonkincaid
Noah Kagan > @noahkagan
Hiten Shah > @hnshah
Dave McClure > @davemcclure
Avinash Kaushik > @avinash
Daniel Gonzalez > @HiDanielG
David Kirkpatrick > @davidkonline
Shanelle Mullin > @shanelle_mullin
Steve Blank > @sgblank
Matt McGee > @mattmcgee
Rand Fishkin > @randfish
Bart Schutz > @BartS
Rick Perreault > @rickperreault
Sean Ellis > @SeanEllis
Campaign Monitor > @CampaignMonitor
Moz > @Moz
Bryan Eisenberg > @TheGrok
Shopify > @Shopify
Scott Brinker > @chiefmartec
Chris Goward > @chrisgoward
Brian Massey > @bmassey
Jeffrey Eisenberg > @JeffreyGroks
Sherice Jacob > @sherice
Carlos del Rio > @inflatemouse
Pam Moore > @PamMktgNut
Angie Schottmuller > @aschottmuller
Ryan Deiss > @ryandeiss
Ian Lurie > @portentint
Ayat Shukairy > @ayat
Khalid Saleh > @khalidh
Anne Holland > @AnneHolland55
Lincoln Murphy > @lincolnmurphy
Amy Africa > @amyafrica
Unbounce > @unbounce
Raven Tools > @RavenTools
Roger Dooley > @rogerdooley
Neil Patel > @neilpatel
Nichole Elizabeth > @NikkiElizDemere
Craig Sullivan > @OptimiseOrDie
Peep Laja > @peeplaja
Jon Henshaw > @RavenJon
Marketing Nutz > @MktgNutz
Dan Siroker > @dsiroker
Tommy Walker > @tommyismyname
John Teevan > @JohnP_Teevan
Joanna Wiebe > @copyhackers
Rich Page > @richpage
Tiffany Da Silva > @bellastone
Jason Quey > @jdquey
Ton Wesseling > @tonw
Adam Hutchinson > @adamiswriting
Michael Aagaard > @ContentVerve
Matt Gershoff > @mgershoff
Andy Johns > @ibringtraffic
Brian Balfour > @bbalfour
Oli Gardner > @oligardner
Tim Ash > @tim_ash
Paul Rouke > @paulrouke
Linda Bustos > @edgacentlinda
Theo van der Zee > @theovdzee
Get Elastic > @getelastic
Conversion Conference > @ConversionConf
MAA1 > @MAA1
Talia Wolf > @TaliaGw
Justin Rondeau > @Jtrondeau
Tyson Quick > @TysonQuick
KlientBoost > @KlientBoost
Andre Morys > @morys
Conversion.com > @conversion_com
Anna Talerico > @annatalerico
Kelly Cutler > @kfcutler
Brooks Bell > @brooksbell
Andrew Youderian > @youderian
Alhan Keser > @AlhanKeser
Conversion Sciences > @ConversionSci
Alex Birkett > @iamalexbirkett
Steven Jacobs > @StevenJacobs_
Kaitlyn Nelson > @kaitlynelson
Kevin Hillstrom > @minethatdata
Dan Wang > @danwwang
Malachi Leopold > @livethetreplife
Pete Koomen > @koomen
Aaron Orendorff > @iconiContent
Chief Conversionista > @Conversionista
Joel Harvey > @JoelJHarvey
Outsource your A/B testing with these agencies
Outsourcing your A/B testing can be a great way to do it without having the necessary resources in-house. Here are some of the best agencies out there.
Note: we sell an A/B testing tool, but we can also handle your entire testing; check out our customer success page.