Definition of A/B Testing
A/B testing is a method of comparing two versions of a webpage, email, ad, or other marketing asset to find out which performs better. Also known as split-run testing or bucket testing, A/B testing is a blind study: participants are unaware that a test is being conducted. Half of the visitors see version A, the control, and the other half see version B, a variant. Version B includes changes that may improve engagement, conversion rate, click-through rate, or time spent on a page. The goal of A/B testing is to find out which version visitors prefer.
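One common way to implement the 50/50 split is to bucket each visitor deterministically by hashing their user ID, so the same visitor always sees the same version. Here is a minimal sketch in Python (the function and experiment name are illustrative, not taken from any particular testing tool):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-test") -> str:
    """Deterministically assign a visitor to version A (control) or B.

    Hashing the user ID together with the experiment name gives a
    stable, roughly 50/50 split: a returning visitor always lands
    in the same bucket for the same experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the assignment is a pure function of the user ID, no per-visitor state needs to be stored to keep the experience consistent across visits.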
Major technology companies like Facebook, Instagram, Google, and LinkedIn use A/B testing to improve user experience, and many professionals, including marketers, designers, entrepreneurs, and software engineers, use it to improve customer satisfaction and increase revenue.
If you decide to conduct an A/B test on your website, there are a number of variables that can be tested in addition to the website itself. These variables can include landing pages, emails, banner ads, colors, images, layouts, copy text, product pricing, testimonials, and more.
You can perform A/B testing on more than one variable, but test them one at a time; otherwise you won’t know which variable was responsible for the change in performance. If you’re curious whether adding banner ads would appeal to visitors, add banners to version B and see if it performs better. Marketers can benefit from A/B testing ad copy to learn which version of the copy attracts more clicks.
Running both versions at the same time is an important part of A/B testing, as it gives you a clearer picture of which one visitors actually preferred. If you run them weeks or months apart, you won’t know whether it was simply timing or the actual changes that made one more appealing. You also want to allow enough time for the test to gather a sufficiently large sample; a test that runs too briefly may not reveal a meaningful difference.
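How long a test needs to run depends on how many visitors each variant must receive. A rough estimate can be sketched with a standard two-proportion power calculation, as below (the function name, defaults, and example rates are illustrative assumptions, not from any specific testing product):

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline_rate: float,
                            minimum_detectable_effect: float,
                            alpha: float = 0.05,
                            power: float = 0.8) -> int:
    """Approximate visitors needed per variant to detect an absolute
    lift of `minimum_detectable_effect` over `baseline_rate`, using
    the usual two-proportion z-test sample-size formula."""
    p1 = baseline_rate
    p2 = baseline_rate + minimum_detectable_effect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# e.g. detecting a lift from a 5% to a 6% conversion rate
# needs about 8,155 visitors per variant at 95% confidence / 80% power
n = sample_size_per_variant(0.05, 0.01)
```

The takeaway is that small expected lifts require large samples, which is why short test runs often fail to show a clear winner.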
A survey or poll can be a good way to collect opinions from real users visiting both version A and version B during your A/B test. Some questions you might want to ask include:
- How likely are you to recommend us to a friend or colleague?
- How easy was it to solve your problem with us today?
- What matters most to you when choosing a ________?
- Why did you decide to click (or not click) a button?
Once the A/B test is complete, you can analyze the results. Testing software will present the collected data, show the difference in performance between version A and version B, and tell you whether that difference is statistically significant. If the result is significant at the 95% confidence level, you can be reasonably confident that the difference is real and not the product of random chance.
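As a sketch of what testing software computes under the hood, the comparison of two conversion rates is often done with a two-proportion z-test. A minimal version in Python (illustrative, not any specific vendor's implementation):

```python
import math
from statistics import NormalDist

def two_proportion_p_value(conversions_a: int, visitors_a: int,
                           conversions_b: int, visitors_b: int) -> float:
    """Two-sided z-test p-value for the difference between the
    conversion rates of version A and version B."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# A p-value below 0.05 corresponds to significance at the 95% level.
# e.g. 200/10,000 conversions on A vs. 260/10,000 on B:
p = two_proportion_p_value(200, 10_000, 260, 10_000)
```

In this hypothetical example the p-value comes out well below 0.05, so the lift from 2% to 2.6% would count as statistically significant at the 95% level.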