A/B Testing
A/B testing is a popular term for controlled experiments with two variants, used to determine which of the two is better. The focus is on making a thing better than it was before.
A/B testing is frequently used in web design and marketing.
A/B testing tools:
- AB Test Calculator (success/trial style)
- T-test Calculator (score style)
- Visual Website Optimizer (web design)
- Optimizely (web design)
A/B testing lets you make decisions using hard data. Designers make the best guess they can about the functionality or layout, and A/B tests then tell them which way to tune it. For testing to work, you need a reasonably working product and a good number of users.
Do not forget qualitative data. While A/B tests work on quantitative data, consider interviews, surveys and feedback forms to gather qualitative data. Qualitative data gives you clues about what to test next.
A and B are different variants of a functionality or content. They should differ only slightly; otherwise you cannot tell which modification caused the change in the data. The change itself can be significant, but only one thing should change at a time.
Does the registration button color on the landing page affect the registration rate?
A = Landing page with registration button in blue.
B = Landing page with registration button in red.
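As a sketch of the mechanics, the test boils down to splitting traffic between the variants and counting conversions per variant. A minimal, hypothetical Python version (a real setup would persist the counters and keep assignments sticky per visitor, as in the hashing sketch later):

```python
import random

# Hypothetical in-memory counters; a real test would persist these
# per variant in a database or analytics tool.
stats = {"A": {"views": 0, "signups": 0}, "B": {"views": 0, "signups": 0}}

def assign_variant():
    # 50/50 random split between the blue (A) and red (B) button.
    return random.choice(["A", "B"])

def record_view(variant):
    stats[variant]["views"] += 1

def record_signup(variant):
    stats[variant]["signups"] += 1

def conversion_rate(variant):
    s = stats[variant]
    return s["signups"] / s["views"] if s["views"] else 0.0
```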
Win big, lose big. You cannot get drastic results without drastic changes, but small changes help you fine-tune your design. Just change one thing at a time.
A and B can be:
Old Against New: Testing an old design against a new design is not optimal because of the novelty effect. But when users are first-time visitors, it works without problems. You do not need prior usage metrics for the old design, though they might help.
New Against New: When testing two new designs, you need good metrics on the old design that is currently hidden. Otherwise you may change your product for the worse.
A = 10% of visitors register. B = 5% of visitors register. So A is better, right? But the old design C got 25% of visitors to register, and you never measured it.
Conduct an A/A test before you do A/B tests. Serve the same design in both buckets and check that the measurements are roughly the same. This validates your testing environment.
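A sketch of what that validation looks like in code: simulate two buckets served the identical design and compare them with a two-proportion z-test (statsmodels assumed available; the rates are invented):

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(42)
n = 5000                  # visitors per bucket
p = 0.10                  # true conversion rate, identical in both buckets

# Simulate two buckets served the exact same design.
conversions = [rng.binomial(n, p), rng.binomial(n, p)]
stat, pvalue = proportions_ztest(conversions, [n, n])

# With identical variants, p < 0.05 should appear only ~5% of the time;
# consistently "significant" A/A results mean the setup is broken.
print(f"z = {stat:.3f}, p = {pvalue:.3f}")
```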
Avoid running multiple A/B tests at the same time in the same process. They will add noise to each other's metrics.
Avoid A/B/C/D tests. They are not a problem if you have a big user base, but most of the time you do not.
A/B tests require trials; the only thing that can prevent A/B testing is not having any users. Luckily, you do not need a huge number of users to get decent results; around a thousand is enough to start. Testing is obviously slower with 2000 unique visitors weekly than with 1000 unique visitors daily.
Here are some rough numbers for how many users you need for some basic A/B tests (a power-calculation sketch follows the list):
- Testing attraction changes to free email signup: 3000 users to detect most improvements.
- Testing usability changes to a feature in your application: 1500 users to detect most improvements; note that each counted user must actually use the feature.
- Testing persuasion changes that push users from free to paid: with over 5000 daily free trial users you should detect most improvements, but anything past 2000 helps to detect big changes.
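The exact figure depends on your baseline rate and the smallest lift you care about; a standard power calculation gives it. A sketch using statsmodels, with made-up baseline and target rates:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.10   # assumed current signup rate
target = 0.13     # assumed smallest lift worth detecting

effect = abs(proportion_effectsize(baseline, target))
n = NormalIndPower().solve_power(effect, alpha=0.05, power=0.8,
                                 alternative='two-sided')
print(f"~{n:.0f} users per variant")   # about 875 per variant here
```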
A/B Testing Websites
You should aim to minimize friction. Measure where users abandon the site and improve that part.
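For example, given visitor counts at each funnel step, the biggest relative drop shows where the friction is. A toy calculation with invented numbers:

```python
# Visitors remaining at each funnel step (numbers invented).
funnel = [("landing", 10000), ("signup form", 4000),
          ("form submitted", 1200), ("email confirmed", 900)]

for (step, n), (next_step, next_n) in zip(funnel, funnel[1:]):
    print(f"{step} -> {next_step}: {1 - next_n / n:.0%} abandon")
# The 70% drop into "form submitted" is the first part to improve.
```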
How to identify users (a bucketing sketch follows this list):
- If you have session cookies in use, use those as the user identifier.
- If you have no session cookies in use, take the user's IP, hash it and save the hash in a cookie as the user identifier.
- If you do not want to use cookies at all, take the IP, hash it and use the hash directly.
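A minimal sketch of the hashing approach from the list above; the sha256 choice and the example IP (203.0.113.7, a documentation address) are illustrative:

```python
import hashlib

def variant_for(identifier: str) -> str:
    # Hash the identifier (session id or IP) to a stable bucket so the
    # same visitor always lands in the same variant.
    digest = hashlib.sha256(identifier.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(variant_for("203.0.113.7"))  # same IP, same variant, every time
```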
Promising page elements to test:
- General layout, e.g. button colors, order of elements.
- Headline text and layout.
- Removing or adding steps in a process.
- Social proof, e.g. testimonials, customer logos.
- Call-to-action formatting, e.g. color, size, placement.
- A few paragraphs of text versus a 2-minute introduction video.
- One long form versus a multi-page form.
- Pricing and presentation of prices.
- Signup bonuses, e.g. first 100 get a bonus, free bonus valued at $300.
- Up-sells and cross-sells.
- Adding a "Free Trial" button next to "Pricing" or "Buy Now".
- Requiring a credit card for the free trial or not.
- Including trust symbols on forms or not.
- Offering a live chat or not.
Testing copy is just as important as testing design.
Dog Kennel Site:
- Pictures and texts of miserable puppies that could be saved.
- Pictures and texts of cute puppies that would make a great addition to the family.
A/B Testing Keywords
Calculate site traffic from your web analytics software, focusing on non-paid search traffic. After each change, wait until Google shows the new title in search results.
- Measure site traffic for 10 days without the keyword in the title tag.
- Measure site traffic for 10 days with the keyword in the title tag.
- Measure site traffic for 10 days without the keyword in the title tag.
- Measure site traffic for 10 days with the keyword in the title tag.
Open a two-sample t-test calculator to calculate the difference.
Add all traffic data from days without the keyword to the left column (control data).
Add all traffic data from days with the keyword to the right column (experiment data).
Check the unequal variances option. You have a measurable difference if the confidence interval does not include zero. When the interval is fully negative, the experiment data in the right column won.
For example, a "95% Confidence Interval for the Difference" of (-30.5708, -14.4455) means you got roughly a 15% to 30% increase in traffic, with 95% confidence.
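The same comparison can be scripted instead of using an online calculator. A sketch using scipy's Welch t-test (equal_var=False mirrors the unequal-variances checkbox; all traffic numbers are invented):

```python
import numpy as np
from scipy import stats

# Daily organic visits from the four 10-day periods, numbers invented:
# the two "without keyword" periods pooled as control, the two
# "with keyword" periods pooled as experiment.
control = np.array([102, 98, 110, 95, 105, 99, 101, 97, 108, 100,
                    103, 96, 107, 99, 104, 98, 102, 101, 106, 97])
experiment = np.array([125, 118, 130, 122, 127, 119, 124, 121, 129, 120,
                       126, 117, 131, 123, 128, 122, 125, 119, 127, 121])

# Welch's t-test: the "unequal variances" option of the calculator.
t, p = stats.ttest_ind(control, experiment, equal_var=False)

# 95% CI for (control - experiment), Welch-Satterthwaite deg. of freedom.
v1 = control.var(ddof=1) / len(control)
v2 = experiment.var(ddof=1) / len(experiment)
diff = control.mean() - experiment.mean()
se = np.sqrt(v1 + v2)
df = (v1 + v2) ** 2 / (v1 ** 2 / (len(control) - 1)
                       + v2 ** 2 / (len(experiment) - 1))
ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se
print(f"t = {t:.2f}, p = {p:.4f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
# A fully negative interval: the keyword (experiment) days won.
```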