A/B testing is a way to show users multiple variations of a design to see which one performs better. It helps you determine which design elements and messaging are most effective with your audience.
For accurate results you need a sizable number of visitors assigned randomly to “group A” and “group B”. If you do not have enough visitors, your results will not be conclusive. If you do not assign users to groups in a truly randomized way, you will skew the results.
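Random but sticky assignment can be sketched in a few lines. The hashing approach below is one common illustration, not any particular tool's API; the function name is hypothetical:

```python
import hashlib

def assign_group(user_id: str, groups=("A", "B")) -> str:
    """Deterministically assign a user to a test group.

    Hashing the user ID makes the split effectively random across
    users, but stable for the same user on every visit.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return groups[int(digest, 16) % len(groups)]
```

Because the assignment is derived from the user ID rather than stored per session, a returning visitor always lands in the same group, and over many users the split comes out roughly even.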
For A/B tests to be truly useful you should test one small change at a time. Testing two completely different designs may help you find the better version, but it will not tell you why one version outperformed the other. There is no way to isolate the real reason for the success.
A/B tests are a great way to test things because users are not even aware they are participating. This is better than asking users what they like, because people change their behavior when they know they are being watched. A/B tests let you observe users’ natural behavior without their knowledge. It’s like a controlled science experiment, where you vary one variable according to your objectives.
Where to Start?
You need to go into A/B testing with a plan. Give it time and don’t rush.
Define Goals: If you don’t know your objectives, you can’t know what type of data to collect. We start by asking some simple questions about the existing page, labeled “A”, to define what we are testing for:
- Can users find the information they need?
- What are the main user goals in this screen? How many of them achieve their goals?
- How many users can find a particular button? Can we improve this number? Should we change the placement or the wording of the button?
Analyzing the Data: Next, we analyze the data from the “A” page before designing “B”. If your page does not get the results you imagined, this data shows where A/B testing can help you improve it. Based on what we learn from the original “A” version, we design the “B” version with our goals and objectives in focus.
What’s useful to A/B test?
Truthfully you can test anything in your design:
- Layout & Placements
- Paragraph Text
- Call-to-Action Texts / Links
- Call-to-Action Buttons
- Images & Colors
Don’t limit yourself when you are trying to choose which objective to test. Basically if you can change it, you can test it.
However, make sure to always test one thing at a time. You win the war by winning enough battles, but you can’t fight them all at once. Testing things one at a time takes a long time, but there are ways to do it efficiently.
Instead of testing every single little thing in your design, focus on the changes most likely to make a big impact.
Let’s say you’d like to test your button placement and you have three ideas for it, but you aren’t sure which one to use. Split your visitors into three groups (“A”, “B”, “C”) instead of two and test all three at once.
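Once results come in, comparing three variants is a matter of comparing conversion rates. The visitor and click counts below are invented purely for illustration:

```python
# Hypothetical results for three button placements (made-up numbers)
results = {
    "A": {"visitors": 3200, "clicks": 416},
    "B": {"visitors": 3150, "clicks": 473},
    "C": {"visitors": 3180, "clicks": 445},
}

# Conversion rate per variant: clicks divided by visitors
rates = {v: d["clicks"] / d["visitors"] for v, d in results.items()}

# The variant with the highest conversion rate is the front-runner
winner = max(rates, key=rates.get)
```

A raw front-runner is not enough on its own; before shipping it, you still need to confirm the difference is statistically significant rather than noise.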
How long should you run your A/B Test?
Keep in mind that testing correctly takes time. Even if you have high traffic, if you run your test for too short a period, your results may turn out to be a statistical anomaly rather than hard data. For example, you might get input only from the same time zones and regions. For these reasons, I recommend running a test for a minimum of 2 weeks so you have a good sample of both weekday and weekend traffic. Your optimization tool will also provide guidance on when statistical significance is reached.
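As a rough sketch of the significance check an optimization tool performs behind the scenes, a two-proportion z-test can be written with the standard library alone. The counts in the test below are invented, and this simplified check is an illustration, not a replacement for your tool’s guidance:

```python
from math import erf, sqrt

def significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is B's conversion rate different from A's?

    conv_* are conversion counts, n_* are visitor counts.
    Returns the z statistic and a two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With, say, 100 conversions from 1,000 visitors on A versus 150 from 1,000 on B, the p-value comes out well under 0.05, so the lift would count as significant; a small gap like 100 vs. 105 would not.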
What are the potential benefits of A/B Testing?
A/B testing is one of the best friends a designer can have. Potential benefits include higher open rates, elimination of guesswork, better user feedback, and most importantly, more sales.
3 Real-World A/B Test Examples from JotForm:
Enough with the theory. Let’s look at some real-world examples!
First, we wanted to increase how often people email a form to another person from the Publish tool. In Version A, email was under the “Platforms” section; in Version B, we moved it under “Share Form”. This change in email placement increased send numbers by 6%.
Next, by changing the color of the “Share Form” button, we increased the usage of share form link by 2%.
When we started making form URLs secure by default, some people were confused. They kept asking where they could find the old button that gave them the secure URL.
So we found a solution: in Version B, we added a tooltip explaining the new feature. The result? We decreased the number of questions about Secure Forms in our Support Forum by 97%.
I hope these tips and live examples can prove there is real value in A/B testing for any digital product.