Whether your media business is a national newspaper, a small regional publication, or a fashion magazine, you need to know who your readers are, what they prefer, and what keeps them reading your content. You need to keep improving their experience. You could just guess, or you could be data-driven and collect data about users’ preferences to see what should be improved. A/B tests are a comfortable way to experiment with your readers.
How A/B Testing Works
"A/B test" or "split test" is an experiment with two variables. First, you take one element (a picture, a headline or a button) and create an alternative for it. After that, 50% of your audience sees the original version while another half of the audience sees the renewed alternative page.
The version with higher conversions or engagement is left as the main one.
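The 50/50 split described above can be sketched in a few lines. This is a minimal illustration, assuming each visitor carries a stable identifier such as a cookie value; the function and experiment names are hypothetical, and real A/B testing tools handle this for you:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'."""
    # Hashing a stable ID gives a roughly uniform, repeatable split,
    # so the same visitor always sees the same version.
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-42", "headline-test") == \
       assign_variant("visitor-42", "headline-test")
```

A deterministic hash (rather than a random coin flip per page view) matters: if a reader saw version A this morning and version B this afternoon, the measurement would be muddied.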
Form a hypothesis before checking the results of your split test. You can suppose that the change will have a positive effect, a negative effect, or no effect at all.
Seeing what works better helps teams:
- Collect data about the audience and readers’ preferences.
- Find out what site elements impact users’ behavior and decisions.
- Make further decisions on which on-site elements should be changed.
What Can Be Tested?
First, you should figure out what elements of your webpage, email or advertisement should be tested.
As for on-site content testing, here are the things to experiment with:
- Headlines and sub-headlines.
- Texts and paragraphs.
- CTA: buttons and texts.
- Images and their placements.
- Links and anchors.
- Testimonials, social proof elements, badges and awards.
Almost anything can be tested. Experiment with different placements, sizes, colors, fonts, etc. Align what you are going to test with your business goals. It’s better to have a clear reason for every split test.
You can test several things at a time; for example, email A + landing page A and email B + landing page B, and then switch to email A + landing page B. However, try not to include too many variables in one test, or you risk missing the real cause of the change in users’ behavior. Test each variable one by one to get precise, focused results.
A/B Testing Checklist
You need to work through several steps before conducting any A/B test. Here is a short checklist for running a successful split test:
Collect and research your site data.
Data will show you potential or real problem spots in the conversion funnel. It will also demonstrate what already works well. For example, if one of your pages has an 80% exit rate while another has 50%, pay attention to it and experiment to find out why people leave. Monitor users’ behavior with split testing. If, say, fewer than 500 daily visitors hit your CTA button, find out why. Maybe it should be moved to another place on the page, enlarged, or recolored.
Set specific goals.
Think about the metrics you will use to measure results. For instance, if you are going to change your subscription button, look at the number of new subscriptions. If you test headlines, compare the click-through rate (CTR) of each variant. Don’t measure your success in impressions or article readability.
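For a headline test, measuring success boils down to comparing click-through rates rather than raw counts. A toy sketch, with entirely made-up numbers:

```python
# Compare CTR (clicks / impressions) per headline variant.
# The figures below are invented for illustration only.
def ctr(clicks: int, impressions: int) -> float:
    return clicks / impressions if impressions else 0.0

variants = {"A": (120, 4000), "B": (150, 4100)}
for name, (clicks, impressions) in variants.items():
    # prints CTR = 3.00% for A and 3.66% for B
    print(f"Headline {name}: CTR = {ctr(clicks, impressions):.2%}")
```

Note that variant B "wins" here even though both variants have similar impression counts; that is exactly why impressions alone are a poor success metric.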
Create a hypothesis.
It should be aimed at increasing conversions and be relevant to your goals. For instance, you may assume that enlarging the CTA button will increase conversions. List all your business goals and think of a hypothesis you would like to check for each of them.
Create variations and test your hypothesis.
This is the main part of the content testing process, and we are going to provide a detailed manual on performing the test. Long story short, the gist lies in calculating the duration of the test, the current and expected conversion rates, and the minimum improvement you want to detect.
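As a rough illustration of that calculation, here is a sketch using the standard two-proportion sample-size approximation at a 5% significance level and 80% power. The conversion rates and traffic figures are assumptions for the example, not recommendations:

```python
import math

def sample_size_per_variant(p_base: float, p_target: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed per variant to detect the lift
    from p_base to p_target (two-proportion z-test, 5% / 80%)."""
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    n = (z_alpha + z_beta) ** 2 * variance / (p_target - p_base) ** 2
    return math.ceil(n)

# e.g. detecting a lift from a 3% to a 4% conversion rate:
n = sample_size_per_variant(0.03, 0.04)   # roughly 5,300 visitors per variant
# With, say, 5,000 daily visitors split across both versions:
days = math.ceil(2 * n / 5000)
```

The takeaway: the smaller the improvement you want to detect, the more visitors (and therefore days) the test needs before the result means anything.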
Get the data and analyze it.
Check which version (test or original) performed better, look at the user journey, and "save the changes" if they are effective. Usually, A/B testing software generates reports automatically. You just have to look at the results and decide what to keep.
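If your tool’s report only shows raw counts, the underlying check is typically a two-proportion z-test. A minimal sketch with invented numbers:

```python
import math

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 150/5000 conversions for A, 210/5000 for B.
z = z_test(150, 5000, 210, 5000)
# |z| > 1.96 means the difference is significant at the 5% level.
winner = "B" if z > 1.96 else ("A" if z < -1.96 else "no clear winner")
```

In this made-up case version B wins; with smaller samples the same observed difference could easily fall inside the "no clear winner" band, which is why the sizing step above matters.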
Focus on your goal metric.
The point is not to get confused; the same rule applies here: it is always reasonable to choose only a few variables for content testing. If you want to test one more variable, start another test and measure its results with its own metric.
A/B Testing Best Practices
BuzzFeed
Walter Menendez, a data infrastructure engineer at BuzzFeed, shared an overview of the newsroom’s testing strategy during a talk at MIT in 2017. Their headline testing strategy is impressive: they experiment with every element.
"We A/B test pretty much everything," said Menendez. "Not just the headline, but also the number in the headline. The thumbnail that’s rendered on Facebook or Twitter: Sometimes people don’t even see the headline, they just see an image and say, ‘Wait. What is in that image?"
The New York Times
This newspaper shares the story of how and why they created ABRA (A/B Reporting and Allocation architecture), their in-house technical solution. They spent two years building the software. They state that ABRA powers a wide range of experiments at The New York Times on both desktop and mobile platforms. What’s more, it also allows for experimentation with monetization methods. Thanks to this A/B testing, you could notice that they made their paywall stricter by the end of 2017, as we mentioned earlier.
Journalists and editors are also experimenting with headlines. They conduct A/B tests on the homepage headlines: 50% of readers see one variant and the other 50% see the alternative. Tests usually last for half an hour, and then editors use the "winning" headline to attract more readers. However, this is not as easy as it may seem. Split testing is hard work, too.
Mark Bulik, a senior editor who oversees digital headlines at The New York Times, notes: "People think if you change a headline, that it’s some kind of ‘Gotcha!,’ and it’s just not. People who think it’s a ‘gotcha’ just don’t have a full understanding of news in the digital world."
What to Learn from This
Experiment, conduct split tests, and keep improving the quality of your website content and its elements:
- Article topics.
- CTA buttons.
- And everything that can make your audience loyal and conversions higher.
In the end, improving quality leads to a better user experience and higher engagement, and ultimately increases the newsroom’s revenue.
Stay tuned and keep an eye on our updates. An article about our IO A/B testing solution is coming soon. We will share tips on how to use it.