We Analysed 6,379 A/B Headline Tests and This Is What We Learned

You know you should run A/B tests on your headlines.

However, very few editors understand the real value of headline testing, and most publishers are testing ineffectively. If you want to improve your efficiency and benchmark your A/B test results, you can of course refer to our articles on headline testing tips and scientific findings on headline testing. What’s more important, though, is knowing the answer to two questions:

  1. What’s the relative effect of A/B testing headlines on pageviews?
  2. How many alternative headlines should you test?

To find out, we analyzed 6,379 A/B tests for news and media sites, carried out over 10 months. This included over a billion content sessions, 467 million unique visitors, and over 25 million hours spent viewing content.

In any A/B test, you have three possible outcomes:

  • Your original headline wins.
  • Your alternative headline wins.
  • No clear winner.

But how do you define a winning headline? Some rely on measuring CTR, but that only tells half the story. For example, clickbait headlines get a lot of clicks, but no one’s going to stick around when they discover the article’s rubbish. To filter out these kinds of headlines, our software uses an algorithm that determines winning headlines from a combination of three parameters (a rough sketch follows the list):

  • CTR
  • Time on page
  • Recirculation
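
To be clear, the exact formula isn’t published here; the sketch below is only a hypothetical illustration of how the three signals could be blended into a single score, with made-up weights, normalization, and field names:

```python
# Hypothetical scoring rule blending CTR, time on page, and recirculation.
# Weights, normalization, and field names are illustrative assumptions,
# not the production algorithm.
from dataclasses import dataclass

@dataclass
class VariantStats:
    ctr: float            # click-through rate, 0..1
    time_on_page: float   # average seconds spent on the article
    recirculation: float  # share of readers clicking through to another article, 0..1

def headline_score(v: VariantStats, time_cap: float = 300.0) -> float:
    """Blend the three signals so a high-CTR clickbait headline with
    poor engagement still ends up with a low overall score."""
    engagement = min(v.time_on_page / time_cap, 1.0)  # normalize time to 0..1
    return 0.4 * v.ctr + 0.3 * engagement + 0.3 * v.recirculation

original = VariantStats(ctr=0.031, time_on_page=95, recirculation=0.12)
clickbait = VariantStats(ctr=0.052, time_on_page=20, recirculation=0.03)
winner = max([("original", original), ("variant", clickbait)],
             key=lambda kv: headline_score(kv[1]))
print(winner[0])  # "original" wins despite the lower CTR
```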

How does A/B testing headlines affect pageviews?

It’s important to know whether or not A/B testing actually makes a difference. After analyzing the results of each test (and drinking way too much coffee), we found the answer was a definite yes. In fact, running an A/B test increased pageviews by 38.8% overall.

Some numbers:

  • Total tests in the dataset: 6,379
  • Tests with a winner: 4,498 (2,579 custom winners, 1,919 original winners)
  • Tests without a winner: 1,881
  • Basic pageviews: 20,374,503
  • Additional pageviews: 7,897,457
  • Median headline length: 10.0 words / 66.0 characters
  • Average headline length: 10.6 words / 67.75 characters
  • Median number of variants: 2.0
  • Average number of variants: 2.19
  • Pageview growth rate: 38.8%

However, as with all things statistical, the picture isn’t quite so clear cut. A significant number of tests produced no clear winner. In other cases, the winning headline improved engagement metrics but not the actual number of pageviews. This leaves us with a mean long-term growth rate of 28.1% per test.
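
The headline 38.8% figure is simply the additional pageviews divided by the basic pageviews from the summary above. A quick sanity check in Python (variable names are mine):

```python
# Sanity check of the 38.8% total growth figure from the dataset summary.
basic_pageviews = 20_374_503      # pageviews from the tested articles' baselines
additional_pageviews = 7_897_457  # extra pageviews credited to winning variants

total_growth = additional_pageviews / basic_pageviews
print(f"Total pageview growth: {total_growth:.1%}")  # ~38.8%
# The 28.1% figure is the mean growth per test, which weights every test
# equally and can't be reproduced from these aggregate totals alone.
```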

[Figure: impact of A/B headline testing on pageviews]

Sure, A/B testing is great, but how likely is it that your original headline will win? According to our research (a quick check against the raw counts follows the list):

  • Probability of a custom winner: 40.4%
  • Probability of an original winner: 30.1%
  • Probability of no clear winner: 29.5%
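
These three shares follow directly from the counts in the summary above; here’s a quick sanity check in Python (variable names are mine):

```python
# Outcome probabilities derived from the dataset summary above.
total_tests = 6_379
custom_wins = 2_579
original_wins = 1_919
no_winner = 1_881  # = total_tests - custom_wins - original_wins

for label, count in [("custom winner", custom_wins),
                     ("original winner", original_wins),
                     ("no clear winner", no_winner)]:
    print(f"Probability of {label}: {count / total_tests:.1%}")
# -> 40.4%, 30.1%, 29.5%
```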

TL;DR: A/B testing increased pageviews by 28.1% on average and by 38.8% in total.

How many headlines should you test?

Our research found that the vast majority of publishers test only two potential headlines. When we look at the overall results, it’s clear this is a big mistake:

[Table: research findings by number of headline options]

If we look at just the statistically significant results, testing four different headline options—instead of only two—increases the chance of a custom headline winning from 37.7% to 60.8%.

[Figure: chance of a custom winner by number of headline options]

The results also conclusively showed that each additional headline option increased the number of additional pageviews, both in tests with winners and across all finished tests. Mean additional pageviews more than doubled, from 25% (in tests with 2 options) to 51% (in tests with 4 options).

[Figure: additional pageviews by number of headline options]

TL;DR: the more headline options you test, the higher the chance of getting a custom winner and the more effective that winner will be.

Headline testing is a powerful tool: a test takes less than a minute to launch and 15-20 minutes to produce results. With the right software, it’s easy to learn and easy to use. The results are clear: A/B testing your headlines increases your pageviews, while more options increase the chance you’ll find a winning headline and get even more pageviews.

Additionally, testing helps your editorial team understand your audience and which headlines truly engage them. This means you can grow your pageviews and time on page, and convert casual readers into a loyal audience.

[Figure: A/B headline test results]

Methodology

To ensure the results were reliable, we tested the significance of the distribution of custom winners across the number of headline options with Pearson's chi-squared test, using a standard significance level of 0.05.

Using 2-headline tests as the control group, we found that the results for tests with 3, 4, and 5 options were statistically significant. However, due to the small number of tests with 6 or more options, we can’t say whether the results for those tests are significant.
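
As an illustration of this kind of check (the contingency counts below are placeholders chosen to roughly match the quoted win rates, not our raw data), the comparison can be run with SciPy’s chi2_contingency:

```python
# Illustrative chi-squared test: do 4-option tests produce custom winners
# more often than 2-option (control) tests? Counts are placeholders.
from scipy.stats import chi2_contingency

observed = [
    [1_700, 2_800],  # 2-option tests: custom winner vs. other outcome
    [150, 100],      # 4-option tests: custom winner vs. other outcome
]
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.1f}, dof={dof}, p={p_value:.4g}")
# The difference counts as significant at the article's alpha = 0.05 if p < 0.05.
```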

This means it’s statistically correct to say that increasing the number of headline options to 5 more than doubles the chance of a custom headline winning the test, from 37.7% to 77.8%. However, as 5-headline tests represent only a small fraction of all tests, and the results therefore have low reproducibility, we can’t categorically state the benefits of running a 5-headline test.

When it comes to the average effect on pageviews, we applied three significance tests (Student's t-test, the Mann–Whitney U test, and the Wilcoxon signed-rank test), all of which found the results for tests with 3 and 4 options statistically significant. The Student and Mann-Whitney tests also found the 5-headline tests significant, but the Wilcoxon test did not, so we can’t draw a firm conclusion for 5-headline tests.
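
For illustration, here’s roughly how the three tests can be run with SciPy, assuming per-test uplift arrays for 2- and 4-option tests plus paired baseline/variant pageviews; all of the arrays below are synthetic placeholders, not our data:

```python
# Illustrative run of the three significance tests on per-test pageview uplift.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
uplift_2opt = rng.normal(0.25, 0.10, size=500)  # relative uplift, 2-option tests
uplift_4opt = rng.normal(0.51, 0.15, size=100)  # relative uplift, 4-option tests

# Compare the two uplift distributions (unpaired tests).
t_stat, t_p = stats.ttest_ind(uplift_2opt, uplift_4opt, equal_var=False)
u_stat, u_p = stats.mannwhitneyu(uplift_2opt, uplift_4opt, alternative="two-sided")

# Wilcoxon signed-rank needs paired data, e.g. baseline vs. variant pageviews per test.
baseline = rng.poisson(3_000, size=100).astype(float)
variant = baseline * (1 + uplift_4opt)
w_stat, w_p = stats.wilcoxon(baseline, variant)

print(f"t-test p={t_p:.3g}, Mann-Whitney p={u_p:.3g}, Wilcoxon p={w_p:.3g}")
```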