How does page load time impact engagement?
A case study in metrics politics, and how we gave engineering a number to fight back with
Originally published on the Optimizely blog in 2016. Republished here with less waffle, fewer Telegraph PR edits, and some hindsight.
In 2016, newspapers were in freefall.
In my first week working as the first in-house CRO at The Telegraph, I remember digging through Adobe Analytics and making a chart that felt very representative of the state of things: over the last few years, traffic referred from Facebook had increased in a long, slow, and steady line. At the same time, the number of pageviews per visit (a proxy for ads viewed and money made) had the exact same shape on another axis.
Where previously people would visit the homepage and click around on news items they found interesting, they had learned to wait for news to come to them.
Revenue was collapsing, and publishers were desperate enough to try anything—even handing control to the platforms that were killing them. Google's AMP and Facebook's Instant Articles promised 10x faster pages by stripping out tracking bloat. The trade-off, unstated but obvious: make your pages faster or lose the right to directly serve content to your audience.1
"It's just one line of JavaScript"
With traditional revenue streams collapsing, the advertising team was one of the few departments making money. The owners (reclusive twin brothers who lived in a neo-Gothic castle on an island they bought off the coast of Normandy)2 wanted profit, and management was under pressure to squeeze it out however they could.
As a result, when the ad team asked for something, they usually got it.
And what they kept asking for was more tracking scripts. Ad networks, retargeting pixels, data management platforms. Each one promising to squeeze a bit more revenue out of every pageview.
The request was always the same: "It's just one line of JavaScript. This tracker will generate £X in revenue." They had numbers. They could prove it.
Engineering would push back. "It slows the site down. It's bad for users."
"Prove it costs us money," the ad team would respond.
Of course, no one could.
They had no number. Just a general feeling that making the site slower wasn't a fantastic idea.
And without a number, they always lost the argument.
How we broke the site on purpose
Developer Stephen Giles and I set out to change this by running an A/B test.
Originally, we wanted to deploy an A/B test that blocked trackers for a percentage of users. But that revenue was already money in the bank for the ad team, and we weren't allowed to touch it.
What we could do was run an A/B test that simulated the future.
At the time, we were adding something like 3 to 4 seconds of latency per month. This is a number I struggle to believe myself, but it tracks with what others were reporting. One memorable Monday Note headline from that year: "News Sites Are Fatter and Slower Than Ever".
Our test variants weren't arbitrary:
Four seconds was next month, 16 seconds was four months out.
If we waited for engagement to decline naturally, correlation and causation would be impossible to unpick.
Stephen found a clever way to simulate adtech bloat: making callbacks to an AWS box designed to generate latency. More callbacks, more latency, longer page load times.
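The actual implementation of that AWS box was never published, so the following is a minimal stand-in sketch: a tiny HTTP endpoint that simply sleeps before responding, so that each callback a page makes to it adds a fixed chunk of latency. The port, delay value, and handler names are all illustrative assumptions.

```python
# Hypothetical sketch of a latency-injection endpoint. The real AWS box's
# code isn't public; this just shows the principle: hold each request open
# for a fixed delay, return nothing useful.
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

DELAY_SECONDS = 0.5  # illustrative per-callback delay; the test used larger totals


class DelayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(DELAY_SECONDS)  # the delay IS the payload
        self.send_response(204)    # no content to return
        self.end_headers()

    def log_message(self, *args):  # silence per-request console logging
        pass


def start_server(port=0):
    """Start the delay server on a background thread; port 0 auto-assigns."""
    server = HTTPServer(("127.0.0.1", port), DelayHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


if __name__ == "__main__":
    server = start_server()
    port = server.server_address[1]
    start = time.monotonic()
    urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
    print(f"one callback added ~{time.monotonic() - start:.2f}s")
```

A test variant would then fire N callbacks at this endpoint during page load, so total added latency scales with N.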
After extensive internal discussion, we received sign-off to test four variants across a few hundred thousand visitors over two weeks.
Finally, a number!
As expected, slower pages meant fewer pageviews, but the resilience of readers surprised us.
- Variant A: ~4 seconds delay = -11.02% pageviews
- Variant B: ~8 seconds delay = -17.52% pageviews
- Variant C: ~16 seconds delay = -20.53% pageviews
- Variant D: ~20 seconds delay = -44.19% pageviews
I guess a dyed-in-the-wool Torygraph reader won't suddenly start reading The Guardian, even if they have to wait 16 seconds for a page to load. Users on other sites are unlikely to be so patient.
Using the Telegraph's internal metric for pageview value, we could model the revenue impact of any change that slowed the site down.
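The model itself is simple arithmetic: relative pageview drop times pageview volume times value per pageview. The Telegraph's actual pageview-value metric and traffic figures are internal, so the numbers below are illustrative placeholders; only the measured drops come from the test.

```python
# Hedged sketch of the revenue model. Volume and per-pageview value are
# placeholder assumptions; the drop percentages are the measured results.
def revenue_cost(monthly_pageviews, value_per_pageview, pageview_drop):
    """Estimated monthly revenue lost for a given relative pageview drop."""
    return monthly_pageviews * value_per_pageview * pageview_drop


# Measured pageview drops from the test, keyed by seconds of added delay.
drops = {4: 0.1102, 8: 0.1752, 16: 0.2053, 20: 0.4419}

for delay, drop in drops.items():
    # Placeholder assumptions: 100M pageviews/month at £0.01 each.
    cost = revenue_cost(100_000_000, 0.01, drop)
    print(f"+{delay}s delay = roughly £{cost:,.0f}/month lost")
```

Plug in the real internal figures and every proposed tracker's latency cost converts directly into pounds.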
For the first time, the Product and Engineering teams could fight back in the only language that mattered: money. Every tracker had a cost. Every delay had a price.
The ad team's numbers weren't the only numbers in the room anymore.
In 2021, Google made page speed an official ranking factor with Core Web Vitals. The argument we had been making internally at The Telegraph became Google policy five years later.