These days, reading newspaper news online is an almost painful experience.
It's just generally a pain to read these news sites, because they are soooooo sloooooooow. Your whole browser stutters and coughs blood, and every click feels like a battle.
Here's a quick comparison (in no particular order) of various popular news websites, tested over a DSL connection:
| Site | Load Time | Requests | Bytes in | SpeedIndex |
|------|-----------|----------|----------|------------|
| BBC News homepage | 14.4s | 163 | 1,316 KB | 6935 |
| Los Angeles Times | 35.4s | 264 | 1,909 KB | 13530 |
| The New York Times | 28.6s | 330 | 4,154 KB | 17948 |
| New York Post | 40.7s | 197 | 6,828 KB | 13824 |
| USA Today | 19.6s | 368 | 2,985 KB | 3027 |
| The Washington Times | 81.7s | 547 | 12,629 KB | 18104 |
| The Verge | 18.9s | 152 | 2,107 KB | 7850 |
| The Huffington Post | 22.3s | 213 | 3,873 KB | 4344 |
Wow! That's an average of...
- 32.7 seconds per page
- 279 requests per page
- 4,475 KB per page
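As a sanity check, those averages can be recomputed directly from the table above. A minimal Python sketch (the per-site numbers are the WebPageTest results listed earlier):

```python
# Average the WebPageTest results from the table:
# (load time in seconds, requests, bytes in as KB) per site.
sites = {
    "BBC News homepage":    (14.4, 163, 1316),
    "Los Angeles Times":    (35.4, 264, 1909),
    "The New York Times":   (28.6, 330, 4154),
    "New York Post":        (40.7, 197, 6828),
    "USA Today":            (19.6, 368, 2985),
    "The Washington Times": (81.7, 547, 12629),
    "The Verge":            (18.9, 152, 2107),
    "The Huffington Post":  (22.3, 213, 3873),
}

n = len(sites)
avg_load = sum(load for load, _, _ in sites.values()) / n
avg_requests = sum(reqs for _, reqs, _ in sites.values()) / n
avg_kb = sum(kb for _, _, kb in sites.values()) / n

print(f"{avg_load:.1f} seconds per page")      # 32.7
print(f"{avg_requests:.0f} requests per page")  # 279
print(f"{avg_kb:,.0f} KB per page")             # 4,475
```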
That is just too much. I appreciate that the news companies behind these sites need to make a living, and most of the slowdown is explained by the ads. Almost always, clicking to open one of those pages means sitting and waiting. Even loading it in a background tab takes so many resources that you feel it in your other tabs. And by the time you switch over and start scrolling, your computer coughs blood and cries for mercy.
But what about the user? Imagine simply pulling out some of the analytics trackers, opting for plain images for the ads, and stripping the documents down to the minimum. Google Search does this, and they seem to do OK in terms of ad revenue.
An interesting counter-example: this page on nytimes.com is a long article, and when you load it on WebPageTest it yields 163 requests. But if you open your browser's dev tools and watch the network traffic as you slowly scroll down the length of the article to the footer, it makes an additional 331 requests. It starts at about 2 MB of data, but by the time you've scrolled past all the ads and links it has amassed about 5.5 MB. I think that's fair. At least the initial feeling when you arrive on the page isn't one of disgust.
And then it hit me!
My parting advice to newspapers who can do something about their website: radically refactor your code. Question EVERYTHING, and delete, remove, and clean up anything that doesn't add value. You might lose some stats and you might not be able to show as many ads, but if your site loads faster you'll get a lot more visitors and a lot more potential to earn bigger $$$ on fewer, more targeted ads.
But I can so easily see how it happens. A middle manager whose job is to increase revenue by 5% says to the developers, "One little extra tracker can't hurt. It's only 100 lines of code and one small PNG." It's hard to argue against that, because it's true.
It may be true, but it's also true that by spending that time optimizing the site instead, or at least first, you can make even more profit from users choosing to explore, and possibly return to, your site.
That's kinda the point of the article. Faster site, happier users, more profit.