News sites suck in terms of performance

08 August 2015   5 comments   Web development

These days, reading newspaper news online is almost a painful experience.

It's just generally a pain to read these news sites, because they are soooooo sloooooooow. Your whole browser stutters and coughs blood, and every click feels like a battle.

The culprit is generally that these sites are so full of crap: click trackers, ads, analytics trackers, videos, lots of images and heavy JavaScript.

Here's a quick comparison (in no particular order) of various popular news websites (tested over a DSL connection):

Site                   Load Time   Requests   Bytes In    SpeedIndex
BBC News homepage      14.4s       163        1,316 KB    6935
Los Angeles Times      35.4s       264        1,909 KB    13530
The New York Times     28.6s       330        4,154 KB    17948
New York Post          40.7s       197        6,828 KB    13824
USA Today              19.6s       368        2,985 KB    3027
The Washington Times   81.7s       547        12,629 KB   18104
The Verge              18.9s       152        2,107 KB    7850
The Huffington Post    22.3s       213        3,873 KB    4344
CNN                    45.9s       272        5,988 KB    12579

Wow! That's an average of 34.2 seconds of load time, 278 requests and about 4,643 KB (4.5 MB) downloaded per site.
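If you want to double-check that arithmetic, here's a quick TypeScript sketch that recomputes the averages; the only input is the numbers already listed in the table above.

```typescript
// Recompute the averages from the table above.
// Each row is [load time in seconds, requests, bytes in (KB), SpeedIndex].
const rows: [number, number, number, number][] = [
  [14.4, 163, 1316, 6935],   // BBC News homepage
  [35.4, 264, 1909, 13530],  // Los Angeles Times
  [28.6, 330, 4154, 17948],  // The New York Times
  [40.7, 197, 6828, 13824],  // New York Post
  [19.6, 368, 2985, 3027],   // USA Today
  [81.7, 547, 12629, 18104], // The Washington Times
  [18.9, 152, 2107, 7850],   // The Verge
  [22.3, 213, 3873, 4344],   // The Huffington Post
  [45.9, 272, 5988, 12579],  // CNN
];

const avg = (col: number): number =>
  rows.reduce((sum, row) => sum + row[col], 0) / rows.length;

console.log(`load time:  ${avg(0).toFixed(1)}s`);     // 34.2s
console.log(`requests:   ${Math.round(avg(1))}`);     // 278
console.log(`bytes in:   ${Math.round(avg(2))} KB`);  // 4643 KB
console.log(`SpeedIndex: ${Math.round(avg(3))}`);     // 10905
```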

That is just too much. I appreciate that the companies that make these sites need to make a living, and that most of the slow-down is explained by the ads. But almost always, opening one of those pages means sitting and waiting. Loading it in a background tab takes so many resources that you feel it in your other tabs. And by the time you actually view it and start scrolling, your computer starts coughing blood and crying for mercy.

But what about the user? Imagine if you simply pulled out some of the analytics trackers, opted for simple images for the ads and stripped the documents down to the minimum? Google Search does this and they seem to do OK in terms of ad revenue.

An interesting counter-example: this page on nytimes.com is a long article, and when you load it on WebPageTest it yields a result of 163 requests. But if you open your browser dev tools and keep an eye on all network traffic as you slowly scroll down the length of the article, all the way to the footer, you notice it downloads an additional 331 requests. It starts with about 2 MB of data, but by the time you've scrolled to the end, past all the ads and links, it has amassed about 5.5 MB. I think that's fair. At least the initial feeling when you arrive on the page isn't one of disgust.
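For what it's worth, that kind of deferred loading isn't hard to do. Here's a minimal sketch of the idea, fetching ad slots only as the reader scrolls near them; the .ad-slot selector, the data-ad-src attribute and the function names are made up for illustration, not anything nytimes.com actually uses.

```typescript
// Minimal sketch of scroll-triggered lazy loading: below-the-fold slots are
// not fetched at page load, only when the reader scrolls close to them.
// ".ad-slot" and "data-ad-src" are hypothetical names, for illustration only.
const pending = Array.from(document.querySelectorAll<HTMLElement>(".ad-slot"));

function nearViewport(el: HTMLElement, margin = 200): boolean {
  return el.getBoundingClientRect().top < window.innerHeight + margin;
}

function loadNearbySlots(): void {
  for (const slot of pending.slice()) {
    if (nearViewport(slot)) {
      const img = document.createElement("img");
      img.src = slot.dataset.adSrc ?? ""; // the network request happens only now
      slot.appendChild(img);
      pending.splice(pending.indexOf(slot), 1); // each slot loads exactly once
    }
  }
}

window.addEventListener("scroll", loadNearbySlots);
loadNearbySlots(); // handle slots already visible on first paint
```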

Another realization I've had whilst working on this summary is that oftentimes, sites that are REALLY really slow and horrible to use don't necessarily have a huge number of external resources from different domains; they just have far, far too much JavaScript. This random page on The Washington Times, for example, has 209 JavaScript files that together weigh 9.4 MB (roughly 8 times the amount of image data). Not only does all of that need to be downloaded, it also needs to be parsed, and I bet a zillion event handlers and DOM manipulators kick in and basically make scrolling a crying game even on a fast desktop computer.
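You can do this tally yourself on any page with the Resource Timing API. A rough sketch to paste into the dev tools console (note that transferSize is reported as 0 for cross-origin resources that don't send a Timing-Allow-Origin header, so treat the totals as a floor):

```typescript
// Rough tally of requests and transferred bytes per resource type,
// using the Resource Timing API. Paste into the dev tools console.
const totals = new Map<string, { count: number; kb: number }>();

const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
for (const entry of entries) {
  const type = entry.initiatorType || "other"; // "script", "img", "link", ...
  const bucket = totals.get(type) ?? { count: 0, kb: 0 };
  bucket.count += 1;
  bucket.kb += entry.transferSize / 1024;
  totals.set(type, bucket);
}

for (const [type, { count, kb }] of totals) {
  console.log(`${type}: ${count} requests, ${kb.toFixed(0)} KB`);
}
```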

And then it hit me!

We've all been victims of websites annoyingly trying to lure us into installing their native app when we visit on a smartphone. The reason they do that is that they've been given a second chance to build something without eleventeen iframes and 4 MB of JavaScript, so the native app feels a million times better, because they've started over and gotten things right.

My parting advice to newspapers who can do something about their website: radically refactor your code. Question EVERYTHING and delete, remove and clean up things that don't add value. You might lose some stats and you might not be able to show as many ads, but if your site loads faster you'll get a lot more visitors and a lot more potential to earn bigger $$$ on fewer, better-targeted ads.

Comments

Chris Adams
Scary as those performance numbers are, it's even more dramatic if you disable JavaScript and many sites lose nothing of value to the user. If I worked for a media company I'd be rather worried that the collective disregard for the user experience will lead to widespread blocking; a little respect and reconsidering business models would go a long way towards not poisoning that well.
Peter Bengtsson
But I can so easily see it happen. A middle-manager whose job is to increase revenue by 5% is going to say to the developers "One little extra tracker can't hurt. It's only 100 lines of code and one small png." It's hard to argue against that because it's true.
Damian J Pound (Chinoto Vokro)
It may be true, but it's also true that by using that time to optimize the site instead/first, you can make even more profit from users choosing to explore and possibly return to your site.
Peter Bengtsson
That's kinda the point of the article. Faster site, happier users, more profit.
Juib Morrowind
I have been wondering why this issue has not been discussed more. Most websites are so heavily loaded with JavaScript that even the best hardware cannot give good performance. Just try scrolling down the comments section of any news website. Continuous ad refreshes and dancing monkeys will get your page stuck for a few seconds every time.


