I just rolled out a change here on my personal blog which I hope will make my few visitors happy.

Basically: when you hover over a link (a local link) long enough, it prefetches it (with AJAX) so that if you do click, the page is hopefully already cached in your browser.

If you hover over a link and almost instantly hover out, it cancels the prefetch. The assumption is that if you deliberately put your mouse cursor over a link, you intend to click it. Because your hand is relatively slow, I'm using that opportunity to prefetch the page even before you've clicked. Some hands are quicker than others, so it's not going to help the really quick clickers.

What I also had to do was set a Cache-Control header of 1 hour on every page, so that the browser knows it's allowed to cache it.

The effect is that when you finally do click the link, the browser can hopefully render it straight from its cache, so the page becomes visually ready faster.

Let's try to demonstrate this with this horrible animated gif:
(or download the screencast.mov file)

Screencast
1. Hover over a link (in this case the "Now I have a Gmail account" from 2004)
2. Notice how the Network panel preloads it
3. Click it after a slight human delay
4. Notice that when the clicked page is loaded, it's served from the browser cache
5. Profit!

So the code that does this is quite simple:

$(function() {
  var prefetched = [];
  var prefetch_timer = null;
  $('div.navbar, div.content').on('mouseover', 'a', function(e) {
    // `this` is the <a> element itself, even when the mouse enters a child node
    var value = $(this).attr('href');
    // only prefetch local links ("/...") we haven't already prefetched
    if (value && value.indexOf('/') === 0 && prefetched.indexOf(value) === -1) {
      if (prefetch_timer) {
        clearTimeout(prefetch_timer);
      }
      // wait 200ms so a quick mouse-through doesn't fire a request
      prefetch_timer = setTimeout(function() {
        $.get(value, function() {
          // callback necessary for $.ajax to start the request :(
        });
        prefetched.push(value);
      }, 200);
    }
  }).on('mouseout', 'a', function(e) {
    // hovered out almost immediately: cancel the pending prefetch
    if (prefetch_timer) {
      clearTimeout(prefetch_timer);
      prefetch_timer = null;
    }
  });
});

Also, available on GitHub.

I'm excited about this change for a number of reasons:

  1. On mobile, where you might be on a non-wifi data connection, you don't want this. There, the onmouseover mouse event never fires, so people on such devices don't "suffer" from this optimization.
  2. It only downloads the HTML, which is quite light compared to static assets such as pictures, but it warms up the server-side cache if need be.
  3. It's much more targeted than a general prefetch meta header.
  4. Most likely, content will appear rendered to your eyes faster.
voracity - 21 August 2014
Nice. Something I've tried in the past is to execute the page navigation on mousedown (rather than waiting for the mouseup, as happens by default). (With something like $("a").mousedown(function() { location.href = $(this).attr("href") }).) Curious to know what the combo of the two would be like.
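The combo being suggested could be sketched roughly like this (purely illustrative, not code from the post; the selector is made up and jQuery is assumed to be loaded, hence the guard):

```javascript
// Local links, as in the post, are ones whose href starts with "/".
function isLocalLink(href) {
  return typeof href === 'string' && href.indexOf('/') === 0;
}

if (typeof $ !== 'undefined') {
  $('div.content').on('mouseover', 'a', function () {
    var href = $(this).attr('href');
    if (isLocalLink(href)) {
      $.get(href); // warm the browser cache while the hand is still moving
    }
  }).on('mousedown', 'a', function () {
    var href = $(this).attr('href');
    if (isLocalLink(href)) {
      // navigate on press instead of release, saving the press-release gap
      location.href = href;
    }
  });
}
```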
Peter Bengtsson - 21 August 2014
I'm not convinced by that at all. With any button or link, if you change your mind after pressing down, you can move the cursor away and release the click outside its original place, which cancels the action.
sam - 21 August 2014
Why not implement this feature in firefox, so every site can benefit?
Christian Heilmann - 21 August 2014
It is a nice idea, but I am not sure about the benefits. Many times I hover over a link to see if it is suspicious; preloading that one would be a bad plan (even if it is only the HTML). Clicking a link is a conscious decision - I want to go there. I should not have lots of content loaded in the background that I might not want, if for example I scroll with two fingers. This very much annoyed me at Facebook lately - playing videos just because your mouse hovered over them, which made scrolling very slow indeed.

It is a nice idea, but I think our efforts are spent more wisely on making the overall load of a page smaller and quicker rather than on prefetching. Replacing an "I spy on your users" like button with a simple link pointing to that social service's REST API, for example, saves a lot of bytes.
Peter Bengtsson - 21 August 2014
But loading videos has a significant effect that some percentage of people don't like. I can't imagine many people NOT liking faster loading times.

To be honest, I'm not 101% convinced about this hack myself. That's why I'm using my personal blog as an experimental ground. Let's see how it feels as it's brewing.
Anon - 21 August 2014
Hi,
Really interesting idea, thanks!

It inspired me to create a rather similar (but enhanced) Greasemonkey/Tampermonkey script that works with any website (and additionally prefetches cross domain links):

https://greasyfork.org/scripts/4397-prefetch-links-when-hovered

There's one problem, though! Some links may actually be dangerous to prefetch, as they invoke actions that are not desired by the user: for example "log in/out", delete, confirm, send, etc. I added a basic filter for these (English only for now), but there's still a lot of work to be done.
Peter Bengtsson - 21 August 2014
Add that to the list of reasons why Log out should ideally always require a POST request.
Robert Kaiser - 21 August 2014
The problem with this is that it makes you be tracked even more firmly, spreading your fingerprint and other info even further, even when you end up not clicking a link. I, for example, often hover over a link to see the URL so I can decide if I even want to go there - there are sites I just do not want to visit or load. This behavior is completely subverted by your approach. Of course, you might argue that my habit is completely non-standard behavior.
Peter Bengtsson - 21 August 2014
Point taken. Granted, in my implementation I only do this for links here on my site, by only applying it to links whose href URL starts with a "/". In that sense it doesn't spread your fingerprint across the web.
Peter Bengtsson - 21 August 2014
I wonder what Google, Bing, DuckDuckGo, etc. do? They know that when people make a search, they will most definitely click the first result link.
Anon - 22 August 2014
Hi Peter,

I thought a bit about this and realized that even if I manage to identify and prevent 90% of undesired/dangerous links from being fetched, there would still be a good chance of disastrous outcomes (user logged out, files deleted, embarrassing message sent, missiles launched, etc.).

So I came up with a different, more limited (though safer) but still effective idea:

SSL Accelerator:
https://greasyfork.org/scripts/4419-ssl-accelerator

It doesn't prefetch link content, but it shortens load times for secure sites by cutting out the lengthy SSL/TLS handshake. Basically, what the script does is send a HEAD request to the base path of the URL's host when a secure link is hovered. In general there shouldn't be any danger in doing that, and most sites won't even track it (aside from it appearing in the server log, perhaps). When a link on the (pre-handshaked) host is eventually opened, it will probably perform an abbreviated handshake, or none at all (I need to observe the raw packets to check this; haven't got to it yet).
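The core of that idea could be sketched like this (illustrative only, not the actual userscript; jQuery is assumed to be loaded, hence the guard):

```javascript
// Extract "https://host[:port]" from an absolute secure URL, or null.
function originOf(href) {
  var match = /^(https:\/\/[^\/]+)/.exec(href || '');
  return match ? match[1] : null;
}

if (typeof $ !== 'undefined') {
  $(document).on('mouseover', 'a', function () {
    var origin = originOf($(this).attr('href'));
    if (origin && origin !== location.origin) {
      // HEAD keeps the response tiny; we only want the TLS handshake done
      $.ajax({ url: origin + '/', type: 'HEAD' });
    }
  });
}
```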
Andy Huang - 26 August 2014
I'm wondering how much of the benefit is real, and how much of the benefit is placebo effect?

I'm more of an avid laptop user, and I live exclusively in trackpad land. That means there's no mouse for me to fumble around with, so I'm not idly moving my cursor over links. Instead, I click a link immediately after moving onto it, leaving very little time for the contents to preload.

Additionally, with the inspector open, there is an observable delay (if nothing else, at least to gamers, so <1/30th of a second?) between hovering over the link and network activity starting.

So, unless the user is slow or has a crazy good connection to your server, I think it is entirely possible that the preload doesn't finish between entering the link and clicking, and a new request gets made instead of pulling from the cache, thus using more bandwidth?
Peter Bengtsson - 26 August 2014
It's not for every clicker; really fast clickers won't get the benefit.
Also, if the prefetch starts but doesn't finish before the click happens, the browser terminates the request, and that's fine.

My next endeavor ought to be to measure what percentage of visitors benefit from this prefetching. I wonder if it'd be possible to do with Google Analytics.
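One hypothetical way to do that: use the Navigation Timing API to see how fast the navigation was answered, and report it as a Google Analytics event via analytics.js's ga(). The 50ms threshold and the event names here are guesses, not measured values:

```javascript
// A response answered from the browser cache typically takes only a few ms.
function classifyLoad(responseMs) {
  return responseMs < 50 ? 'likely-cache-hit' : 'likely-cache-miss';
}

if (typeof window !== 'undefined' && window.performance && window.performance.timing) {
  var t = window.performance.timing;
  var responseMs = t.responseEnd - t.requestStart;
  if (typeof ga === 'function') {
    ga('send', 'event', 'prefetch', classifyLoad(responseMs));
  }
}
```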

