Aggressively prefetching everything you might click

August 20, 2014
13 comments This site, Web development, JavaScript

I just rolled out a change here on my personal blog which I hope will make my few visitors happy.

**Basically: when you hover over a local link long enough, it prefetches the page (with AJAX) so that if you do click, it's hopefully already cached in your browser.**

If you hover over a link and almost instantly hover out, it cancels the prefetching. The assumption here is that if you deliberately put your mouse cursor over a link and proceed to click on it, you want to go there. Because your hand is relatively slow, I'm using the opportunity to prefetch the page even before you've clicked. Some hands are quicker than others, so it's not going to help the really quick clickers.

What I also had to do was set a Cache-Control header of 1 hour on every page so that the browser knows it can cache it.

The effect is that when you do finally click the link, your browser will hopefully be able to render it from its cache, and thus it becomes visually ready faster.

Let's try to demonstrate this with a horrible animated gif:
(or download the file)

1. Hover over a link (in this case the "Now I have a Gmail account" from 2004)
2. Notice how the Network panel preloads it
3. Click it after a slight human delay
4. Notice that when the clicked page is loaded, it's served from the browser cache
5. Profit!

So the code that does this is quite simply:

$(function() {
  var prefetched = [];
  var prefetch_timer = null;
  $('div.navbar, div.content').on('mouseover', 'a', function(e) {
    var value = $(this).attr('href');
    // only prefetch local links, and only once per URL
    if (value && value.indexOf('/') === 0) {
      if (prefetched.indexOf(value) === -1) {
        if (prefetch_timer) {
          clearTimeout(prefetch_timer);
        }
        prefetch_timer = setTimeout(function() {
          $.get(value, function() {
            // necessary for $.ajax to start the request :(
          });
          prefetched.push(value);
        }, 200);
      }
    }
  }).on('mouseout', 'a', function(e) {
    if (prefetch_timer) {
      clearTimeout(prefetch_timer);
    }
  });
});

Also, available on GitHub.
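For the curious, the same idea can be sketched without jQuery, using plain DOM APIs and fetch (a modern rewrite for illustration, not the code from the post; the doFetch callback is just there to keep the logic testable):

```javascript
// Hover-prefetch: schedule a fetch 200ms after hovering a local link,
// cancel it if the cursor leaves before the timer fires.
const prefetched = [];
let prefetchTimer = null;

// Local links only (href starting with "/"), never the same URL twice.
function worthPrefetching(href) {
  return typeof href === 'string' &&
         href.indexOf('/') === 0 &&
         prefetched.indexOf(href) === -1;
}

function onLinkHover(href, doFetch) {
  if (!worthPrefetching(href)) return;
  clearTimeout(prefetchTimer);
  // Wait 200ms so a cursor merely passing over a link doesn't count.
  prefetchTimer = setTimeout(() => {
    prefetched.push(href);
    doFetch(href);
  }, 200);
}

function onLinkOut() {
  clearTimeout(prefetchTimer);
}

// Browser wiring (guarded so the sketch also loads outside a browser):
if (typeof document !== 'undefined') {
  document.addEventListener('mouseover', (e) => {
    const a = e.target.closest('a[href]');
    if (a) onLinkHover(a.getAttribute('href'), (url) => fetch(url));
  });
  document.addEventListener('mouseout', onLinkOut);
}
```

The fetched response lands in the HTTP cache thanks to the Cache-Control header, so the later navigation reuses it.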

I'm excited about this change because of a couple of reasons:

  1. On mobile, where you might be on a non-wifi data connection, you don't want this. There, the onmouseover event never triggers, so people on such devices don't "suffer" from this optimization.
  2. It only downloads the HTML, which is quite light compared to static assets such as pictures, but it warms up the server-side cache if need be.
  3. It's much more targeted than a general prefetch meta header.
  4. Most likely, content will appear rendered to your eyes faster.

Wattvision - real-time energy monitoring

April 27, 2014
2 comments This site

The camera sensor
Last weekend I installed a Wattvision ("real-time energy monitoring sensors") in my house. It's so you can measure how much electricity your house is using. In real-time.

So it comes in two parts:

1) A camera sensor that is attached to the electricity meter. It stares at the rotating disk all day.

2) A little router/sensor thing that is connected to the camera and connects, by Wi-Fi, to your home router.

Then, the little router/sensor sends all your measurements to Wattvision's servers. After that, I sign in to Wattvision (using my Google account) and there I can get all the statistics about my house's electricity. Simple, eh?

Down in my basement
Wattvision started as a Kickstarter project two years ago and since I sponsored that project they sent me a kit now that it's fully tested and working. Yay!

The installation was almost laughably simple! It had that lovely "just works" feeling to it. The only challenging part was pulling the sensor wire from the corner of the house to a good spot in our basement. My wife, who is much shorter than me, crawled into our crawl space and helped me hook it all up. I was just so impressed with the instructions. They were very well written.

Now that it's set up, you get all your statistics and graphs by signing in to Wattvision, and it works great on mobile as well. I have to admit, at this point, I haven't really understood what it all does and what it all means. Besides, because I only installed it a week ago, I don't yet have enough data to compare current usage with historic usage. By the way, you can download your data in CSV form too.

The sexiest feature is being able to sit and watch your graph while you deliberately switch something on in the house and see the graph "spike". Obviously the height of the spike depends on what you're switching on. An LED light, for example, I don't think even registers (admittedly, I haven't tested that yet).

I think this is the key reason to have Wattvision; to get an insight into what in your household causes the most energy consumption. Having said that, we're not going to stop taking showers.

In conclusion...

Comparison chart
You simply can't have data analysis without data collection. Also, if there's anything you want to trim, such as body fat, awareness is usually a very good weapon.

I don't know if I'll be checking back into the statistics very often. The novelty might just wear off after a while. We'll see.

10 years of blogging

September 19, 2013
2 comments This site

I'm now off by about two months but in June 2003 I posted my first ever blog post.

My first website was launched in 1997 but that one is long lost. The next version, which actually used a database and a real web framework was launched in 2001 and this is the oldest screenshot I could find.

A really old version of my blog
Back then the site was built in Zope, which at the time was the coolest shit you could possibly use. Back in 2003 I was renting a room in an apartment in London while studying at City University. The broadband (Americans know this as DSL) we had had a static IP address, so I could tie my domain name directly to my bedroom, basically. If you were born in the nineties or later you won't remember this, but for almost 20 years you could either buy a laptop (small but slow) or a stationary computer (clunky but fast), and the laptop I was running on was no exception. Not to mention it was an abandoned laptop too. I think it had about 8 MB of RAM. I ran a stripped-down version of Debian on it without any graphical interface. I managed the code by scp'ing files onto it from my Windows computer.

Anyway, running on a home DSL line on a rusty old laptop blinking away under my bed meant the site would be ultra-slow if I didn't pre-optimize it. And that is something I did. The site had a Squid cache in front of it, and the HTML, CSS and JavaScript were compressed by a script I wrote called slimmer.

Back in 2003 blogging was getting hotter than celebrity spotting, and I was very much interested in something that later became known as "SEO". The rumor at the time was that "blogs" got penalized by Google because blogs usually just re-posted stuff from real web pages. So I decided to prefix all my content with the word "plog". It was a mix of "p" for Peter and sufficiently different from the word "blog".

In the first couple of years of blogging I would blog about all sorts of stuff that caught my interest. Not just genuine thoughts or real technology notes but any fun link I came across. That became a massive trend later (and still is, I guess) with giants like Digg and Reddit, so I stopped doing that with my own blog. In the last 7 years (give or take) I've only blogged about things that are genuinely close to heart or something I've actually worked on.

Some stats:

Total number of blog posts: 949
Total number of approved blog comments: 8,086
Number of email addresses collected: 4,292
Maximum number of comments on any one post: 2,749
Number of Cease and Desist letters received: 1

To me, blogging used to be a form of shouting out to the world what I found interesting, in the hope that you'd also find it interesting and thank me for finding it. Now it's a way for me to either document something I've learned recently or make some other announcement related to what I do on some technical thing.
I wonder how this will change for me in the next 10 years.

What makes my website slow? DNS

October 23, 2009
14 comments This site, Linux

Pagetest web page performance test is a great tool for doing what Firebug does, but not in your browser. Pagetest can do repeated tests to iron out any outliers. An alternative is Pingdom Tools, which has some nifty sorting functions but is generally the same thing.

So I ran the homepage of my website on it and concluded that: Wow! Half the time is spent on DNS lookup!

First, second and third test runs

The server it sits on is located here in London, UK and the Pagetest test was made from a server also here in the UK. Needless to say, I was disappointed. Is there anything I can do about that? I've spent so much time configuring Squid, Varnish and Nginx and yet the biggest chunk is DNS lookup.

In a pseudo-optimistic fashion I'm hoping it's because I've made the site so fast that this is what's left when you've done all you can do. I'm hoping to learn some more about this "dilemma" without having to read any lengthy manuals. Pointers welcomed.
