23 October 2009 · 14 comments · This site, Linux
Pagetest is a great web page performance testing tool that does roughly what Firebug does, but outside your browser. Pagetest can run repeated tests to iron out any outliers. An alternative is Pingdom Tools, which has some nifty sorting functions but is essentially the same thing.
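To make the repeated-tests idea concrete, here is a minimal sketch (in Python, with a placeholder URL and run count) that times the same fetch several times and reports the median, so a single slow outlier doesn't skew the result:

```python
# A minimal sketch of running the same test several times and taking the
# median, so one slow outlier doesn't skew the result.
# The URL and number of runs are placeholders.
import statistics
import time
import urllib.request

URL = "https://www.peterbe.com/"  # placeholder URL
RUNS = 5

timings = []
for _ in range(RUNS):
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as response:
        response.read()
    timings.append(time.perf_counter() - start)

print("runs:  ", " ".join("%.3fs" % t for t in timings))
print("median: %.3fs" % statistics.median(timings))
```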
So I ran my website's homepage through it and the conclusion was: Wow! Half the time is spent on DNS lookup!
The server it sits on is located here in London, UK, and the Pagetest run was made from a server also in the UK. Needless to say, I was disappointed. Is there anything I can do about that? I've spent so much time configuring Squid, Varnish and Nginx, and yet the biggest chunk is DNS lookup.
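One way to sanity-check a result like that is to time the DNS lookup separately from the TCP connect. A rough sketch (the hostname and port are placeholders; TLS, redirects and the actual HTTP transfer are ignored):

```python
# A rough sketch of timing the DNS lookup separately from the TCP connect.
# The hostname and port are placeholders; TLS, redirects and the actual
# HTTP transfer are ignored.
import socket
import time

HOST = "www.peterbe.com"  # placeholder hostname
PORT = 80

t0 = time.perf_counter()
ip = socket.gethostbyname(HOST)                           # DNS lookup only
t1 = time.perf_counter()
sock = socket.create_connection((ip, PORT), timeout=10)   # TCP connect only
t2 = time.perf_counter()
sock.close()

print("DNS lookup:  %.1f ms" % ((t1 - t0) * 1000))
print("TCP connect: %.1f ms" % ((t2 - t1) * 1000))
```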
In a pseudo-optimistic fashion I'm hoping it's because I've made the site so fast that this is what's left when you've done all you can do. I'm hoping to learn some more about this "dilemma" without having to read any lengthy manuals. Pointers welcomed.
It would be very interesting to know how you got this far in optimizing it.
This delay should only happen when your nearest DNS server doesn't have the domain cached.
You have a 900-second TTL on www.peterbe.com, which means that local DNS servers need to ask the domain's nameservers again quite often... (see the TTL-check sketch after this comment)
BTW I tried it on one of my static sites and got a lot of bogus results. But the DNS resolving time for my .se address, hosted on my own TinyDNS, was a third of yours with the same UK test server as you used (I think). Maybe .SE is faster than .COM too.
BTW2: We met at Euro DjangoCon, where we had a short lunch together one day at the conference.
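The 900-second TTL mentioned above is easy to verify. A minimal sketch using the third-party dnspython package (an assumption; `dig www.peterbe.com` gives the same information):

```python
# A small sketch for checking a record's TTL, assuming the third-party
# dnspython package is installed (pip install dnspython). An answer served
# from your resolver's cache shows the TTL counting down; a fresh answer
# shows the full TTL configured on the zone (900 seconds in the case above).
import dns.resolver

answer = dns.resolver.resolve("www.peterbe.com", "A")
for record in answer:
    print(record.address)
print("TTL: %d seconds" % answer.rrset.ttl)
```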
Do you have nscd (Name Service Cache Daemon) installed on the machine? I guess it would help.
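A quick way to see whether a local cache such as nscd (or your upstream resolver's cache) is kicking in is to time the same lookup a few times in a row; this is just a sketch with a placeholder hostname:

```python
# A quick sketch to see whether a local cache (such as nscd) or the upstream
# resolver's cache is helping: time the same lookup a few times in a row.
# With effective caching, the first call is the slow one and the repeats
# come back almost instantly. The hostname is a placeholder.
import socket
import time

HOST = "www.peterbe.com"  # placeholder hostname

for i in range(3):
    start = time.perf_counter()
    socket.getaddrinfo(HOST, 80)
    elapsed = (time.perf_counter() - start) * 1000
    print("lookup %d: %.1f ms" % (i + 1, elapsed))
```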
We're working on making the connectivity in the UK location more like what we have in the US and NZ locations, where it is more in line with a consumer experience.
http://gtmetrix.com
http://tutor.rs