I've now put all the static resources behind this site on AWS CloudFront. For example this: http://static.peterbe.com/misc_/Peterbecom/home/grey_face.1282513695.png
This site doesn't really have much traffic. About 50,000 pageviews per month. The bill for the last month: $0.55
I think I can afford that :)
What makes my website slow? DNS
23 October 2009
This site, Linux
Pagetest web page performance test is a great tool for doing what Firebug does but not in your browser. Pagetest can do repeated tests to iron out any outliers. An alternative is Pingdom tools which has some nifty sorting functions but is generally the same thing.
So I ran the homepage of my website on it and concluded that: Wow! Half the time is spent on DNS lookup!
The server it sits on is located here in London, UK and the Pagetest test was made from a server also here in the UK. Needless to say, I was disappointed. Is there anything I can do about that? I've spent so much time configuring Squid, Varnish and Nginx and yet the biggest chunk is DNS lookup.
In a pseudo-optimistic fashion I'm hoping it's because I've made the site so fast that this is what's left when you've done all you can do. I'm hoping to learn some more about this "dilemma" without having to read any lengthy manuals. Pointers welcomed.
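To get a rough feel for how big the DNS chunk really is, you can time just the lookup step on its own. A minimal sketch (the hostname is only an example; run it against a cold resolver cache for a realistic number):

```python
import socket
import time

def time_dns_lookup(hostname, port=80):
    """Return the number of seconds spent resolving hostname to an address."""
    start = time.time()
    socket.getaddrinfo(hostname, port)
    return time.time() - start

# e.g. time_dns_lookup('www.peterbe.com')
# Compare that against the total page load time a tool like Pagetest reports.
```

Note that the second call for the same name is usually much faster, because the OS or a local resolver caches the answer.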
Every week I get an email via this website from someone who wants me to help them hack something. I've written things about the subject "hacking" but that doesn't make me a hacker. I'm not a hacker. Here's this week's nutter email I got:
im suraj from India.
Actually i want to b a computer expert.There must b nothin wth the coputer tht i cant do.so i think it can b done only wth a hacker.So can u plz help me wth this.pPleae tel me wht i hav to learn.
Does that make any sense?
Yesterday I got another one of these:
dear sir, yesterday some body stolen my cell cell number 9848133384 please search for the same and freeze my account so that no body can use it thank you yours faithfully kgdeekshitulu
It seems that a lot of people in India lose their mobile phones these days, and that a lot of people in India think that by emailing me about it I can do something about it.
I'm just a blogger. I don't have a machine that can cancel mobile SIMs.
I've recently registered and set up a new domain name for this website. It's peterbe.mobi and it's there to be the mobile version of the pages.
I've also made some slight improvements to the CSS for mobile and removed the category images on the blog items. There is so much more that I could do but I just haven't had time. For example, there's no search on the mobile version.
The screenshot here to the right is from a Firefox extension I have called "Small Screen Rendering", which is useful if you want a guesstimate of how your page might appear in a browser with a very small screen. Immediate conclusion: there's a hell of a lot of scrolling :)
Photos from FWC China 2005
27 January 2006
Photos, This site
I've now finally uploaded all my photos from the trip to China.
From 1,000 huge JPEGs (at 1.6 GB) down to 300 resized ones (at 43 MB), it took quite a long time to rotate, choose, colour-modify and title them. To do it I had to use digikam, which is the best photo album organiser program available on Linux. Even though it's the best I've found so far, it still sucks. It's frustrating when you have lots to do, but it's free and works better than nothing, and I haven't donated any money to it.
As you might have noticed I have had to reduce the image quality quite a bit especially of the thumbnails. Sorry about this but I see thumbnails as navigation, not the real content. If you want higher resolution images I might be able to get you the original JPG if you ask kindly for it.
At the time of writing, if you do any of these searches on google:
...you'll notice two patterns:
- My site is no. 1 on all three
- They're all badly spelled
I noticed this because these are referrals from my logs that I've backtracked and listed. In all cases they're unintentional misspellings, either by myself or by people commenting within the pages. If it weren't for these "double mistakes", certain people would never have reached my site. You can't control how badly people spell (especially when it comes to names) but you can control your own content.
In the case of "7 wounders of the world" I could have deliberately included the misspelling "wounders" to catch both those who can spell and those who can't, thus increasing my reach via Google.
Personally I think you should, as a web master/content developer, avoid as many misspellings as possible and not try any cheap tricks, but perhaps in some extreme cases this phenomenon can turn to your advantage. Interesting.
We'll see what effect this might have and if it's worth it. I guess 99% of all visitors to this site get it right, but this tightens the "fool-proofness" even more. Google has one such alias set up on ww.google.com but not wwww.google.com
At the moment I'm not running Squid for this site, but if experimentation time permits I'll have it running again soon. One thing I feel uneasy about is how to "manually" purge cached pages that need to be updated. For example, if you read this page (and it's cached for one hour) and post a comment, then I'd like to re-cache this page with a purge. Setting an HTTP header could be something, but that I would only be able to do on the page where you have this in the URL:
which, because of the presence of a querystring, is not necessarily cached anyway. The effect is that as soon as the "?msg=Comment+added" is removed from the URL, the viewer will see the page as it was before she posted her comment.
squidclient might be the solution. ...sort of.
squidclient is an executable program that you get when you install the squid cache server. As described in this documentation you can manually purge any cache on a site, which would have the desired effect for the problem mentioned above. The only problem is that I get about 30 to 60 posted comments per day and that would be a hell of a lot of command line calls to squid. Secondly, they'd probably be quite slow and the person posting a comment won't be prepared to wait that long. The code for this would be something like this:
# inside a Zope method; assumes `import os` at module level
cmd = 'squidclient -m PURGE %s' % self.absolute_url()
result = os.popen4(cmd)[1].read()  # popen4 returns (stdin, stdout+stderr)
return result.find('200 OK') > -1
(obviously this would need to be wrapped in some security assertions)
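One way to cover part of that, sketched here on the assumption that squidclient is on the PATH and that a newer Python with the subprocess module is available: passing the arguments as a list rather than interpolating the URL into a shell string sidesteps the shell-injection worry.

```python
import subprocess

def build_purge_command(url):
    # list-form arguments, so there's no shell quoting/injection to worry about
    return ['squidclient', '-m', 'PURGE', url]

def purge_url(url):
    """Ask the local squid to purge one URL; True if it answered 200 OK."""
    proc = subprocess.Popen(build_purge_command(url),
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    output = proc.communicate()[0].decode('utf-8', 'replace')
    return '200 OK' in output
```

It still forks a new process per purge, though, so the speed problem stays.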
An even better solution exists only as a dream. A Python binding for squidclient that I can use directly from my Zope Python code:
# <pseudocode>
import pysquidclient
server = pysquidclient.Server('localhost', 80)
r = server.purgeURL(self.absolute_url())
return r.isOK()
# </pseudocode>
Imagine that! You could create the server instance once for the duration of the Zope server's run and just call the purgeURL() function multiple times, thus saving looooads of time. I guess it might be worth testing the os.popen4() method to see if it works. If it doesn't, then maybe it's time to start looking at ESI
Thanks Kevin (and Seb) for pointing this out. The solution is something like this:
# inside a Zope method; assumes `import urlparse` and `import httplib`
(scheme, host, path, params, query, fragment) = urlparse.urlparse(objecturl)
h = httplib.HTTPConnection(host)
h.request('PURGE', path)
r = h.getresponse()
return r.status, r.reason
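Wrapped up as a standalone function it could look something like the sketch below. The try/except imports are only there so it runs on both old and new Pythons (httplib and urlparse were renamed in Python 3); everything else is the same idea: open a connection to the page's own host and send it a PURGE request.

```python
try:  # Python 2 module names
    import httplib
    import urlparse
except ImportError:  # renamed in Python 3
    import http.client as httplib
    import urllib.parse as urlparse

def split_url(objecturl):
    """Break an absolute URL into the (host, path) pair PURGE needs."""
    parts = urlparse.urlparse(objecturl)
    return parts.netloc, parts.path or '/'

def purge(objecturl):
    """Send a PURGE request for objecturl straight to its host."""
    host, path = split_url(objecturl)
    conn = httplib.HTTPConnection(host)
    conn.request('PURGE', path)
    response = conn.getresponse()
    return response.status, response.reason
```

Since it's a plain HTTP request from within the same process, there's no per-purge fork, which is exactly what made the squidclient approach feel too slow.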