tl;dr: Guetzli, the new JPEG compression program from Google, can save bytes with little loss of quality.
Inspired by this blog post about Guetzli, I thought I'd try it out with something that's relevant to my project: 300x300 JPEGs that can be heavily compressed.
So I installed it (with Homebrew) on my MacBook Pro (late 2013) and picked 7 JPEGs that I use in SongSearch. This is interesting because these JPEGs have already been compressed once: they were generated from much larger PNGs with PIL (Pillow) at a quality rating of 80%. In other words, this is Guetzli on top of already-compressed JPEGs.
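For context, the original PNG-to-JPEG conversion described above looks roughly like this with Pillow (a minimal sketch; the filenames, the 300x300 size, and the function name are illustrative, not taken from SongSearch's actual code):

```python
# Sketch: convert a large PNG to a small JPEG at quality 80 with Pillow.
# Filenames and dimensions are hypothetical examples.
from PIL import Image


def png_to_jpeg(src="large.png", dst="song.jpg", size=(300, 300), quality=80):
    img = Image.open(src)
    img = img.convert("RGB")  # JPEG has no alpha channel
    img.thumbnail(size)       # scale down in place, preserving aspect ratio
    img.save(dst, "JPEG", quality=quality)
```

Guetzli then runs on the output of that `save()` call, i.e. on an image that has already been through one lossy encode.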
I ran one iteration for every image for the following qualities: 85%, 90%, 95%, 99%, 100%.
| Image | Average Size (bytes) | % Smaller |
|-------|----------------------|-----------|
So, for example, if you choose the 90% quality you save, on average, 4,667B (4.6KB).
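The "% Smaller" column is simple arithmetic over the measured file sizes. As a sketch (the numbers in the example are illustrative, not from the table above):

```python
def percent_smaller(original_bytes, compressed_bytes):
    """Percentage saved relative to the original size."""
    return 100.0 * (original_bytes - compressed_bytes) / original_bytes


# e.g. 100,000 B down to 90,000 B is a 10% saving
percent_smaller(100_000, 90_000)  # -> 10.0
```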
As you might already know, Guetzli is incredibly memory hungry and very, very slow. Each image compression took on average 4-6 seconds (higher quality, shorter times). So if you like Guetzli, you probably need to build around it so that the compression happens in a build step or asynchronously somewhere, and ideally you don't want to run too many compressions in parallel, as that might overload CPU and memory.
Go to https://codepen.io/peterbe/pen/rmPMpm and stare at the screen to see if you can A) see which one is more compressed and B) if the one that is more compressed is too low quality.
Is it worth it?
Is the quality drop too much to save 10% on image sizes?
Please share your thoughts. Perhaps we can re-do this experiment with some slightly larger JPGs.