Batch Optimising Images on a Linux server

If you’re a web developer then you know how important it is to ensure that your webpages serve content as quickly as possible.

Whilst we can optimise our code, minify JavaScript and CSS, and combine files to reduce the number of HTTP requests (which should become less important as SPDY gains traction, although combined files still reduce disk I/O), the one thing that weighs far more than any of this is imagery.

To ensure fast downloading of images there are a number of things you can do to reduce image sizes, and, best of all, you can retroactively batch-apply them to images you've previously uploaded to your website.

Install tools

First make sure you’ve got the tools needed.

sudo apt-get install jpegoptim
sudo apt-get install optipng
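Before going any further, it's worth a quick sanity check that both binaries actually ended up on your PATH:

```shell
# Check both optimisers are installed and print where each one lives
for tool in jpegoptim optipng; do
  command -v "$tool" || echo "$tool not found"
done
```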


jpegoptim

jpegoptim is a great tool for lossy or lossless reduction of JPEG file sizes. By default the tool is lossless, and it allows you to reduce image size by deleting metadata.

jpegoptim --strip-all file.jpg

This command strips all comment and EXIF data from a JPEG file and should reduce the size of your images.
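To see exactly how much a given file shrinks, you can wrap the optimiser in a small helper that compares file sizes before and after. This is just a sketch: report_savings is a made-up name, and it assumes GNU stat's -c%s format for the size in bytes.

```shell
# Hypothetical helper: report bytes saved by any in-place optimiser.
# Usage: report_savings FILE COMMAND [ARGS...]
report_savings() {
  local file=$1; shift
  local before after
  before=$(stat -c%s "$file")   # size in bytes before (GNU stat)
  "$@" "$file"                  # run the optimiser against the file
  after=$(stat -c%s "$file")    # size in bytes after
  echo "$file: $((before - after)) bytes saved"
}

# Example (assumes jpegoptim is installed):
# report_savings photo.jpg jpegoptim --strip-all
```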

However, if you want to make sure that the images on your site are of a consistent quality level, you can save even more by setting a maximum quality.

jpegoptim --max=75 file.jpg

This can, of course, be combined with --strip-all to increase the size savings, although you may want to keep EXIF data in some cases.

I found that running jpegoptim with a lossy quality cap reduced image sizes by 40-60% on average, and image quality hasn't noticeably suffered.


optipng

optipng -o7 file.png

This allows you to optimise PNG images on your server, and I found that image sizes were reduced by between 15% and 60%.

The -o flag sets the optimisation level, which runs from -o1 to -o7, with 7 being the highest (and slowest) setting.

I found this tool to be quite a bit slower than jpegoptim (at least running it at -o7), but it is fully lossless and does an excellent job. Check out the man page for more details.

Batch Optimising

Naturally, doing this file by file isn't very practical, so it's much better to batch-process everything.

I found the easiest way to handle this, which is especially useful for a WordPress uploads folder divided into year/month sub-folders, was to use the find command and exec the optimisation on each file. There may be better approaches, but it works and got the job done quickly. For jpegoptim I decided not to use --strip-all, as I needed to check whether my photoblog's EXIF data sniffing would still work.

find . -iname '*.jpg' -exec jpegoptim --max=70 {} \;
find . -iname '*.png' -exec optipng -o7 {} \;
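On a large uploads folder, spawning one process per file with -exec gets slow. As a sketch (assuming GNU find and xargs), you can hand the file list to xargs instead, which batches filenames onto each invocation and can run several optimiser processes in parallel:

```shell
# -print0/-0 keep filenames with spaces intact; -r skips running when
# nothing matches; -n8 passes up to eight files per invocation; -P4 runs
# four optimiser processes at a time.
find . -iname '*.jpg' -print0 | xargs -0 -r -n8 -P4 jpegoptim --max=70
find . -iname '*.png' -print0 | xargs -0 -r -n8 -P4 optipng -o7
```

Both tools accept multiple filenames per invocation, which is what makes the batching worthwhile.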

