Optimize all your PNG and JPEG images with one command using imgopt
Here’s something useful for the web developers out there. It’s a script I’ve been using for a while that makes it super-easy to losslessly compress entire folders of PNG and JPEG files for the web.
If you’re familiar with PNG optimization, then you know about programs like advpng, optipng and pngout that optimize and recompress PNG files, making them much smaller than the likes of Photoshop can. My shell script combines all three of these utilities to achieve minimum file size, while hiding their command-line syntax and adding the ability to recursively process entire directory trees.
And it works with JPEGs, too! It uses jpegtran (included with libjpeg) and another small utility I’ve included to optimize and strip all metadata from JPEG files. Since my script searches directories recursively, all you need to do is type, say, imgopt myproj/htdocs and it’ll take care of your entire website.
All compression is lossless, which means no pixels are changed and no information is lost; the files just get smaller. Chances are your layout graphics can shrink by as much as 50%, which is like getting free bandwidth, and it means your site will snap into place that much faster for users.
How to use: the quick and dirty
imgopt is a command-line bash shell script, so it’ll run on *nix systems like Mac OS X, Linux, BSD, and Windows running cygwin. I’ve been running it on OS X and Linux (both running bash 2.05), but please let me know if you run into any problems on your platform.
The current version is 0.1.2, released on 2009-04-02 (see changelog for details). This post has been updated to reflect the current version, but some comments (especially those regarding bugs) are about prior releases. New bug reports and feature requests in the comments are welcome!
Download imgopt-0.1.2.tar.gz and check out the included README. Basically, you just copy the imgopt script into your path and download and install the helper programs you don’t have already.
Then you can use imgopt to optimize any combination of files, directories and wildcards in one fell swoop. Examples:
$ imgopt A30.jpg
A30.jpg reduced 40 bytes to 1327 bytes
$ imgopt under60.jpg files/*png
under60.jpg reduced 36559 bytes to 49510 bytes
files/A11.png reduced 130 bytes to 669 bytes
files/A256.png reduced 55 bytes to 2106 bytes
# It is totally fine to run imgopt on the same file more than once, since it checks
# and only overwrites files when they have been reduced in size.
$ imgopt *
A30.jpg unchanged at 1327 bytes
A60.jpg reduced 36 bytes to 2119 bytes
files/A11.png unchanged at 669 bytes
files/A256.png unchanged at 2106 bytes
under50.jpg reduced 48 bytes to 18499 bytes
under60.jpg unchanged at 49510 bytes
Really, that’s all you need to know. Keep reading only if you’re curious or have time to kill.
What it does and why you should care
If you’re a website builder, hopefully you already understand the value of keeping website graphics as small as possible — even though it’s 2009 and most of us have megabit connections at home, each byte saved multiplied by every website visitor adds up to a huge reduction in bandwidth and server load, and a faster experience for users. (See: rules for exceptional performance)
Hopefully, your image workflow looks like this. First, you know to sprite your images, especially those with shared color palettes, to minimize total file size and reduce HTTP request overhead. And you are a total expert at Photoshop or Illustrator’s “Save for Web” feature — you know when to save graphics as JPEG (continuous-tone images like photos) and when to use indexed-color PNG (line art, so probably most of the images in your website’s theme). You know a few of the manual color palette reduction tricks (like deleting colors with R/G/B values all > 250, which reduces palette size and increases edge sharpness of type), so your PNGs only have 13 colors when that’s all they need, and not 256 just because that’s the default setting of some wizard thingy used by children and former print designers.
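If you’d rather script that kind of palette reduction than do it all by hand, ImageMagick can force a small indexed palette. A one-off example, not part of imgopt (filenames are illustrative):

# Reduce to a 13-color indexed PNG with ImageMagick (hypothetical files):
convert logo.png -colors 13 PNG8:logo-reduced.png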
My imgopt script steps in at this point and further optimizes all the files that Photoshop (or whatever your graphics program is) spits out. Unfortunately, typical web graphics production software doesn’t compress image data as much as it could — it doesn’t take the CPU time to really optimize the way the image data is ordered (because that would take too long), it inserts gamma chunks and empty metadata structures that are wasted on web layout graphics (yes, even in “Save for Web” mode), and, last but probably most important, it uses an inferior zip compression algorithm.
One warning: My script strips all metadata (e.g. EXIF and JFIF) and gamma information (which you should be removing for web use anyway; otherwise people running IE or Safari will complain about the color in your PNGs being “off”), so if you have image files with metadata, color profiles, masks or channels you want to keep, definitely make backup copies before optimizing!
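For example, before the first run (paths illustrative):

cp -Rp myproj/htdocs myproj/htdocs.bak    # keep a pristine copy
imgopt myproj/htdocs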
I’ve commented the script and made it easy for you to go in and customize the helper programs I’m using and the parameters they’re passed. I’ve chosen options that minimize file size, so you could, for instance, make things run faster by choosing less compression, or you could opt to preserve some metadata, like photo credit information.
Here’s what it’s doing, to PNG and JPEG files, respectively …
PNG
My script piggybacks on the excellent work done by optipng, advpng and pngout. Now, I could go into a long explanation about how one of these (pngout) is good because it has a proprietary deflate algorithm that’s superior to zlib, while another (optipng) is great at optimizing IDAT order, but the bottom line is that running all three of these utilities on your PNG files will produce smaller files than any one of them separately.
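For the curious, the heart of the PNG pass boils down to something like this. This is a simplified sketch, not the script verbatim; the exact flags and ordering live in the script itself, which also adds temp-file handling and only keeps results that are smaller:

advpng -z -4 -q "$f"   # recompress the deflate stream with a stronger encoder
optipng -o7 -q "$f"    # search filter/strategy combinations, optimize IDAT
pngout "$f" -q -y      # pngout's own deflate often shaves a few more bytes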
Very occasionally, you’ll be able to get a few more bytes of compression on PNGs just by running files through imgopt twice. So, feel free to try this and make it part of your routine if you like. Since running the program twice doubles the execution time, and the savings are very small, I elected not to have imgopt do this on its own (and it’s so easy to just hit the up arrow to run the command again). The reason a second pass occasionally works is related to the order the utilities are run in — sometimes a file, once deflated by pngout, can then be further optimized on a second pass through optipng — and just doing a second run of the whole script is the easiest way to take care of any and all interaction cases like these. (FYI, in no instance have I ever found a third pass to do any good.)
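That is, when you’re chasing every last byte:

$ imgopt myproj/htdocs && imgopt myproj/htdocs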
JPEG
Here, the story is simpler: imgopt uses jpegtran (probably already on your system, since it’s part of libjpeg) to optimize the Huffman table and strip color profiles, EXIF data and other metadata. Second, it uses a small utility called jfifremove to remove JFIF metadata (jpegtran leaves at least a minimal 18-byte JFIF segment in place).
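Conceptually, the JPEG pass is just the following (a sketch only; temp-file names are illustrative, and jfifremove is assumed to act as a stdin/stdout filter, which is how the script uses it):

jpegtran -copy none -optimize "$f" > "$TMP" && \
    jfifremove < "$TMP" > "$TMP2" && \
    cp -f "$TMP2" "$f"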
Results will depend mostly on how much metadata there is to strip from your images. If you’re running imgopt on files produced by Photoshop’s “Save for Web,” the savings may only be a few dozen bytes, whereas running it on random photos uploaded by users, files produced by Photoshop’s regular “Save” command or files straight from your digital camera will easily save 10-30K or more.
A word again about lossy compression vs. lossless. JPEG is a lossy format — most of the benefit of JPEG compression comes when you choose a compression level (of course it is more complicated than that and there are some techniques you can use to get better control than what Photoshop provides, but I digress), and ultimately that’s what’s going to make the biggest difference to the final sizes of your JPEG files. My script and jpegtran are lossless and do not decode/recompress the JPEG image; they just optimize the existing JPEG data. Of course, if you want imgopt to automatically recompress JPEGs at a different quality setting, you could easily add that feature to the script by inserting a line before jpegtran that calls something like ImageMagick’s convert, but I’ll leave that to you.
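For instance, a hypothetical lossy pre-pass, definitely not something the script ships with, could be as simple as:

# WARNING: lossy! Recompresses the image at quality 60 before the
# lossless steps run (requires ImageMagick).
convert "$f" -quality 60 "$f"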
About jfifremove — this is a very simple utility written in C that I found here, debugged and compared against the JFIF standard. So, while I don’t claim authorship, I am the maintainer of the version included with my script. I’ve tested it thoroughly only as it is used in imgopt — i.e., for deleting the JFIF segment from files that have already had all other optional headers stripped by jpegtran. I have not tested it for standalone use (i.e. on files containing color profiles, thumbnails, EXIF data etc.), so I don’t recommend you use it outside my script unless you know what you’re doing. (I suspect it might fail to strip the JFIF segment from a file where the EXIF APP1 precedes the JFIF APP0, but at least do nothing destructive … but I’m only speculating.)
In action: How much can we optimize Reddit?
OK, let’s have some fun and optimize the graphics from a real website. I picked Reddit because it uses very few images, so this won’t take long. :)
Reddit is doing a pretty good job at performance optimization already, because they’re not overusing images, and the images they do use are small with few colors. But they aren’t spriting their images or reducing colors effectively. Now, I’m not going to rebuild their website for them, but let’s see what happened when I ran imgopt on their homepage’s images, and also when I took another 5 minutes to do basic indexed color reductions on their PNGs. For the autogenerated thumbnail images, it would be unfair to do any manual reductions — but it turns out they’re making another big mistake by using PNG instead of JPEG for these files. So, for these images, the third column is just a batch convert to JPEG with quality=60 (actually a pretty high setting for small thumbnails) to show what would happen if they fixed this. (A one-liner for that conversion is sketched after the tables.)
Filename | Original (bytes) | imgopt only | Colors reduced plus imgopt
---|---|---|---
reddit.com.header.png | 2493 | 1112 | 871
static/adowngray.gif | 214 | 214 | 214
static/adownmod.gif | 145 | 145 | 145
static/aupgray.gif | 213 | 213 | 213
static/aupmod.gif | 148 | 148 | 148
static/create-a-reddit.png | 1187 | 765 | 579
static/droparrowgray.gif | 67 | 67 | 67
static/mailgray.png | 223 | 127 | 127
static/next_organic.png | 847 | 208 | 168
static/noimage.png | 1997 | 1158 | 668
static/prev_organic.png | 832 | 225 | 167
static/submit-alien.png | 2065 | 1054 | 607
static/wired_w.png | 650 | 95 | 95
Total | 11081 | 5531 | 4069

Filename | Original (bytes) | imgopt only | JPEG convert plus imgopt
---|---|---|---
thumbs/t3_80p41.png | 8472 | 7037 | 1762
thumbs/t3_80rcr.png | 9784 | 7944 | 1643
thumbs/t3_80sd0.png | 10383 | 8391 | 2089
thumbs/t3_80sfe.png | 2611 | 1759 | 1460
thumbs/t3_80ten.png | 2072 | 1098 | 1835
thumbs/t3_80tge.png | 4732 | 3779 | 1173
thumbs/t3_80ujf.png | 8660 | 7064 | 1840
thumbs/t3_80vt1.png | 2072 | 1098 | 1835
thumbs/t3_80xg6.png | 2136 | 594 | 816
Total | 50922 | 38764 | 14453
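For reference, the batch conversion used for the third column of the thumbnail table can be done with something like ImageMagick (paths are illustrative):

for f in thumbs/*.png; do convert "$f" -quality 60 "${f%.png}.jpg"; done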
So, the site’s main theme graphics shrank by 50% just using imgopt alone, and another 25% on top of that with just a few minutes’ common-sense color reduction. The autogenerated PNG thumbnails shrank 20% — or, make them JPEGs and they’re down 70%. Reddit is a pretty graphics-sparse site, so saving 42K per page visitor isn’t too shabby. (Here’s a tarball of the end results, in case anyone from Condé is reading.)
What about a site with a graphics-rich design? I gave the same experiment a whirl on a former top-20 blog that I used to be tech lead for. Naturally, when I was there, the theme images were sprited and optimized and amounted to maybe a dozen files and 20K — since then, there’s been a redesign and it’s now about 60 files and 150K (including some bad reminders of the dark ages, like sliced images and “blank.gif”). Result of a quick optimization on those files: I got them down to 50K, 1/3 the original size — so you know that with spriting and some sense you could get them down to 10-20 files and 30-40K pretty easily. I sent the result to a buddy, so hopefully the optimized versions will be making an appearance soon.
Still reading? Well, thanks for hanging in there. Now it’s time to go and make your own website go fast!
David Riecks
March 1, 2009, 2:04 pm
While it certainly makes sense to optimize graphics for a website, you might want to caution your readers that they should take care and not necessarily consider these “image optimization” routines for all images.
There are a few instances in which removal of copyright or rights holder information (contact information for the owner of the image) could lead to some undesirable consequences.
If any of a number of proposed “Orphan Works” legislation acts pass (in the US, UK or elsewhere), you may find that any image that does not contain identifying information may no longer remain under your control, and may be freely used by others without your permission or any other form of compensation. Any website owner that derives a part of their income through the licensing of their images to others will want to be sure that, at minimum, their contact information remains with the image. Other descriptive metadata may be of use to potential end users that have downloaded a preview (or even a thumbnail image) from their site. Indeed the contact information within the image metadata may be their only way of knowing where the image came from after it is downloaded to their hard drive and likely renamed.
Those that are interested, might find the Metadata Manifesto (http://www.stockartistsalliance.org/metadata-manifesto-1/) an interesting read.
In addition, website developers that are removing metadata from images that belong to others might find that this is an infringement under the Digital Millennium Copyright Act (DMCA), and find themselves involved in litigation that could have been easily avoided.
So I would advise your readers to carefully choose when to employ such image optimization, and be sure that images that need to be protected remain that way.
Hope that helps.
David
sticks
March 4, 2009, 6:51 am
This post was due for a copy-edit anyway, so I took what you’ve said to heart and tried to make it more clear to people that, while the script by default strips all metadata to minimize filesize, they can change the parameters it passes to the optimization programs so that it won’t.
But, this post really is about how to “optimize graphics for a website” — not about photography, illustration etc. — and wading any further into copyright enforcement would be going too far OT. To use Flickr as an example, this post is about techniques to make Flickr’s website “chrome” operate as quickly as possible, much the same way Flickr actually does it. But not about the user-uploaded and user-owned photo content on Flickr, which of course do retain metadata — and which, from my web admin perspective, aren’t really subject to the “every user who visits the site has to load these 12 images just to see the first page, how do I make them faster” optimization problem, either.
Raphael
March 4, 2009, 9:16 am
A little borkus on OSX 10.5 for me – I get:
imgopt foo.png
usage: mktemp [-d] [-q] [-t prefix] [-u] template …
mktemp [-d] [-q] [-u] -t prefix
Brian Cardarella
March 4, 2009, 11:23 am
Raphael:
Edit the imgopt script. Whenever you see `mktemp` change it to `mktemp tmp.XXXXX`
The BSD toolset seems to differ slightly from the GNU toolset.
You’ll still get errors with the `stat` command but the script still runs the compression. You just won’t see the file size changes.
Brian Cardarella
March 4, 2009, 11:24 am
Crap, make that:
`mktemp -t tmp.XXXXXX` instead…
You can put as many ‘X’s as you want. You’re basically passing a template to be used for creating a tmp file.
sticks
March 4, 2009, 4:22 pm
@Raphael and @Brian
Thanks very much for the bug report and help … I was afraid there might be some more GNU/BSD issues! Turns out my OS X system has lots of the GNU coreutils installed, so it was leading me astray.
I uploaded a new version 0.1.1 of the script that works with the BSD versions of mktemp and stat, and also works around an intermittent bug I found in pngout. Because of the last fix, everyone should download the new version, even if the previous one worked fine for you.
Elias Zamaria
March 4, 2009, 12:07 pm
I am curious about the output shown in this post. Here is what I see:
under60.jpg reduced 36559 bytes to 49510 bytes
files/A11.png reduced 130 bytes to 669 bytes
files/A256.png reduced 55 bytes to 2106 bytes
Is this program reducing the sizes, or increasing them?
sticks
March 4, 2009, 2:50 pm
It’s reducing the sizes, i.e. the first line means:
under60.jpg was originally 86069 bytes, and it’s been reduced by 36559 bytes to 49510.
The script does a check to make sure the new file is actually smaller before it copies it over the old one (not all of the helper programs do a good job of being non-destructive on their own). So it should never make a file larger (in that case it just leaves the original file intact).
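Conceptually, the guard is just this (GNU stat syntax shown; the BSD equivalent is stat -f %z):

# Only replace the original if the optimized temp file is smaller.
if [ "$(stat -c %s "$TMPF")" -lt "$(stat -c %s "$1")" ]; then
    cp -f "$TMPF" "$1"
fi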
Sam
March 11, 2009, 10:57 am
Wow. That was impressive: I just reduced a folder of .pngs by 30%.
sticks, thank you, good sir.
Question, though: how specifically would one leave in the metadata? I see the
#(edit here to add/remove PNG optimizing steps or change parameters)
but I’m not sure exactly what to do.
Again, much obliged.
sticks
March 11, 2009, 12:38 pm
You’re going to have to look at the command options of the individual programs. For instance, optipng has --preserve and jpegtran has -copy:
-copy none Copy no extra markers from source file
-copy comments Copy only comment markers (default)
-copy all Copy all extra markers
So for JPEG you would use those options to preserve EXIF data, and to preserve JFIF just skip running jfifremove.
For PNG you may have to figure something else out — I don’t see options for advpng or pngout. My suggestion would be to not fiddle with the individual commands at all, and just modify the do_png function in the script to work like this (a rough sketch follows the list):
1. Back up the original PNG to somewhere, like original.png.
2. Run all the PNG optimization steps as before.
3. Copy all the metadata from original.png back into the optimized file, and delete original.png.
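In shell, those three steps might look roughly like this (a sketch only; it assumes exiftool is installed, and the backup filename is arbitrary):

cp -f "$1" "$1.orig"                  # 1. back up the original
# 2. ... run the advpng/optipng/pngout steps on "$1" as before ...
exiftool -tagsfromfile "$1.orig" -all:all -overwrite_original "$1" \
    && rm -f "$1.orig"                # 3. copy metadata back, clean up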
You could use exiftool and its -tagsfromfile option to copy the metadata — there are a bunch of examples on its man page. (I coincidentally posted on lyncd a month ago about doing something similar with mp3 files, copying over ID3 tags from a backup file after they get trashed by other operations.) Hope that helps!
Sam
March 11, 2009, 2:42 pm
Very, very helpful indeed.
Thanks again!
kross
March 11, 2009, 12:25 pm
I am having some trouble getting imgopt to work correctly under Ubuntu 8.10. jpg optimizations work fine, but pngout is giving me an “mv: cannot stat” error.
sticks
March 11, 2009, 1:01 pm
If you look inside the script, this is the pngout line:
pngout "$1" $TMPF -q -y && mv -f $TMPF.PNG $TMPF
There are actually two commands running … the first one is pngout and the second one is a mv -f to move/rename the output filename back to $TMPF, because pngout has a bug where it sometimes appends .PNG to the output filename (but not always, it varies based on the input file). mv is going to complain “cannot stat” when its first file argument doesn’t exist, so I suspect there’s actually no problem here, and pngout is running fine. I’d suggest you try this:
1. Temporarily edit the script to remove the -q option from the pngout command and to comment out the other PNG optimization commands (advpng, optipng).
2. Run imgopt just on the PNG file that’s producing the error.
3. Verify that the file’s size has been reduced (or not, if pngout said it couldn’t reduce it further).
If everything is working, then it’s OK to just bury the mv error by sending its output to /dev/null:
pngout "$1" $TMPF -q -y && mv -f $TMPF.PNG $TMPF &> /dev/null
Please let me know if that works and I’ll add that (or something similar) to the script. Also, if you could send me or upload somewhere the original version of the PNG file that’s causing the problem so I can download it and try to reproduce the problem, I’d appreciate it!
A83
March 31, 2009, 7:55 am
I’ve just tried the script and on my test image it cuts a 1 pixel wide column in the middle of the file…
…anyone recognize this problem and know what to do to solve it?
Thanks in advance!
sticks
March 31, 2009, 6:09 pm
I would guess it’s something wonky with your test image … maybe you can upload it (both before and after versions) somewhere so we can test it? Also, what program are you using to view the final image (where it shows with the white line)? Stripping the (optional) JFIF header, for example, isn’t going to cause any problems with standard browsers and picture viewers, but some badly written viewers may have trouble with it; I’ve also seen some viewers have problems decoding 24-bit PNGs compressed with pngout. And there are some well-known incompatibilities, like IE before version 7 doesn’t work with 24-bit PNGs.
You didn’t say whether your image was JPEG or PNG … the first thing to do would be to run the helper programs (advpng, optipng and pngout for PNG, jpegtran and jfifremove for JPEG) on it individually to isolate which one of them is causing the problem. The problem is going to be with your image and one of these programs — imgopt is just a batch script that controls the helper programs.
Without knowing what kind of image it is, what programs you’re using, and what the test image is, I’m only making guesses. Definitely post the before and after images for me if you can, I’ll be interested to take a look.
A83
April 2, 2009, 7:31 am
It seems like it was the preview application on Windows that couldn’t display them. It worked fine on my Mac. But now when I did a batch conversion I got these error messages:
mv: cannot stat `tmp.f24684.PNG’: No such file or directory
img_compressed/icons/numbers/outline_15.png reduced 92 bytes to 196 bytes
mv: cannot stat `tmp.f24684.PNG’: No such file or directory
img_compressed/icons/numbers/outline_7.png reduced 89 bytes to 185 bytes
mv: cannot stat `tmp.f24684.PNG’: No such file or directory
img_compressed/icons/numbers/pink_13.png reduced 72 bytes to 237 bytes
mv: cannot stat `tmp.f24684.PNG’: No such file or directory
img_compressed/icons/numbers/outline_1.png reduced 86 bytes to 169 bytes
mv: cannot stat `tmp.f24684.PNG’: No such file or directory
img_compressed/icons/numbers/pink_11.png reduced 49 bytes to 202 bytes
sticks
April 2, 2009, 10:26 am
Scroll up to this thread … if it’s the same thing, these messages are harmless, even though they’re annoying. I just updated the script, so if you use imgopt-0.1.2 you shouldn’t see these mv errors any more. If you could upload or send me one of the original PNG files that produced one of these errors so I can throw it in my test collection, I’d appreciate it!
Steven
June 17, 2009, 6:20 pm
Tried it and it seems to work fine on directories. But when I tried to shrink a specific file:
imgopt /Users/steven/Desktop/man.jpg
I got this printout:
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-pvX] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-pvX] source_file … target_directory
/Users/steven/Desktop/man.jpg reduced 519 bytes to 15532 bytes
However, the file remains unchanged; the reduced image is not written back to disk. If I move the same image to a dedicated folder and run imgopt on the whole folder, it works fine.
Val Kayser
Aug. 11, 2009, 2:59 pm
Hello,
Nice tool! Since there are still a lot of GIF files on the web, could you add gifsicle to your script?
Matt
Aug. 13, 2009, 10:37 am
Hello, this script looks like a very unique and useful tool. I’m a Windows user and have been searching for something like this that could easily automate reducing image sizes of various formats, but have not found one that works very well yet. The idea of using cygwin to try out your script seems rather complex to me, so I’m more likely to just set up an Ubuntu machine in VMware. Before I go that route, though, I thought I would ask if there are any other Windows users out there who have been excited by the prospect of using this script and found an easy method to do so — I’d love some guidance.
Tim
Nov. 18, 2009, 8:18 am
Matt,
It’s been a few months so not sure if you still need this info, but here are directions for running this script (and other *nix scripts) natively in Windows without the overhead of Cygwin.
First, as noted above, install all the helper programs into your PATH.
Changes to the imgopt script:
1. Comment out the jfifremove line. The jfifremove.c file states “Won’t work unmodified on Windows because of the text/binary problem.” Maybe this will be ported in the future; it is a very small program, but I’m no programmer.
2. That’s it! No other changes to the script are necessary.
Install the following programs into your PATH:
* Install GnuWin32 CoreUtils. I believe you really just need mv & cp for this script.
* Install GnuWin32 MkTemp.
* Install the old Windows port of zsh 3.0.5 (renamed as sh.exe). I got my copy from the UnxUtils package. I also found a copy (with sources!) at http://www.xs4all.nl/~waterlan/#ZSH. I haven’t tested this version, $ZSH_VERSION says it is the same (3.0.5-nt-beta-0.75), however the file size is different so who knows. Note: if you use UnxUtils, DO NOT use the mv & cp commands from UnxUtils, use the GnuWin32 files instead. In my tests the UnxUtils version could not mv or cp over a file if it existed, even with the -f switch. The GnuWin32 version worked correctly.
Running the script
Run the program from the command line with “sh imgopt”. That’s it! In my (very limited) testing everything seemed to work as expected.
Some additional notes:
* There is a Windows-native version of bash 1.14.2, win-bash; however, this did not work with this script in my tests.
* You said no Cygwin; however, there is a version of bash 2.0.3 that is distributed standalone from the main Cygwin package. I have not tested it with this script. In theory you could also follow the directions in the included README to compile a newer version of bash.
Note to the script author:
Since this seems to run fine in zsh, I’m guessing it is really a sh script, and not a bash script. Is this correct, or are you aware of any bashisms in the code? If it is really a sh script, can you change the shebang to #!/bin/sh instead of #!/bin/bash? Also, maybe test with alternate shells such as zsh, dash & ksh?
Tor
Dec. 11, 2009, 12:46 pm
Cool! You may want to include pngcrush too.
A Grant
Dec. 18, 2009, 7:23 am
Hi Just noticed a fix you can make to the imgopt file to handle images with spaces in them.
The second copy command should have quotes around the 2nd argument; the previous cp command does:
cp -f $TMPF "$1"
manu
Jan. 13, 2010, 9:55 am
to use with GIFs:
do_gif () {
    # $1 is filename
    TMPG=`mktemp -t tmp.XXXXXX` || return 1
    # Install gifsicle: http://www.lcdf.org/gifsicle/
    # Read "$1", write the optimized GIF to the temp file, then replace.
    gifsicle --careful -O2 --no-warnings -o "$TMPG" "$1" && mv -f "$TMPG" "$1"
    return 0
}
Nik
Jan. 20, 2010, 5:17 am
Any chance of getting a Windows GUI version??
Rob Scott
March 23, 2010, 4:21 am
Found this looking for something slightly different, but, as ever, this looks like a better script than I was looking for :) Will be testing this on a couple of user-generated sites, as that’s where image files start to grow into huge beasts; coders should be doing this before putting them in!
(Actually, that’s what I was looking for, but not a retrospective “run on a whole folder or set of folders” version like this, but a script to optimize images on the fly as they are uploaded by user. Pretty sure this is a good starting point for both!)
Xinjiang Lu
April 23, 2010, 9:04 am
I just tested this script. Interestingly, for some images, pngout+imgopt produces a smaller image (~8% reduction) than any other combination.
Ariel
June 29, 2010, 2:32 am
Since advpng just recompresses the file, but does not change the structure of the png in any way, shouldn’t it go last (or at least after optipng)?
Otherwise advpng compresses the image really well, then when optipng goes to work on it, it refuses to save it because its compression does not do as well (so the file _appears_ to be bigger) – even though it actually managed to optimize the png structure of the file.
Ariel
June 29, 2010, 2:56 am
I tested a bit:
Original file: 4398 (from gimp)
optipng: no change
After advpng: 1723
pngout: no change
If I do pngout first though:
pngout: 2466
advpng: 1303
In fact, I think you should first expand the file with advpng -z0f to decompress it, then let optipng work on it (this time giving it a chance to actually optimize it, without being fooled by the strong compression), then pngout, and then advpng.
Another test (same file):
optipng: no change, file size 4398
advpng -z0f
optipng: 1769
pngout: no change
advpng: 1723
Next test:
advpng -z0f
pngout: 2466
advpng: 1303
Optipng actually hurt the compression of pngout!
Final try:
advpng -z0f
pngout: 2466
optipng: 1353
advpng: 1303
At least for this file, the best order is exactly the opposite of what you have.
Ariel
June 29, 2010, 4:39 am
So I ran a full test on a bunch of images (I ran the original twice, and the new one twice).
I reordered the commands to:
advpng -z0f
pngout
optipng
advpng
The results were not as dramatic as I thought they would be. It seems that the file (I picked randomly) just so happened to be one that changed a lot.
Most of the files were unchanged, a couple got larger (but usually only by 4 bytes). And some got about 10-20% smaller.
Overall though I think it’s a win.
BTW thanks for this script.
Ariel
June 29, 2010, 3:01 am
A small bug: When I ran it on a directory it left a tmp file behind.
Ariel
June 29, 2010, 4:04 am
It happened when I hit control-c. It’s possible to catch signals in a shell script using the trap command.
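Something along these lines would do it, assuming $TMPF holds the temp filename:

trap 'rm -f "$TMPF"' EXIT INT TERM   # clean up the temp file on exit or Ctrl-C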
Ping from Why I Didn’t Try To Win Webperf Contest 2010 | Razor Fast
Dec. 6, 2010, 6:15 pm
[…] as much as possible while avoiding noticeable artifacts. All PNGs were processed with the imgopt script, which passes them through 3 different optimizers. The eventual goal was to merge them into a […]
Jon
Feb. 6, 2011, 10:09 am
Excellent work! An easy to use and indispensable tool for any web developer.