Dispelling some misunderstanding about JPEG losses with repeated saves

Discussion in 'Digital Photography' started by Ofnuts, Aug 7, 2010.

  1. Ofnuts

    Ofnuts Guest

    I wrote a short utility to check the usual claim that JPEG image quality
    degrades with successive saves.

    This utility saves an image multiple times, each time after making a
    minor and very localized change to it. To avoid any suspicion that
    "convert" cleverly minimizes losses, the image is saved to a lossless
    format (PNG) and then converted from PNG to JPEG. The resulting
    image is then compared with the original image (diff0-*), and with the
    result of the first step (diff1-*) (red pixels are the changed pixels).

    The utility and the results of some runs are available here:

    http://dl.free.fr/rjtMETz9h

    The subdirectories provided are the results of running the utility over
    the same image with JPEG quality 25, 50, 75, and 90.

    Now for the interesting part. This dispels some misunderstandings:

    - In all cases, most of the damage occurs on the 1st save. The
    subsequent saves show very little difference with the first step, even
    at very low quality settings. Save steps beyond the third do not add any
    loss... The JPEG algorithm is "stable", and the decoded values
    eventually get re-encoded the very same way.
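    That stability can be illustrated with a toy model of the quantization
    step, the main lossy operation in JPEG (a simplified sketch with made-up
    numbers, not the actual codec): once a coefficient has been rounded to a
    multiple of the quantization step, decoding and re-encoding it with the
    same step reproduces it exactly.

```python
# Toy model of JPEG quantization (not the real codec): a DCT coefficient
# is divided by the quantization step and rounded on encode, then
# multiplied back on decode.
def encode(coeff, step):
    return round(coeff / step)          # stored quantized value

def decode(q, step):
    return q * step                     # reconstructed coefficient

step = 16
original = 100.0                        # hypothetical DCT coefficient

q1 = encode(original, step)             # first save: round(100/16) = 6
c1 = decode(q1, step)                   # reconstructed value: 96

q2 = encode(c1, step)                   # second save re-encodes 96
c2 = decode(q2, step)                   # still 96: no further loss

print(q1, c1)   # 6 96
print(q2, c2)   # 6 96 -- identical, the value has reached a fixed point
```

    The loss (100 -> 96) happens once, on the first save; every later save
    round-trips the already-rounded value unchanged.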

    - The amount of "damage" is very low at reasonable quality settings (75
    or above). To get an experimental "feel":

    -- load the original image and the result of any step in photo
    editing software that supports layers
    -- obtain the "difference" between the two layers
    -- the resulting image seems a very uniform black to the naked eye
    -- use a "threshold" transform and lower the threshold value until
    recognizable patterns appear (besides the marker dots at top left)
    -- At 90 quality, using the result of the 10th step, the first white
    pixel shows up at 20 (an artefact at the lower border, due to the
    picture height not being a multiple of 8); the first pixel inside the
    image proper at 11.
    -- At 75 quality, the difference produces a recognizable ghost of the
    linnet. The threshold method shows that most differences are below 20.
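    The threshold experiment above can be mimicked numerically (a sketch on
    made-up synthetic values, not the actual images, which would be loaded
    with an imaging library): compute per-pixel absolute differences and
    find the smallest threshold at which any pixel would "show up" white.

```python
# 'original' and 'resaved' stand in for a row of grayscale pixel values;
# the second row holds hypothetical post-JPEG-round-trip values.
original = [120, 121, 119, 200, 201, 50]
resaved  = [121, 119, 119, 195, 203, 52]

# Per-pixel absolute difference, i.e. the "difference" layer.
diff = [abs(a - b) for a, b in zip(original, resaved)]

# Lowering a threshold until the first pixel appears is equivalent to
# finding the maximum per-pixel difference.
first_visible_at = max(diff)

print(diff)              # [1, 2, 0, 5, 2, 2]
print(first_visible_at)  # 5
```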

    Disclaimers:

    - Global image changes (white balance, contrast, colors) are a whole
    different matter, not addressed here (though, IMHO, the problem with
    JPEG in these operations is more the 8-bit-per-channel limit it puts on
    the picture, which in turn leads to a comb-like histogram)

    - The original JPEG uses 1:1:1 (4:4:4) sub-sampling, i.e. no chroma
    subsampling, and so does 'convert' by default.

    -- Unless reproduced by different means, these results only apply when
    the same software is used throughout.

    Note: you can run the utility yourself if you have Perl and the
    ImageMagick toolkit installed. It was written, tested and run on WinXP,
    so I make no promises for other OSes, but if you run one of those you
    should be able to fix any problems anyway.

    --
    Bertrand
    Ofnuts, Aug 7, 2010
    #1

  2. Ofnuts

    Ofnuts Guest

    On 09/08/2010 20:35, Paul Furman wrote:
    > Ofnuts wrote:
    >>
    >> - In all cases, most of the damage occurs on the 1st save. The
    >> subsequent saves show very little difference with the first step, even
    >> at very low quality settings. Save steps beyond the third do not add any
    >> loss... The JPEG algorithm is "stable", and the decoded values
    >> eventually get re-encoded the very same way.

    >
    > Interesting. Also worth noting; when an image remains open in (photoshop
    > at least), you can save as much as you like for backup and it won't
    > 'damage' the file till you close it and open again.


    This is true of all software, AFAIK. But nitpickers will say your
    intermediate version isn't really a backup. Consider:

    Original (V0) -> local change 1 -> save as V1 -> local change 2 -> save
    as V2 (V2 is produced directly from V0; round-off errors occur only once).

    vs:

    Original (V0) -> local change 1 -> save as V1 -> reload V1 -> local
    change 2 -> save as V2 (V2 is produced from V1; round-off errors occur
    twice).

    And it is even worse with global changes (not to mention losing
    selections, masks, layers...).
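    The two workflows can be sketched with a toy round-to-step model of a
    lossy save (illustrative only, with made-up values; not the real JPEG
    pipeline): re-rounding after reloading can throw away an edit that a
    single save from the open image would have kept.

```python
# Toy lossy save: round the value to the nearest multiple of 'step'.
def lossy_save(x, step=10):
    return round(x / step) * step

v0 = 100                      # original value
edit1 = edit2 = 4             # two small local edits
ideal = v0 + edit1 + edit2    # 108, what lossless storage would hold

# Workflow A: image stays open, both edits applied, one lossy save from V0.
v2_direct = lossy_save(v0 + edit1 + edit2)    # 110, off by 2

# Workflow B: save V1, reload it, edit again, save V2 -- rounding twice.
v1 = lossy_save(v0 + edit1)                   # 100, the +4 is rounded away
v2_reloaded = lossy_save(v1 + edit2)          # 100, off by 8

print(ideal, v2_direct, v2_reloaded)   # 108 110 100
```

    The second workflow ends up further from the ideal value because the
    first rounding already discarded part of edit 1.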

    --
    Bertrand
    Ofnuts, Aug 9, 2010
    #2

  3. Martin Brown

    Martin Brown Guest

    On 07/08/2010 21:42, Ofnuts wrote:
    > I wrote a short utility to check the usual claim that JPEG image quality
    > degrades with successive saves.
    >
    > This utility saves an image multiple times, each time after making a
    > minor and very localized change to it. To avoid any suspicion that
    > "convert" cleverly minimizes losses, the image is saved to a lossless
    > format (PNG) and then converted from PNG to JPEG. The resulting
    > image is then compared with the original image (diff0-*), and with the
    > result of the first step (diff1-*) (red pixels are the changed pixels).
    >
    > The utility and the results of some runs are available here:
    >
    > http://dl.free.fr/rjtMETz9h
    >
    > The subdirectories provided are the results of running the utility over
    > the same image with JPEG quality 25, 50, 75, and 90.
    >
    > Now for the interesting part. This dispels some misunderstandings:
    >
    > - In all cases, most of the damage occurs on the 1st save. The
    > subsequent saves show very little difference with the first step, even
    > at very low quality settings. Save steps beyond the third do not add any
    > loss... The JPEG algorithm is "stable", and the decoded values
    > eventually get re-encoded the very same way.


    This is basically correct. The coefficients for each 8x8 or 16x16 block
    usually converge onto an attractor in 5-10 cycles, or may bounce between
    a few closely related versions in a cyclic way. Not sure I would be so
    bold as to say it is stable, but it mostly stays chaotic around the same
    stable attractor, giving a series of very similar-looking images that
    may repeat with a short period (1, 2, 3, etc.).

    Serious damage tends to be mostly caused by the chroma subsampling
    routine, which averages the YCbCr chroma channels over 2x2 blocks, and
    by certain boundary condition errors in the classical reconstruction
    methods.
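    The kind of loss involved can be sketched with plain averaging (a
    simplification with made-up values; this assumes the common 4:2:0-style
    scheme that averages chroma over 2x2 pixel blocks, and real codecs use
    better filters, but the information loss is the same in kind):

```python
# A 2x2 block of chroma values with strong detail (a checker pattern).
block = [200, 0,
         0, 200]

# Encoder side: the four chroma values are replaced by their average.
avg = sum(block) // len(block)        # a single stored value: 100

# Decoder side: the single value is duplicated back over the block.
reconstructed = [avg] * 4

print(avg)             # 100
print(reconstructed)   # [100, 100, 100, 100] -- the checker pattern is gone
```

    The fine colour detail cannot be recovered on upsampling, which is why
    this step, not the DCT quantization, often dominates the visible damage.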

    > - The amount of "damage" is very low at reasonable quality settings (75
    > or above). To get an experimental "feel":
    >
    > -- load the original image and the result of any step in photo
    > editing software that supports layers
    > -- obtain the "difference" between the two layers
    > -- the resulting image seems a very uniform black to the naked eye
    > -- use a "threshold" transform and lower the threshold value until
    > recognizable patterns appear (besides the marker dots at top left)
    > -- At 90 quality, using the result of the 10th step, the first white
    > pixel shows up at 20 (an artefact at the lower border, due to the
    > picture height not being a multiple of 8); the first pixel inside the
    > image proper at 11.
    > -- At 75 quality, the difference produces a recognizable ghost of the
    > linnet. The threshold method shows that most differences are below 20.


    A long while ago I did one based on an 8x8 test pattern designed to
    distress the JPEG algorithm. The results are at:

    http://www.nezumi.demon.co.uk/photo/jpeg/2/jpeg2.htm

    The difference between chroma-subsampled JPEG saves (the default in most
    applications) and full-chroma JPEG is very significant. A lot of
    information is lost in the chroma subsampling and upsampling steps.

    The zoomed version doesn't look good on modern browsers with smoothed
    upsampling: the 8x8 pixel blocks should have sharp edges.
    >
    > Disclaimers:
    >
    > - Global image changes (white balance, contrast, colors) are a whole
    > different matter, not addressed here (though, IMHO, the problem with
    > JPEG in these operations is more the 8-bit-per-channel limit it puts on
    > the picture, which in turn leads to a comb-like histogram)
    >
    > - The original JPEG uses 1:1:1 (4:4:4) sub-sampling, i.e. no chroma
    > subsampling, and so does 'convert' by default.


    Full chroma sampling is very much better at preserving image integrity
    than subsampled chroma (though the latter yields considerably smaller
    files). PSPro 8 manages to do both incorrectly, resulting in patterns
    in the sky (and other artefacts that can be demonstrated on simple test
    cases).
    >
    > -- Unless reproduced by different means, these results only apply when
    > the same software is used throughout.


    And only if you use exactly the same quality settings for every save.

    I agree though that JPEG is blamed for a lot of things that are not its
    fault. You can encode line-art graphics quite successfully with the
    right choice of Q and full chroma sampling. The algorithm is optimised
    for photographic images but is not limited to them. PNG is usually
    more compact for line art, but not always.
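    Why a hard edge is harder on JPEG than a photographic gradient can be
    sketched with a hand-rolled 1-D DCT-II (illustrative only, unnormalized,
    with made-up sample rows; not the codec's actual transform): a step edge
    spreads its energy across many frequencies, so coarse quantization of
    the high-frequency coefficients produces the familiar ringing.

```python
import math

# Plain, unnormalized 1-D DCT-II -- just enough to show where the signal
# energy lands, not a production transform.
def dct(signal):
    n = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(signal))
            for k in range(n)]

smooth = [10, 11, 12, 13, 14, 15, 16, 17]   # gentle ramp, photo-like
edge   = [0, 0, 0, 0, 255, 255, 255, 255]   # hard step, line-art-like

c_smooth = dct(smooth)[7]   # highest-frequency coefficient of the ramp
c_edge   = dct(edge)[7]     # highest-frequency coefficient of the step

# The ramp puts almost nothing in the top frequency; the step puts a lot,
# which is what coarse quantization then mangles into ringing.
print(abs(c_smooth) < 1.0, abs(c_edge) > 100.0)   # True True
```

    With a high enough Q the high-frequency coefficients survive rounding,
    which is why line art can still come through acceptably.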

    Regards,
    Martin Brown
    Martin Brown, Aug 10, 2010
    #3
  4. Martin Brown

    Martin Brown Guest

    On 09/08/2010 19:35, Paul Furman wrote:
    > Ofnuts wrote:
    >>
    >> - In all cases, most of the damage occurs on the 1st save. The
    >> subsequent saves show very little difference with the first step, even
    >> at very low quality settings. Save steps beyond the third do not add any
    >> loss... The JPEG algorithm is "stable", and the decoded values
    >> eventually get re-encoded the very same way.

    >
    > Interesting. Also worth noting; when an image remains open in (photoshop
    > at least), you can save as much as you like for backup and it won't
    > 'damage' the file till you close it and open again.


    A lot of programs do that by just renaming the in-memory buffer, without
    reloading the image that results from the JPEG encode and decode cycle.

    This can be misleading, and I have seen people ruin images by overwriting
    an original with a lower-quality copy because they did not realise that
    what they saw on the screen did not reflect what was encoded in the file.
    Applications that let you see a zoomable preview of the encoded and
    decoded image, along with a filesize estimate, are better.

    Regards,
    Martin Brown
    Martin Brown, Aug 10, 2010
    #4
  5. Ofnuts

    Ofnuts Guest

    On 10/08/2010 09:40, Martin Brown wrote:
    > On 07/08/2010 21:42, Ofnuts wrote:
    > [snip]
    > I agree though that JPEG is blamed for a lot of things that are not its
    > fault. You can encode graphics line art quite successfully with the
    > right choice of Q and full chroma sampling. The algorithm is optimised
    > for photographic images but it is not limited to them. PNG is usually
    > more compact for line art but not always.


    PNG is vastly under-used. As a developer, I sometimes get bug reports
    about the GUI from pixel-peepers whose "evidence" is an artefact-laden
    JPEG. I have to teach them the beauties of PNG (which, unfortunately,
    is still not supported as an image format by some "enterprise" software).


    --
    Bertrand
    Ofnuts, Aug 10, 2010
    #5
  6. Better Info

    Better Info Guest

    Re: Dispelling some misunderstanding about JPEG losses with repeated saves

    On Tue, 10 Aug 2010 08:44:25 +0100, Martin Brown
    <|||newspam|||@nezumi.demon.co.uk> wrote:

    >[snip]
    >A lot of programs do that by just renaming the in-memory buffer, without
    >reloading the image that results from the JPEG encode and decode cycle.
    >
    >This can be misleading, and I have seen people ruin images by overwriting
    >an original with a lower-quality copy because they did not realise that
    >what they saw on the screen did not reflect what was encoded in the file.
    >Applications that let you see a zoomable preview of the encoded and
    >decoded image, along with a filesize estimate, are better.
    >
    >Regards,
    >Martin Brown


    How about applications that let you set the JPG compression level of
    any layer or layout component individually, and edit it in real time
    while watching a preview, like Photoline (Menu > Layout > Image > JPG
    Compression)? The "save-for-web" preview, with a 3-panel comparison of
    any two file types against the original, chosen compression methods,
    bit depths, dithering options, and the filesizes of each, is a separate
    built-in function. It has also supported 16-bit JPG compression
    conventions (HDPhoto/JPEG XR) for years now. Do any browsers support
    those yet? If so, I might start using it.
    Better Info, Aug 10, 2010
    #6
  7. Ryan McGinnis

    Ryan McGinnis Guest

    On 8/10/2010 2:40 AM, Martin Brown wrote:

    > I did one based on an 8x8 test pattern that is designed to distress the
    > JPEG algorithm a long while ago. The results are at:
    >
    > http://www.nezumi.demon.co.uk/photo/jpeg/2/jpeg2.htm


    That is a fascinating page; thanks for sharing.

    --
    -Ryan McGinnis
    The BIG Storm Picture -- http://bigstormpicture.com
    Vortex-2 image licensing at http://vortex-2.com
    Getty: http://www.gettyimages.com/search/search.aspx?artist=Ryan McGinnis

    Ryan McGinnis, Aug 10, 2010
    #7
