Re: Question re jpeg compression

Discussion in 'Digital Photography' started by ray, May 24, 2008.

  1. ray

    ray Guest

    On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:

    > I am aware of the issue with repeated editing and re-saving of jpeg
    > image files, and the diminishing quality this can cause. I mostly work
    > in jpeg all the time as my little humble p&s Fuji only makes jpegs
    > anyway. I notice in the options settings of my editing software there is
    > the ability to set the default compression of jpeg files. In this
    > package it defaults to 90 (on a scale from 1 to 100), but I have seen
    > similar settings in a few other programs also. The help says the higher
    > the setting the higher the quality and the larger the file. What happens
    > if I set the default to 100? Does that mean no compression at all?
    > Without recompression would this do away with the slow drop in quality
    > over repeated saves, in effect making all editing and saving lossless?


    No. It will still do lossy compression. The difference on any one save
    will not be noticeable, but it will accumulate. Best to either always
    start with the original file or change to a lossless format (png is good).
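
    A quick way to see this for yourself is to round-trip an image a few
    times and measure the drift. A rough sketch in Python (assuming Pillow
    is installed; "original.png" is just a stand-in for whatever file you
    start from):

        import io
        from PIL import Image, ImageChops, ImageStat

        img = Image.open("original.png").convert("RGB")   # stand-in source file
        current = img

        for generation in range(10):
            buf = io.BytesIO()
            # quality=100 still rounds DCT coefficients, so the round trip is
            # not bit-exact; subsampling=0 just avoids extra chroma loss.
            current.save(buf, format="JPEG", quality=100, subsampling=0)
            buf.seek(0)
            current = Image.open(buf).convert("RGB")

        diff = ImageChops.difference(img, current)
        print("mean per-channel error after 10 saves:", ImageStat.Stat(diff).mean)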


    >
    > Probably not all that relevant to the question but I am using Microsoft
    > Digital Image Standard Edition 2006 (Library and Editor). I would love
    > to know if this is possible as I love fiddling with my images, and
    > storage space is really not an issue for me.
    >
    > Thanks for any thoughts out there.
     
    ray, May 24, 2008
    #1

  2. ray wrote:

    > On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:
    >
    >> I am aware of the issue with repeated editing and re-saving of jpeg
    >> image files, and the diminishing quality this can cause. I mostly work
    >> in jpeg all the time as my little humble p&s Fuji only makes jpegs
    >> anyway. I notice in the options settings of my editing software there is
    >> the ability to set the default compression of jpeg files. In this
    >> package it defaults to 90 (on a scale from 1 to 100), but I have seen
    >> similar settings in a few other programs also. The help says the higher
    >> the setting the higher the quality and the larger the file. What happens
    >> if I set the default to 100? Does that mean no compression at all?
    >> Without recompression would this do away with the slow drop in quality
    >> over repeated saves, in effect making all editing and saving lossless?

    >
    > No. It will still do lossy compression. The difference on any one save
    > will not be noticeable, but it will accumulate. Best to either always
    > start with the original file or change to a lossless format (png is good).


    This brings up a question I've pondered. If png is lossless (and I'm not
    arguing that point), then why does it offer levels of compression? If
    it's lossless, then quality will be the same for least *and* most
    compression; so why not compress maximally?


    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project: http://improve-usenet.org
    NEW --> Now evaluating a GG-free news feed: http://usenet4all.se
     
    Blinky the Shark, May 24, 2008
    #2

  3. Blinky the Shark wrote:
    []
    > This brings up a question I've pondered. If png is lossless (and I'm
    > not arguing that point), then why does it offer levels of
    > compression? If it's lossless, then quality will be the same for
    > least *and* most compression; so why not compress maximally?


    There is a trade-off between the degree of compression, and the CPU time
    that compression (and decompression) takes. Fast but less compression.
    Slow with more compression. Your choice. There are also some options in
    PNG to take the image line-to-line similarity into account when
    compressing. Again, this takes more time, but may produce better
    compression. The quality is the same in all cases, but more compression
    may reduce the file size.
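
    If you want to see that trade-off in numbers, here is a rough sketch
    (Python with Pillow; "photo.png" is just a placeholder for one of your
    own files). The pixels that come back are identical at every level;
    only the time taken and the file size change:

        import io, time
        from PIL import Image, ImageChops

        img = Image.open("photo.png").convert("RGB")    # placeholder source file

        for level in (1, 6, 9):                         # zlib effort: fast to thorough
            buf = io.BytesIO()
            start = time.perf_counter()
            img.save(buf, format="PNG", compress_level=level)
            elapsed = time.perf_counter() - start
            roundtrip = Image.open(io.BytesIO(buf.getvalue())).convert("RGB")
            lossless = ImageChops.difference(img, roundtrip).getbbox() is None
            print(f"level {level}: {buf.tell()} bytes, {elapsed:.2f}s, lossless={lossless}")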

    Cheers,
    David
     
    David J Taylor, May 24, 2008
    #3
  4. ray

    ray Guest

    On Sat, 24 May 2008 09:41:08 -0700, Blinky the Shark wrote:

    > ray wrote:
    >
    >> On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:
    >>
    >>> I am aware of the issue with repeated editing and re-saving of jpeg
    >>> image files, and the diminishing quality this can cause. I mostly work
    >>> in jpeg all the time as my little humble p&s Fuji only makes jpegs
    >>> anyway. I notice in the options settings of my editing software there
    >>> is the ability to set the default compression of jpeg files. In this
    >>> package it defaults to 90 (on a scale from 1 to 100), but I have seen
    >>> similar settings in a few other programs also. The help says the
    >>> higher the setting the higher the quality and the larger the file.
    >>> What happens if I set the default to 100? Does that mean no
    >>> compression at all? Without recompression would this do away with the
    >>> slow drop in quality over repeated saves, in effect making all editing
    >>> and saving lossless?

    >>
    >> No. It will still do lossy compression. The difference on any one save
    >> will not be noticeable, but it will accumulate. Best to either always
    >> start with the original file or change to a lossless format (png is
    >> good).

    >
    > This brings up a question I've pondered. If png is lossless (and I'm
    > not arguing that point), then why does it offer levels of compression?
    > If it's lossless, then quality will be the same for least *and* most
    > compression; so why not compress maximally?


    Time.
     
    ray, May 24, 2008
    #4
  5. David J Taylor wrote:

    > Blinky the Shark wrote:
    > []
    >> This brings up a question I've pondered. If png is lossless (and I'm
    >> not arguing that point), then why does it offer levels of
    >> compression? If it's lossless, then quality will be the same for
    >> least *and* most compression; so why not compress maximally?

    >
    > There is a trade-off between the degree of compression, and the CPU time
    > that compression (and decompression) takes. Fast but less compression.


    <slaps forehead> Speed of process. I didn't think of that.

    > Slow with more compression. Your choice. There are also some options
    > in PNG to take the image line-to-line similarity into account when
    > compressing. Again, this takes more time, but may produce better
    > compression. The quality is the same in all cases, but more compression
    > may reduce the file size.


    Thanks.

    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project --> http://improve-usenet.org
    Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se
     
    Blinky the Shark, May 24, 2008
    #5
  6. ray wrote:

    > On Sat, 24 May 2008 09:41:08 -0700, Blinky the Shark wrote:
    >
    >> ray wrote:
    >>
    >>> On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:
    >>>
    >>>> I am aware of the issue with repeated editing and re-saving of jpeg
    >>>> image files, and the diminishing quality this can cause. I mostly work
    >>>> in jpeg all the time as my little humble p&s Fuji only makes jpegs
    >>>> anyway. I notice in the options settings of my editing software there
    >>>> is the ability to set the default compression of jpeg files. In this
    >>>> package it defaults to 90 (on a scale from 1 to 100), but I have seen
    >>>> similar settings in a few other programs also. The help says the
    >>>> higher the setting the higher the quality and the larger the file.
    >>>> What happens if I set the default to 100? Does that mean no
    >>>> compression at all? Without recompression would this do away with the
    >>>> slow drop in quality over repeated saves, in effect making all editing
    >>>> and saving lossless?
    >>>
    >>> No. It will still do lossy compression. The difference on any one save
    >>> will not be noticeable, but it will accumulate. Best to either always
    >>> start with the original file or change to a lossless format (png is
    >>> good).

    >>
    >> This brings up a question I've pondered. If png is lossless (and I'm
    >> not arguing that point), then why does it offer levels of compression?
    >> If it's lossless, then quality will be the same for least *and* most
    >> compression; so why not compress maximally?

    >
    > Time.


    It waits for no compression algorithm.

    Thanks. And I just read DJT's more detailed explanation. I'm somewhat
    embarrassed that I did not think of that.

    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project --> http://improve-usenet.org
    Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se
     
    Blinky the Shark, May 24, 2008
    #6
  7. Dave Platt wrote:

    > Blinky the Shark wrote:
    >
    >>This brings up a question I've pondered. If png is lossless (and I'm not
    >>arguing that point), then why does it offer levels of compression? If
    >>it's lossless, then quality will be the same for least *and* most
    >>compression; so why not compress maximally?

    >
    > Maximal compression takes more CPU time during the compression
    > process. Going from "moderate" to "highest" compression quality for
    > PNG may increase the CPU time needed by a factor of several times (3x
    > to 5x I think) while decreasing the size of the compressed data by
    > only a few percent.
    >
    > PNG uses the "deflate" version of the LZ77 lossless compression
    > algorithm. To greatly oversimplify things, the compressed data consists
    > of either:
    >
    > [1] The original bytes of data from the input, unaltered, or
    > [2] Special sequences of codes which mean "Hey, you've seen this
    > sequence of bytes before... you can find the next N bytes by
    > looking back in the data by a distance of XXXX and copying that
    > sequence."


    I was kind of familiar with that process (at that level of simplicity,
    anyhow), but I don't know anything about the various flavors of it.

    > In PNG compression, the furthest that the sequences can "look back" in
    > the data is 32k bytes... this is the size of the "data window" that the
    > decompressing software must keep buffered, so that it can "look back"
    > and copy the data pointed to by the compression sequence.
    >
    > The job of the software which does the compression, is to look through
    > the image, find sequences of bytes which appear more than once, and use
    > this knowledge to create the compressed representation.
    >
    > There will (almost certainly) be many different ways to compress the
    > data... numerous different "Hey, look back XXXX bytes and copy N bytes"
    > sequences which will accurately reproduce the original data. The
    > compressed sequences will vary in their total length... shorter is
    > better.


    I'm still with you...

    > The compressing program gets to decide "how hard it wants to work"...
    > that is, how many different alternative compression sequences it wants
    > to try, to find the one which ends up being the shortest. That takes
    > time. In most cases, it's not "worth the effort" to spend a maximal
    > amount of CPU time looking for the Very Best Compressed Sequence... it's
    > usually possible to do 95% as well, with only 10-20% as much searching
    > effort.


    Gotcha...

    > Back in the days of 40 MHz 486 CPUs, this was a big issue. Nowadays,


    <thinking about my Apple IIe>

    > with multi-gigahertz CPUs, using a higher compression level may be more
    > worthwhile for some users.


    My choke point is my dialup connection. I know, I know......someday I'll
    join this millennium. :)

    > The decompressing software "doesn't care" - it's no more work (and in
    > fact may be a bit less) for the decompression program to handle an
    > optimally-compressed representation than it is to handle one that's a
    > bit more loosey-goosey.
    >
    > As a practical example of the tradeoff: I sometimes burn filesystem
    > backups to CD-R for offsite storage, using a backup utility that
    > incorporates "gzip" compression (another LZ77 variant). I set the


    gzip I'm used to; I didn't know what it was using.

    > compression level to one that's high enough to be useful, but low enough
    > (and fast enough) that it can compress the filesystem data faster than
    > the CD-R drive can burn it to the media. This choice ensures that the
    > CD-R drive doesn't suffer from a buffer underrun while burning.


    I appreciate the extended explanation, Dave. This is not to say I don't
    appreciate the shorter answers I've received, too, of course.
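
    For anyone who wants to poke at the effort-versus-size trade-off Dave
    describes, here is a rough zlib sketch (Python; the input is repetitive
    synthetic text rather than an image, so treat the numbers as
    illustrative only):

        import time, zlib

        # Repetitive data, so the back-reference search has plenty to find.
        data = b"the quick brown fox jumps over the lazy dog " * 50000

        for level in (1, 6, 9):                      # deflate effort levels
            start = time.perf_counter()
            packed = zlib.compress(data, level)
            elapsed = time.perf_counter() - start
            assert zlib.decompress(packed) == data   # lossless at every level
            print(f"level {level}: {len(packed)} bytes in {elapsed:.3f}s")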


    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project --> http://improve-usenet.org
    Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se
     
    Blinky the Shark, May 24, 2008
    #7
  8. Dave Cohen

    Dave Cohen Guest

    Blinky the Shark wrote:
    > David J Taylor wrote:
    >
    >> Blinky the Shark wrote:
    >> []
    >>> This brings up a question I've pondered. If png is lossless (and I'm
    >>> not arguing that point), then why does it offer levels of
    >>> compression? If it's lossless, then quality will be the same for
    >>> least *and* most compression; so why not compress maximally?

    >> There is a trade-off between the degree of compression, and the CPU time
    >> that compression (and decompression) takes. Fast but less compression.

    >
    > <slaps forehead> Speed of process. I didn't think of that.
    >
    >> Slow with more compression. Your choice. There are also some options
    >> in PNG to take the image line-to-line similarity into account when
    >> compressing. Again, this takes more time, but may produce better
    >> compression. The quality is the same in all cases, but more compression
    >> may reduce the file size.

    >
    > Thanks.
    >

    WinZip works the same way (and is of course lossless). I've never changed
    the setting from max compression and WinZip seems to work pretty fast.
    Dave Cohen
     
    Dave Cohen, May 26, 2008
    #8
  9. Dave Cohen wrote:

    > Blinky the Shark wrote:
    >> David J Taylor wrote:
    >>
    >>> Blinky the Shark wrote:
    >>> []
    >>>> This brings up a question I've pondered. If png is lossless (and I'm
    >>>> not arguing that point), then why does it offer levels of
    >>>> compression? If it's lossless, then quality will be the same for
    >>>> least *and* most compression; so why not compress maximally?
    >>> There is a trade-off between the degree of compression, and the CPU time
    >>> that compression (and decompression) takes. Fast but less compression.

    >>
    >> <slaps forehead> Speed of process. I didn't think of that.
    >>
    >>> Slow with more compression. Your choice. There are also some options
    >>> in PNG to take the image line-to-line similarity into account when
    >>> compressing. Again, this takes more time, but may produce better
    >>> compression. The quality is the same in all cases, but more compression
    >>> may reduce the file size.

    >>
    >> Thanks.
    >>

    > WinZip works the same way (and is of course lossless). I've never changed
    > the setting from max compression and WinZip seems to work pretty fast.


    I've never even thought of compressing images with [Win or any other]zip,
    and I first paid for a WinZip registration back in the mid-1990s. :)
    Just for grins, I'll play with that, sometime, against...say, png.


    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project --> http://improve-usenet.org
    Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se
     
    Blinky the Shark, May 26, 2008
    #9
  10. Blinky the Shark wrote:
    []
    > I've never even thought of compressing images with [Win or any
    > other]zip, and I first paid for a WinZip registration back in the
    > mid-1990s. :) Just for grins, I'll play with that, sometime,
    > against...say, png.


    What you /should/ find is that a lossless compression designed
    specifically for images should do better than a general-purpose lossless
    compression algorithm. There have been comparisons published here before,
    and IIRC both lossless JPEG and PNG produced smaller file sizes than Zip.
    I expect you already appreciate that as JPEG is already compressed, there
    will not be much gain from zipping it, and the file size could even
    increase.

    [Note: lossless JPEG is not widely supported - I don't just mean turning
    the quality up to maximum]
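
    A rough way to run that comparison yourself (Python with Pillow;
    "photo.jpg" stands in for one of your own files):

        import io, zlib
        from PIL import Image

        img = Image.open("photo.jpg").convert("RGB")   # placeholder input
        raw = img.tobytes()                            # uncompressed pixel data

        png_buf = io.BytesIO()
        img.save(png_buf, format="PNG")                # image-aware lossless

        with open("photo.jpg", "rb") as f:             # already JPEG-compressed
            jpeg_bytes = f.read()

        print("raw pixels:        ", len(raw))
        print("zlib on raw pixels:", len(zlib.compress(raw, 9)))
        print("PNG:               ", png_buf.tell())
        print("original JPEG:     ", len(jpeg_bytes))
        print("zlib on the JPEG:  ", len(zlib.compress(jpeg_bytes, 9)))

    Typically the image-aware PNG comes out smaller than plain zlib on the
    raw pixels, and zipping the JPEG itself gains next to nothing.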

    Cheers,
    David
     
    David J Taylor, May 26, 2008
    #10
  11. David J Taylor wrote:

    > Blinky the Shark wrote:
    > []
    >> I've never even thought of compressing images with [Win or any
    >> other]zip, and I first paid for a WinZip registration back in the
    >> mid-1990s. :) Just for grins, I'll play with that, sometime,
    >> against...say, png.

    >
    > What you /should/ find is that a lossless compression designed
    > specifically for images should do better than a general-purpose lossless
    > compression algorithm. There have been comparisons published here before,


    That's what I was figuring.

    > and IIRC both lossless JPEG and PNG produced smaller file sizes than


    I thought the consensus was that all jpg is lossy, even a "0" compression?

    > Zip. I expect you already appreciate that as JPEG is already compressed,


    Uh, yeah. :)

    > there will not be much gain from zipping it, and the file size could
    > even increase.


    Right.

    > [Note: lossless JPEG is not widely supported - I don't just mean turning
    > the quality up to maximum]


    Ah! It's a whole 'nother thing?


    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project --> http://improve-usenet.org
    Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se
     
    Blinky the Shark, May 26, 2008
    #11
  12. Blinky the Shark wrote:
    []
    > I thought the consensus was that all jpg is lossy, even a "0"
    > compression?

    []
    >> [Note: lossless JPEG is not widely supported - I don't just mean
    >> turning the quality up to maximum]

    >
    > Ah! It's a whole 'nother thing?


    Yes, there is a truly lossless version of JPEG available, but not widely
    supported. As you say, it is /not/ the same as "0" compression, or 100%
    quality. See:

    http://en.wikipedia.org/wiki/Lossless_JPEG

    I actually have a use for 16-bit lossless JPEG in one of my programs, to
    decode some weather satellite data, and it is used in some medical
    applications.

    http://xmedcon.sourceforge.net/

    Apparently used in DNG as well.....

    Cheers,
    David
     
    David J Taylor, May 26, 2008
    #12
  13. David J Taylor wrote:

    > Blinky the Shark wrote:
    > []
    >> I thought the consensus was that all jpg is lossy, even a "0"
    >> compression?

    > []
    >>> [Note: lossless JPEG is not widely supported - I don't just mean
    >>> turning the quality up to maximum]

    >>
    >> Ah! It's a whole 'nother thing?

    >
    > Yes, there is a truly lossless version of JPEG available, but not widely
    > supported. As you say, it is /not/ the same as "0" compression, or 100%
    > quality. See:
    >
    > http://en.wikipedia.org/wiki/Lossless_JPEG


    Hmmmmm.

    > I actually have a use for 16-bit lossless JPEG in one of my programs, to
    > decode some weather satellite data, and it is used in some medical
    > applications.


    Coincidentally, I just ran across it in a program I installed in the wee
    hours this morning. I got it for IPTC notation, and while going through
    the menus as I was settling into it, I saw a ref to "lossless jpg".

    > http://xmedcon.sourceforge.net/
    >
    > Apparently used in DNG as well.....


    I hadda look that up. I don't do PS.

    --
    Blinky
    Killing all posts from Google Groups
    The Usenet Improvement Project --> http://improve-usenet.org
    Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se
     
    Blinky the Shark, May 26, 2008
    #13
  14. Blinky the Shark wrote:
    > David J Taylor wrote:

    []
    >> Apparently used in DNG as well.....

    >
    > I hadda look that up. I don't do PS.


    Neither do I - so it came as a slight surprise that DNG can use lossless
    JPEG. Whether that's what it normally uses, I don't know.

    Cheers,
    David
     
    David J Taylor, May 27, 2008
    #14
