Re: Question re jpeg compression

 
 
ray      05-24-2008
On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:

> I am aware of the issue with repeated editing and re-saving of jpeg
> image files, and the diminishing quality this can cause. I mostly work
> in jpeg all the time as my little humble p&s Fuji only makes jpegs
> anyway. I notice in the options settings of my editing software there is
> the ability to set the default compression of jpeg files. In this
> package it defaults to 90 (on a scale from 1 to 100), but I have seen
> similar settings in a few other programs also. The help says the higher
> the setting the higher the quality and the larger the file. What happens
> if I set the default to 100? Does that mean no compression at all?
> Without recompression would this do away with the slow drop in quality
> over repeated saves, in effect making all editing and saving lossless?


No. It will still do lossy compression. The difference on any one save
will not be noticeable, but it will accumulate. Best to either always
start with the original file or change to a lossless format (png is good).
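
If you want to see this for yourself, here is a minimal sketch using the Pillow library in Python (photo.jpg stands in for any JPEG of your own; exact results depend on the encoder, but even quality 100 normally changes pixels):

from PIL import Image, ImageChops

original = Image.open("photo.jpg").convert("RGB")
original.save("resaved_q100.jpg", quality=100)    # "maximum quality" JPEG save
resaved = Image.open("resaved_q100.jpg").convert("RGB")

# getbbox() returns None only if every pixel still matches the original
print("pixels identical:", ImageChops.difference(original, resaved).getbbox() is None)

original.save("copy.png")                         # a PNG save round-trips exactly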


>
> Probably not all that relevant to the question but I am using Microsoft
> Digital Image Standard Edition 2006 (Library and Editor). I would love
> to know if this is possible as I love fiddling with my images, and
> storage space is really not an issue for me.
>
> Thanks for any thoughts out there.


 
Blinky the Shark      05-24-2008
ray wrote:

> On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:
>
>> I am aware of the issue with repeated editing and re-saving of jpeg
>> image files, and the diminishing quality this can cause. I mostly work
>> in jpeg all the time as my little humble p&s Fuji only makes jpegs
>> anyway. I notice in the options settings of my editing software there is
>> the ability to set the default compression of jpeg files. In this
>> package it defaults to 90 (on a scale from 1 to 100), but I have seen
>> similar settings in a few other programs also. The help says the higher
>> the setting the higher the quality and the larger the file. What happens
>> if I set the default to 100? Does that mean no compression at all?
>> Without recompression would this do away with the slow drop in quality
>> over repeated saves, in effect making all editing and saving lossless?

>
> No. It will still do lossy compression. The difference on any one save
> will not be noticeable, but it will accumulate. Best to either always
> start with the original file or change to a lossless format (png is good).


This brings up a question I've pondered. If png is lossless (and I'm not
arguing that point), then why does it offer levels of compression? If
it's lossless, then quality will be the same for least *and* most
compression; so why not compress maximally?


--
Blinky
Killing all posts from Google Groups
The Usenet Improvement Project: http://improve-usenet.org
NEW --> Now evaluating a GG-free news feed: http://usenet4all.se

 
David J Taylor      05-24-2008
Blinky the Shark wrote:
[]
> This brings up a question I've pondered. If png is lossless (and I'm
> not arguing that point), then why does it offer levels of
> compression? If it's lossless, then quality will be the same for
> least *and* most compression; so why not compress maximally?


There is a trade-off between the degree of compression, and the CPU time
that compression (and decompression) takes. Fast but less compression.
Slow with more compression. Your choice. There are also some options in
PNG to take the image line-to-line similarity into account when
compressing. Again, this takes more time, but may produce better
compression. The quality is the same in all cases, but more compression
may reduce the file size.
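
For anyone curious, a rough sketch of that trade-off with the Pillow library in Python (compress_level runs from 0 to 9; photo.png is a placeholder, and the timings depend entirely on your machine):

import os, time
from PIL import Image

img = Image.open("photo.png")
for level in (1, 6, 9):                  # quick, default and maximum deflate effort
    start = time.perf_counter()
    img.save("out.png", compress_level=level)
    print("level", level, ":", os.path.getsize("out.png"), "bytes,",
          round(time.perf_counter() - start, 3), "s")

The file decodes to identical pixels at every level; only the size and the time change.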

Cheers,
David


 
ray      05-24-2008
On Sat, 24 May 2008 09:41:08 -0700, Blinky the Shark wrote:

> ray wrote:
>
>> On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:
>>
>>> I am aware of the issue with repeated editing and re-saving of jpeg
>>> image files, and the diminishing quality this can cause. I mostly work
>>> in jpeg all the time as my little humble p&s Fuji only makes jpegs
>>> anyway. I notice in the options settings of my editing software there
>>> is the ability to set the default compression of jpeg files. In this
>>> package it defaults to 90 (on a scale from 1 to 100), but I have seen
>>> similar settings in a few other programs also. The help says the
>>> higher the setting the higher the quality and the larger the file.
>>> What happens if I set the default to 100? Does that mean no
>>> compression at all? Without recompression would this do away with the
>>> slow drop in quality over repeated saves, in effect making all editing
>>> and saving lossless?

>>
>> No. It will still do lossy compression. The difference on any one save
>> will not be noticeable, but it will accumulate. Best to either always
>> start with the original file or change to a lossless format (png is
>> good).

>
> This brings up a question I've pondered. If png is lossless (and I'm
> not arguing that point), then why does it offer levels of compression?
> If it's lossless, then quality will be the same for least *and* most
> compression; so why not compress maximally?


Time.
 
Blinky the Shark      05-24-2008
David J Taylor wrote:

> Blinky the Shark wrote:
> []
>> This brings up a question I've pondered. If png is lossless (and I'm
>> not arguing that point), then why does it offer levels of
>> compression? If it's lossless, then quality will be the same for
>> least *and* most compression; so why not compress maximally?

>
> There is a trade-off between the degree of compression, and the CPU time
> that compression (and decompression) takes. Fast but less compression.


<slaps forehead> Speed of process. I didn't think of that.

> Slow with more compression. Your choice. There are also some options
> in PNG to take the image line-to-line similarity into account when
> compressing. Again, this takes more time, but may produce better
> compression. The quality is the same in all cases, but more compression
> may reduce the file size.


Thanks.

--
Blinky
Killing all posts from Google Groups
The Usenet Improvement Project --> http://improve-usenet.org
Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se

 
Blinky the Shark      05-24-2008
ray wrote:

> On Sat, 24 May 2008 09:41:08 -0700, Blinky the Shark wrote:
>
>> ray wrote:
>>
>>> On Sat, 24 May 2008 22:02:15 +1200, Peter in New Zealand wrote:
>>>
>>>> I am aware of the issue with repeated editing and re-saving of jpeg
>>>> image files, and the diminishing quality this can cause. I mostly work
>>>> in jpeg all the time as my little humble p&s Fuji only makes jpegs
>>>> anyway. I notice in the options settings of my editing software there
>>>> is the ability to set the default compression of jpeg files. In this
>>>> package it defaults to 90 (on a scale from 1 to 100), but I have seen
>>>> similar settings in a few other programs also. The help says the
>>>> higher the setting the higher the quality and the larger the file.
>>>> What happens if I set the default to 100? Does that mean no
>>>> compression at all? Without recompression would this do away with the
>>>> slow drop in quality over repeated saves, in effect making all editing
>>>> and saving lossless?
>>>
>>> No. It will still do lossy compression. The difference on any one save
>>> will not be noticeable, but it will accumulate. Best to either always
>>> start with the original file or change to a lossless format (png is
>>> good).

>>
>> This brings up a question I've pondered. If png is lossless (and I'm
>> not arguing that point), then why does it offer levels of compression?
>> If it's lossless, then quality will be the same for least *and* most
>> compression; so why not compress maximally?

>
> Time.


It waits for no compression algorithm.

Thanks. And I just read DJT's more detailed explanation. I'm somewhat
embarrassed that I did not think of that.

--
Blinky
Killing all posts from Google Groups
The Usenet Improvement Project --> http://improve-usenet.org
Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se

 
Blinky the Shark      05-24-2008
Dave Platt wrote:

> In article <(E-Mail Removed) .net>,
> Blinky the Shark <(E-Mail Removed)> wrote:
>
>>This brings up a question I've pondered. If png is lossless (and I'm not
>>arguing that point), then why does it offer levels of compression? If
>>it's lossless, then quality will be the same for least *and* most
>>compression; so why not compress maximally?

>
> Maximal compression takes more CPU time during the compression
> process. Going from "moderate" to "highest" compression quality for
> PNG may increase the CPU time needed by a factor of several times (3x
> to 5x I think) while decreasing the size of the compressed data by
> only a few percent.
>
> PNG uses the "deflate" version of the LZ77 lossless compression
> algorithm. To greatly oversimplify things, the compressed data consists
> of either:
>
> [1] The original bytes of data from the input, unaltered, or
> [2] Special sequences of codes which mean "Hey, you've seen this
> sequence of bytes before... you can find the next N bytes by
> looking back in the data by a distance of XXXX and copying that
> sequence."


I was kind of familiar with that process (at that level of simplicity,
anyhow), but I don't know anything about the various flavors of it.

> In PNG compression, the furthest that the sequences can "look back" in
> the data is 32k bytes... this is the size of the "data window" that the
> decompressing software must keep buffered, so that it can "look back"
> and copy the data pointed to by the compression sequence.
>
> The job of the software which does the compression is to look through
> the image, find sequences of bytes which appear more than once, and use
> this knowledge to create the compressed representation.
>
> There will (almost certainly) be many different ways to compress the
> data... numerous different "Hey, look back XXXX bytes and copy N bytes"
> sequences which will accurately reproduce the original data. The
> compressed sequences will vary in their total length... shorter is
> better.


I'm still with you...

> The compressing program gets to decide "how hard it wants to work"...
> that is, how many different alternative compression sequences it wants
> to try, to find the one which ends up being the shortest. That takes
> time. In most cases, it's not "worth the effort" to spend a maximal
> amount of CPU time looking for the Very Best Compressed Sequence... it's
> usually possible to do 95% as well, with only 10-20% as much searching
> effort.


Gotcha...

> Back in the days of 40 MHz 486 CPUs, this was a big issue. Nowadays,


<thinking about my Apple IIe>

> with multi-gigahertz CPUs, using a higher compression level may be more
> worthwhile for some users.


My choke point is my dialup connection. I know, I know......someday I'll
join this millennium.

> The decompressing software "doesn't care" - it's no more work (and in
> fact may be a bit less) for the decompression program to handle an
> optimally-compressed representation than it is to handle one that's a
> bit more loosey-goosey.
>
> As a practical example of the tradeoff: I sometimes burn filesystem
> backups to CD-R for offsite storage, using a backup utility that
> incorporates "gzip" compression (another LZ77 variant). I set the


gzip I'm used to; I didn't know what it was using.

> compression level to one that's high enough to be useful, but low enough
> (and fast enough) that it can compress the filesystem data faster than
> the CD-R drive can burn it to the media. This choice ensures that the
> CD-R drive doesn't suffer from a buffer underrun while burning.


I appreciate the extended explanation, Dave. This is not to say I don't
appreciate the shorter answers I've received, too, of course.
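
For what it's worth, that "how hard to work" trade-off is easy to try with Python's zlib module, which implements the same deflate scheme PNG uses. This is only a sketch, with photo.png as a placeholder:

import time, zlib
from PIL import Image

# raw pixel bytes stand in for the data a PNG encoder would compress
data = Image.open("photo.png").convert("RGB").tobytes()
for level in (1, 6, 9):
    start = time.perf_counter()
    size = len(zlib.compress(data, level))
    print("level", level, ":", size, "bytes,",
          round(time.perf_counter() - start, 3), "s")

Higher levels search harder for those "look back" matches, so they take longer for a usually modest saving.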


--
Blinky
Killing all posts from Google Groups
The Usenet Improvement Project --> http://improve-usenet.org
Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se

 
Dave Cohen      05-26-2008
Blinky the Shark wrote:
> David J Taylor wrote:
>
>> Blinky the Shark wrote:
>> []
>>> This brings up a question I've pondered. If png is lossless (and I'm
>>> not arguing that point), then why does it offer levels of
>>> compression? If it's lossless, then quality will be the same for
>>> least *and* most compression; so why not compress maximally?

>> There is a trade-off between the degree of compression, and the CPU time
>> that compression (and decompression) takes. Fast but less compression.

>
> <slaps forehead> Speed of process. I didn't think of that.
>
>> Slow with more compression. Your choice. There are also some options
>> in PNG to take the image line-to-line similarity into account when
>> compressing. Again, this takes more time, but may produce better
>> compression. The quality is the same in all cases, but more compression
>> may reduce the file size.

>
> Thanks.
>

WinZip works the same way (and is of course lossless). I've never changed
the setting from max compression, and WinZip seems to work pretty fast.
Dave Cohen
 
Blinky the Shark      05-26-2008
Dave Cohen wrote:

> Blinky the Shark wrote:
>> David J Taylor wrote:
>>
>>> Blinky the Shark wrote:
>>> []
>>>> This brings up a question I've pondered. If png is lossless (and I'm
>>>> not arguing that point), then why does it offer levels of
>>>> compression? If it's lossless, then quality will be the same for
>>>> least *and* most compression; so why not compress maximally?
>>> There is a trade-off between the degree of compression, and the CPU time
>>> that compression (and decompression) takes. Fast but less compression.

>>
>> <slaps forehead> Speed of process. I didn't think of that.
>>
>>> Slow with more compression. Your choice. There are also some options
>>> in PNG to take the image line-to-line similarity into account when
>>> compressing. Again, this takes more time, but may produce better
>>> compression. The quality is the same in all cases, but more compression
>>> may reduce the file size.

>>
>> Thanks.
>>

> Winzip works same way (and is of course lossless). I've never changed
> setting from max compression and winzip seems to work pretty fast.


I've never even thought of compressing images with [Win or any other]zip,
and I first paid for a WinZip registration back in the mid-1990s.
Just for grins, I'll play with that, sometime, against...say, png.


--
Blinky
Killing all posts from Google Groups
The Usenet Improvement Project --> http://improve-usenet.org
Found 5/08: a free GG-blocking news *feed* --> http://usenet4all.se

 
David J Taylor      05-26-2008
Blinky the Shark wrote:
[]
> I've never even thought of compressing images with [Win or any
> other]zip, and I first paid for a WinZip registration back in the
> mid-1990s. Just for grins, I'll play with that, sometime,
> against...say, png.


What you /should/ find is that a lossless compression designed
specifically for images should do better than a general-purpose lossless
compression algorithm. There have been comparisons published here before,
and IIRC both lossless JPEG and PNG produced smaller file sizes than Zip.
I expect you already appreciate that as JPEG is already compressed, there
will not be much gain from zipping it, and the file size could even
increase.

[Note: lossless JPEG is not widely supported - I don't just mean turning
the quality up to maximum]
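
As a quick check of the "zipping a JPEG gains little" point, a sketch with Python's zlib (deflate is what zip tools normally use; photo.jpg is a placeholder):

import zlib

raw = open("photo.jpg", "rb").read()          # already JPEG-compressed bytes
packed = zlib.compress(raw, 9)                # maximum deflate effort
print(len(raw), "->", len(packed), "bytes")   # usually little or no saving, sometimes slight growth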

Cheers,
David


 