Re: JPEG 9 new lossless JPEG standard

 
 
Martin Brown
      01-23-2013
On 22/01/2013 21:38, Alfred Molon wrote:
> There is a new lossless JPEG standard released just now:
> http://www.infai.org/jpeg
>
> Does anybody know more (performance, compression etc.)?


Performance will be better than the original lossless JPEG (which was
pretty terrible and in practice almost never used outside a handful of
niche markets). PsPro8 will write a JPEG lossless file and give it the
.JPG extension, thus crashing almost every other JPEG decoder.

In common usage JPEG came to mean lossy JPEG, and I hope that the IJG
are giving their new format a distinctive name like .JPGL, to avoid
codecs being crashed by a format they can't make sense of, a la PsP8.

The new standard probably gives compression for 24-bit RGB images
broadly comparable with PNG, or slightly better if they have done it
right. I have not had time to play with the new release yet.

--
Regards,
Martin Brown
 
Joe Kotroczo
      01-23-2013
On 23/01/2013 09:11, Martin Brown wrote:
> On 22/01/2013 21:38, Alfred Molon wrote:
>> There is a new lossless JPEG standard released just now:
>> http://www.infai.org/jpeg
>>
>> Does anybody know more (performance, compression etc.)?

>
> Performance will be better than the original lossless JPEG (which was
> pretty terrible and in practice almost never used outside a handful of
> niche markets). PsPro8 will write a JPEG lossless file and give it the
> .JPG extension thus crashing almost every other JPEG decoder.
>
> JPEG came to always mean lossy JPEG in common usage and I hope that IJG
> are giving their new format a distinctive name like .JPGL to avoid
> codecs crashed by a format that they can't make sense of a la PsP8.


Any application that crashes because a file's content doesn't match
what its filename extension led it to expect is broken.

As is an operating system that relies solely on filename extensions to
figure out file types.
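
For illustration, here is a minimal Python sketch (simplified marker
handling, no attempt at full robustness) of sniffing a JPEG's actual
frame type from its content rather than trusting the .JPG extension:

import struct

# Start-of-frame markers and what they imply for a decoder.
SOF_TYPES = {0xC0: "baseline lossy", 0xC1: "extended sequential",
             0xC2: "progressive", 0xC3: "lossless"}

def jpeg_kind(path):
    """Walk a JPEG's marker segments and report its SOF type, or
    None if the file is not a JPEG at all.  A decoder that checked
    this could refuse a lossless stream gracefully instead of
    crashing on it."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:2] != b"\xff\xd8":          # no SOI marker: not a JPEG
        return None
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:              # corrupt marker stream
            return None
        marker = data[i + 1]
        if marker == 0xFF:               # padding byte, skip it
            i += 1
            continue
        if marker in SOF_TYPES:
            return SOF_TYPES[marker]
        if marker in (0xD9, 0xDA):       # EOI or SOS before any SOF
            return None
        seg_len = struct.unpack(">H", data[i + 2:i + 4])[0]
        i += 2 + seg_len                 # skip the segment body
    return None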


--
audentes fortuna iuvat
 
Martin Brown
      01-23-2013
On 23/01/2013 09:22, Joe Kotroczo wrote:
> On 23/01/2013 09:11, Martin Brown wrote:
>> On 22/01/2013 21:38, Alfred Molon wrote:
>>> There is a new lossless JPEG standard released just now:
>>> http://www.infai.org/jpeg
>>>
>>> Does anybody know more (performance, compression etc.)?

>>
>> Performance will be better than the original lossless JPEG (which was
>> pretty terrible and in practice almost never used outside a handful of
>> niche markets). PsPro8 will write a JPEG lossless file and give it the
>> .JPG extension thus crashing almost every other JPEG decoder.
>>
>> JPEG came to always mean lossy JPEG in common usage and I hope that IJG
>> are giving their new format a distinctive name like .JPGL to avoid
>> codecs crashed by a format that they can't make sense of a la PsP8.

>
> Any application which crashes because the content of a file doesn't
> match what it expected due to the file's filename extension is broken.
>
> As is an operating system that solely relies on filename extensions to
> figure out file type.


I agree, but Microsoft and the commonly used IJG codec both baulk on
lossless-JPG streams with a .JPG extension. Malformed JPG files have
been used as a vector for hostile executable code in the past. It
confused end users no end, since they had .JPG files that the decoder
refused to decode and that, with some older codecs, actually crashed.

Relevant prior art is the JPEG-LS scheme called LOCO, from HP:
http://www.hpl.hp.com/loco/

Full paper at http://www.hpl.hp.com/loco/HPL-98-193R1.pdf

I hope JPEG9 shows how it compares on the same test data. These
schemes are used for some scientific image telemetry:

http://www.hpl.hp.com/news/2004/jan-mar/hp_mars.html
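
For anyone curious, the heart of LOCO/JPEG-LS is a tiny "median edge
detector" predictor. A rough Python sketch (ignoring the standard's
context modelling and image-border rules):

def med_predict(a, b, c):
    # JPEG-LS median edge detector.  a = left neighbour,
    # b = neighbour above, c = neighbour above-left.
    if c >= max(a, b):
        return min(a, b)      # looks like an edge: predict along it
    if c <= min(a, b):
        return max(a, b)
    return a + b - c          # smooth area: planar prediction

# The entropy coder then codes the small residuals
# pixel - med_predict(a, b, c) instead of the raw pixels.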


--
Regards,
Martin Brown
 
Kevin McMurtrie
      01-26-2013
In article <(E-Mail Removed)>,
Alan Browne <(E-Mail Removed)> wrote:

> On 2013.01.23 03:11 , Martin Brown wrote:
> > On 22/01/2013 21:38, Alfred Molon wrote:
> >> There is a new lossless JPEG standard released just now:
> >> http://www.infai.org/jpeg
> >>
> >> Does anybody know more (performance, compression etc.)?

> >
> > Performance will be better than the original lossless JPEG (which was
> > pretty terrible and in practice almost never used outside a handful of
> > niche markets). PsPro8 will write a JPEG lossless file and give it the
> > .JPG extension thus crashing almost every other JPEG decoder.
> >
> > JPEG came to always mean lossy JPEG in common usage and I hope that IJG
> > are giving their new format a distinctive name like .JPGL to avoid
> > codecs crashed by a format that they can't make sense of a la PsP8.
> >
> > The new standard probably gives compression for 24bit rgb images broadly
> > comparable with or if they have done it right slightly better than PNG.
> > I have not had time to play with the new release yet.

>
> Frankly, I don't care if JPG is slightly lossy. If it's important I
> have it in another lossless (and higher DR) format (tif, raw, dng ...).
>
> Hopefully a tool to turn the JPG-9 encoded files to lossy JPG files will
> soon emerge. A simple line command would be fine...


Most lossless compression algorithms only work on data words that are
about 8 bits. That's why high efficiency lossless compression on high
dynamic range images is uncommon. A lot of work has to be done to
convert the 16, 24, or 32 bit data into fewer bits in a way that
enhances compression rather than hinders it.
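
As a toy illustration of that kind of work (Python, synthetic data,
not any particular codec): simply splitting 16-bit samples into
separate high-byte and low-byte planes before a byte-oriented
compressor usually helps, because each plane is internally far more
regular than the interleaved stream:

import random
import zlib

# A synthetic 16-bit "image": a smooth ramp plus a little sensor noise.
w = h = 256
samples = [(x + y) * 100 + random.randrange(16)
           for y in range(h) for x in range(w)]

interleaved = b"".join(s.to_bytes(2, "big") for s in samples)

# Byte-plane split: all the high bytes, then all the low bytes.
planes = bytes(s >> 8 for s in samples) + bytes(s & 0xFF for s in samples)

print(len(zlib.compress(interleaved, 9)))  # hi/lo bytes alternating
print(len(zlib.compress(planes, 9)))       # planes separated: typically
                                           # noticeably smaller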
--
I will not see posts from Google because I must filter them as spam
 
Wolfgang Weisselberg
      01-27-2013
Kevin McMurtrie <(E-Mail Removed)> wrote:

> Most lossless compression algorithms only work on data words that are
> about 8 bits. That's why high efficiency lossless compression on high
> dynamic range images is uncommon.


Logic failure. UCS16 can easily be losslessly compressed by
common lossless compressors, for example.

> A lot of work has to be done to
> convert the 16, 24, or 32 bit data into fewer bits in a way that
> enhances compression rather than hinders it.


Or one simply uses a compression algorithm that has no
problems if words are larger than an octet. (The "typical"
image is 24-bit, btw, if it has colour.) Not to mention that a
compressor would only care about how large a data word is if
it needs to understand the data.
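
A quick demonstration (Python, with zlib standing in for any common
deflate-based compressor):

import zlib

text = "the quick brown fox jumps over the lazy dog " * 200
utf8  = text.encode("utf-8")
utf16 = text.encode("utf-16-le")   # 16-bit code units

print(len(utf8),  len(zlib.compress(utf8, 9)))
print(len(utf16), len(zlib.compress(utf16, 9)))
# Both collapse to a tiny fraction of their size: deflate finds the
# repeats whether the underlying words are 8 or 16 bits wide.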

-Wolfgang
 
Martin Brown
      01-28-2013
On 26/01/2013 20:23, Kevin McMurtrie wrote:
> In article <(E-Mail Removed)>,
> Alan Browne <(E-Mail Removed)> wrote:
>
>> On 2013.01.23 03:11 , Martin Brown wrote:
>>> On 22/01/2013 21:38, Alfred Molon wrote:
>>>> There is a new lossless JPEG standard released just now:
>>>> http://www.infai.org/jpeg
>>>>
>>>> Does anybody know more (performance, compression etc.)?
>>>
>>> Performance will be better than the original lossless JPEG (which was
>>> pretty terrible and in practice almost never used outside a handful of
>>> niche markets). PsPro8 will write a JPEG lossless file and give it the
>>> .JPG extension thus crashing almost every other JPEG decoder.
>>>
>>> JPEG came to always mean lossy JPEG in common usage and I hope that IJG
>>> are giving their new format a distinctive name like .JPGL to avoid
>>> codecs crashed by a format that they can't make sense of a la PsP8.
>>>
>>> The new standard probably gives compression for 24bit rgb images broadly
>>> comparable with or if they have done it right slightly better than PNG.
>>> I have not had time to play with the new release yet.

>>
>> Frankly, I don't care if JPG is slightly lossy. If it's important I
>> have it in another lossless (and higher DR) format (tif, raw, dng ...).
>>
>> Hopefully a tool to turn the JPG-9 encoded files to lossy JPG files will
>> soon emerge. A simple line command would be fine...


The JPEG9 codec still includes all the original JPEG standard stuff
*and* in addition a new lossless encoder and a colourspace it calls
RGB1 that allows better lossless compression of RGB images. I haven't
tried it out yet. Roundtuit problem.
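
I don't know the details of RGB1, but as an illustration of the
general idea, here is the reversible colour transform (RCT) from
JPEG 2000, which losslessly decorrelates RGB into a luma-like channel
plus two differences (whether RGB1 resembles it is pure guesswork on
my part):

def rct_forward(r, g, b):
    # JPEG 2000 reversible colour transform; integer-exact.
    y = (r + 2 * g + b) >> 2
    return y, b - g, r - g

def rct_inverse(y, u, v):
    g = y - ((u + v) >> 2)     # >> 2 floors, also for negative u + v,
    return v + g, g, u + g     # which is what exact inversion needs

# Exact round trip on a few sample pixels:
for rgb in [(0, 0, 0), (255, 0, 128), (12, 200, 99), (255, 255, 255)]:
    assert rct_inverse(*rct_forward(*rgb)) == rgb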
>
> Most lossless compression algorithms only work on data words that are
> about 8 bits. That's why high efficiency lossless compression on high
> dynamic range images is uncommon. A lot of work has to be done to
> convert the 16, 24, or 32 bit data into fewer bits in a way that
> enhances compression rather than hinders it.


That is somewhat misleading. The original draft lossy JPEG standard
provided for images using 8-bit or 12-bit input data, and the IJG
codec can be compiled for the latter case. 12-bit lossy JPEG is
seldom seen.

The original lossless JPEG standard also allowed for lossless encoding
of data at any bit depth from 2 to 16 bits. The problem was that there
were already other lossless encoders about that were as good or better,
whereas the lossy JPEG high-compression encoding was new and extremely
useful, with almost no perceptual losses and *much* smaller file sizes.

IJG making a free implementation publicly available made it the
de facto standard for (pre)web images in the days when a really fast
dial-up modem could manage up to 2kb/s on a good day with a trailing
wind.

Variations on the theme of JPEG-LS, extended by HP and other
researchers, are used for lossless compression of image data from
space probes and for archiving digital X-rays, but are seldom (never?)
seen in consumer kit.

The problem for lossless algorithms in general is that they spend an
inordinate amount of their space and time budget faithfully preserving
exactly the thermal noise from the imaging system.
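
A toy demonstration of that last point (Python; the same smooth ramp,
clean and with three bits of noise per pixel, handed to deflate):

import random
import zlib

w = h = 256
clean = bytes(((x + y) >> 1) & 0xFF for y in range(h) for x in range(w))
noisy = bytes((((x + y) >> 1) + random.randrange(8)) & 0xFF
              for y in range(h) for x in range(w))

print(len(zlib.compress(clean, 9)))  # tiny: pure structure
print(len(zlib.compress(noisy, 9)))  # tens of kilobytes: every noise
                                     # bit must be stored exactly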

--
Regards,
Martin Brown
 
Martin Brown
      01-28-2013
On 28/01/2013 18:30, Alfred Molon wrote:

> Will JPEG still be widespread in 50 years?


Impossible to say, but there is no reason why it should not survive.

The IJG JPEG codec is freely available in source code form, and the
nasty blocking patents on things that would improve it will time out.

Wavelets could in principle do slightly better in terms of higher
fidelity at a smaller size, but JPEG is basically good enough for all
consumer grade imaging. The extent to which JPEG has spread across the
web pretty much ensures that there will be decoders for the
foreseeable future, although 50 years is perhaps a bit of a stretch.

My instinct is that if J2k were going to take off on the web it would
have done so by now. If you look back in time, I was an advocate for it:

http://www.nezumi.demon.co.uk/photo/j2k/j2k_v_jpeg.htm

J2k works significantly better than JPEG at the highest quality
settings, but the gains were not sufficient to overcome the inertia
and various patent litigation barriers. Various image apps do include
a J2k codec today.

--
Regards,
Martin Brown
 
Kevin McMurtrie
      01-29-2013
In article <(E-Mail Removed)>,
Wolfgang Weisselberg <(E-Mail Removed)> wrote:

> Kevin McMurtrie <(E-Mail Removed)> wrote:
>
> > Most lossless compression algorithms only work on data words that are
> > about 8 bits. That's why high efficiency lossless compression on high
> > dynamic range images is uncommon.

>
> Logic failure. UCS16 can easily be losslessly compressed by
> common lossless compressors, for example.
>
> > A lot of work has to be done to
> > convert the 16, 24, or 32 bit data into fewer bits in a way that
> > enhances compression rather than hinders it.

>
> Or one simply uses a compression algorithm that has no
> problems if words are larger than an octet. (The "typical"
> image is 24bit btw. if it has colour.) Not to mention that a
> compressor would only care about how large a data word is if
> they need to understand the data.
>
> -Wolfgang


The final compression stage is usually something simple like 'deflate'.
It does very much care what the bytes look like. If you were to dump
out an interleaved RGB image with 16 bits per pixel per channel to a
file and compress it with bzip2 or gzip, you'd find that not much
happens. It might even get a few bytes larger. PNG puts a simple
predictive filter in front of deflate so that typical images produce
simpler patterns at the byte level. Lossless JPEG attempts to create
simple patterns representing error correction for the lossy conversion.

Deflate and bzip2 only work well at reducing 8 bit words. Getting a 16
or 24 bit per pixel per channel image down to clean patterns of 8 bits
takes some work that's very specific to image processing.
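
A toy sketch of the idea behind PNG's "Sub" filter (Python, synthetic
data): delta-coding each byte against its left-hand neighbour before
deflate turns a smooth gradient into a stream of near-constant deltas:

import random
import zlib

# A smooth 8-bit gradient with a little noise, one byte per pixel.
w, h = 512, 256
raw = bytes((x * 3 + y + random.randrange(4)) & 0xFF
            for y in range(h) for x in range(w))

# "Sub" filter: each byte becomes its difference from the byte to
# its left (modulo 256), leaving deflate a tiny alphabet of deltas.
filtered = bytearray(raw)
for y in range(h):
    row = y * w
    for x in range(1, w):
        filtered[row + x] = (raw[row + x] - raw[row + x - 1]) & 0xFF

print(len(zlib.compress(raw, 9)))              # barely shrinks
print(len(zlib.compress(bytes(filtered), 9)))  # a small fraction of that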
--
I will not see posts from Google because I must filter them as spam
 
Wolfgang Weisselberg
      02-04-2013
Kevin McMurtrie <(E-Mail Removed)> wrote:
> Wolfgang Weisselberg <(E-Mail Removed)> wrote:
>> Kevin McMurtrie <(E-Mail Removed)> wrote:


>> > Most lossless compression algorithms only work on data words that are
>> > about 8 bits. That's why high efficiency lossless compression on high
>> > dynamic range images is uncommon.


>> Logic failure. UCS16 can easily be losslessly compressed by
>> common lossless compressors, for example.


>> > A lot of work has to be done to
>> > convert the 16, 24, or 32 bit data into fewer bits in a way that
>> > enhances compression rather than hinders it.


>> Or one simply uses a compression algorithm that has no
>> problems if words are larger than an octet. (The "typical"
>> image is 24bit btw. if it has colour.) Not to mention that a
>> compressor would only care about how large a data word is if
>> they need to understand the data.


> The final compression stage is usually something simple like 'deflate'.
> It does very much care what the bytes look like.


Deflate doesn't care at all. Only the achievable compression
ratio may be suboptimal.

> If you were to dump
> out an interleaved RGB image with 16 bits per pixel per channel to a
> file and compress it with bzip2 or gzip, you'd find that not much
> happens. It might even get a few bytes larger. PNG puts a simple
> predictive filter in front of deflate so that typical images produce
> simpler patterns at the byte level. Lossless JPEG attempts to create
> simple patterns representing error correction for the lossy conversion.


And?

> Deflate and bzip2 only work well at reducing 8 bit words.


So we have moved from your earlier "only work" to "only work
well". That's at least the right direction.

For example, deflate (the algorithm) nowhere depends on
"8 bit words"; the symbols can be of arbitrary size.

The same is true for the bzip2 algorithm.


> Getting a 16
> or 24 bit per pixel per channel image down to clean patterns of 8 bits
> takes some work that's very specific to image processing.


What work would that be?

> I will not see posts from Google because I must filter them as spam


Tell the guy who's pointing a pistol at you, forcing you to
filter posts from Google to go away. Then it will be *your*
choice if you want to filter posts from Google.

-Wolfgang
 
Kevin McMurtrie
      02-05-2013
In article <(E-Mail Removed)>,
Wolfgang Weisselberg <(E-Mail Removed)> wrote:

> Kevin McMurtrie <(E-Mail Removed)> wrote:
> > Wolfgang Weisselberg <(E-Mail Removed)> wrote:
> >> Kevin McMurtrie <(E-Mail Removed)> wrote:

>
> >> > Most lossless compression algorithms only work on data words that are
> >> > about 8 bits. That's why high efficiency lossless compression on high
> >> > dynamic range images is uncommon.

>
> >> Logic failure. UCS16 can easily be losslessly compressed by
> >> common lossless compressors, for example.

>
> >> > A lot of work has to be done to
> >> > convert the 16, 24, or 32 bit data into fewer bits in a way that
> >> > enhances compression rather than hinders it.

>
> >> Or one simply uses a compression algorithm that has no
> >> problems if words are larger than an octet. (The "typical"
> >> image is 24bit btw. if it has colour.) Not to mention that a
> >> compressor would only care about how large a data word is if
> >> they need to understand the data.

>
> > The final compression stage is usually something simple like 'deflate'.
> > It does very much care what the bytes look like.

>
> Deflate doesn't care at all. Only the archivable compression
> rate may be suboptimal.
>
> > If you were to dump
> > out an interleaved RGB image with 16 bits per pixel per channel to a
> > file and compress it with bzip2 or gzip, you'd find that not much
> > happens. It might even get a few bytes larger. PNG puts a simple
> > predictive filter in front of deflate so that typical images produce
> > simpler patterns at the byte level. Lossless JPEG attempts to create
> > simple patterns representing error correction for the lossy conversion.

>
> And?
>
> > Deflate and bzip2 only work well at reducing 8 bit words.

>
> So we are not at "only work well" from your previous "only
> work". That's at least the right direction.
>
> For example deflate (the algorithm) is nowhere dependent on
> "8 bit words". The characters can be of arbitrary size.
>
> The same is true for the bzip2 algorithm.


OK, you should call up the GIF, PNG, JPEG, MPEG, and FLAC folks to tell
them of your brilliant discovery. I bet they'll feel silly for all the
work they did coming up with algorithms to prepare data for compression.
Thanks for saving the Internet.


>
> > Getting a 16
> > or 24 bit per pixel per channel image down to clean patterns of 8 bits
> > takes some work that's very specific to image processing.

>
> What work would that be?
>
> > I will not see posts from Google because I must filter them as spam

>
> Tell the guy who's pointing a pistol at you, forcing you to
> filter posts from Google to go away. Then it will be *your*
> choice if you want to filter posts from Google.
>
> -Wolfgang


Refresh your meds.
--
I will not see posts from Google because I must filter them as spam
 