Velocity Reviews - Computer Hardware Reviews



Re: 35mm scan quality Vs Digital

 
 
Terje Mathisen
Guest
Posts: n/a
 
      09-02-2003
Robert Lynch wrote:

> "Roger N. Clark" <(E-Mail Removed)> wrote in message
> news:(E-Mail Removed)...
>
>>Well, a 6-megapixel camera only needs 6 million colors!
>>So having 16-million colors means 10 million aren't used.

>
>
> This is one of the dumbest statements that I have seen here in a while.


Besides being obviously true, you mean?

The key point RNC skipped (but definitely not because he forgot/didn't
know about it) is that you sometimes would like to be able to select
exactly _which_ 6 million colors to use.

If it is a photo with a _lot_ of finely graduated blues from a big sky,
then you might very well need more than 8 bits in the blue channel to
avoid any banding.
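The banding Terje describes is easy to sketch numerically. A toy Python example, assuming an idealized linear encoding and an invented sky that spans only the 0.70-0.80 slice of the blue channel (the specific numbers are illustrative, not from any real camera):

```python
# A smooth sky gradient covering a narrow brightness range collapses to only
# a handful of distinct levels when quantized to 8 bits, but not at 12 bits.

def quantize(value, bits):
    """Quantize a 0.0-1.0 intensity to an integer code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels)

width = 1920  # one scanline across the frame
gradient = [0.70 + 0.10 * x / (width - 1) for x in range(width)]

codes_8 = {quantize(v, 8) for v in gradient}
codes_12 = {quantize(v, 12) for v in gradient}

# 8 bits leaves only ~27 distinct steps across the whole sky (visible bands);
# 12 bits leaves ~400, fine enough that stepping disappears.
print(len(codes_8), len(codes_12))
```

The ratio is the point: the same 0.1-wide tonal range gets roughly 16x more gradations at 12 bits per channel.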

Terje

--
- <(E-Mail Removed)>
"almost all programming can be viewed as an exercise in caching"

 
wally
Guest
Posts: n/a
 
      09-02-2003
In article <(E-Mail Removed)>, Rafe B. <(E-Mail Removed)> wrote:
>On Tue, 02 Sep 2003 01:11:44 GMT, Larry Caldwell <(E-Mail Removed)>
>wrote:
>
>>(E-Mail Removed) (Rafe B.) writes:
>>> On Mon, 1 Sep 2003 14:40:03 -0400, "Joseph Brown"
>>> <(E-Mail Removed)> wrote:
>>>
>>> >> 48-bit image capture is "the key" to good color.
>>> >> What's odd is that even world-class experts disagree
>>> >
>>> >Why would somebody disagree that 48-bit capture provides better color than a
>>> >24-bit?

>>
>>> Because it simply isn't true.

>>
>>24 bit color is only 8 bit (256 colors) per channel. In combination, the
>>three channels give you millions of colors, most of which do not occur in
>>nature and are totally useless. Going to 16 bits per channel gives much
>>better results.

>
>Prove it. Lots of folks have tried.
>Look up Dan Margulis (a world class expert on
>Photoshop) -- he's not buying it.
>


Didn't Dan pay off $100 in his 16-bit challenge last year?

I seem to recall a web site showing the "winners".

"Better" is subject to artistic interpretation and intent, but that's the
point: with 16 bits you have the most to work with to get the results you
want. What makes it difficult to show that 16 bits/channel is "better" is
that the monitors and printers in common use can only handle 8-bit/channel
data, so you can only ever see the results in 24-bit color. Most output
devices don't even have the full 24-bit gamut.
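wally's point can be sketched in a few lines of Python. Assume an idealized linear pipeline and an invented edit (a 4x shadow brighten); both pipelines end in the same 8-bit display, but the 16-bit working space delivers far more distinct tones to it:

```python
# A dark gradient is brightened 4x, once in an 8-bit pipeline and once in a
# 16-bit pipeline, then both are shown on an 8-bit display. The numbers are
# illustrative, not from any real camera or editor.

def capture(value, bits):
    levels = (1 << bits) - 1
    return round(value * levels) / levels  # quantize, then back to 0.0-1.0

width = 1000
shadows = [0.10 * x / (width - 1) for x in range(width)]  # deep-shadow gradient

display_from_8 = {round(min(capture(v, 8) * 4, 1.0) * 255) for v in shadows}
display_from_16 = {round(min(capture(v, 16) * 4, 1.0) * 255) for v in shadows}

# The 8-bit pipeline can only emit codes that survived the first
# quantization, so the brightened image skips most display levels.
print(len(display_from_8), len(display_from_16))
```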

--wally.
 
David Dyer-Bennet
Guest
Posts: n/a
 
      09-02-2003
"Roger N. Clark" <(E-Mail Removed)> writes:

> Larry Caldwell wrote:
>
> > (E-Mail Removed) (Rafe B.) writes:
> > > On Mon, 1 Sep 2003 14:40:03 -0400, "Joseph Brown"
> > > <(E-Mail Removed)> wrote:
> > >
> > > >> 48-bit image capture is "the key" to good color.
> > > >> What's odd is that even world-class experts disagree
> > > >
> > > >Why would somebody disagree that 48-bit capture provides better color than a
> > > >24-bit?

> >
> > > Because it simply isn't true.

> >
> > 24 bit color is only 8 bit (256 colors) per channel. In combination, the
> > three channels give you millions of colors, most of which do not occur in
> > nature and are totally useless. Going to 16 bits per channel gives much
> > better results. Just about any color scanner does 12 bits per channel,
> > for 36 bit color, which gives 8191 colors per channel and substantially
> > improved results.

>
> Well, a 6-megapixel camera only needs 6 million colors!
> So having 16-million colors means 10 million aren't used.
> Then in any image, I bet no one could tell the difference
> between a 6 million color image versus a 16 million color
> image, let alone the "gazillions" (2.8x10^14) 48 bit gives.


Yes, but you don't know in advance *which* 6 million. If you were
using a paletted system with an optimized palette, you could do that, but
we're *not*; the camera output formats aren't like that.

> The use of high #bits is to bring out tonal detail over
> a large dynamic range, but when you view on screen or
> print, you need relatively few actual colors.
> When you view on screen, with say a 1280x1024 pixel monitor,
> you need only 1.31 million colors.


Remember that the digital camera original is the *input* to the image
processing pipeline. Lots of things that aren't visible when the original
is displayed directly will make a difference after processing.
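A minimal sketch of that last point, with invented values: two deep-shadow tones from a hypothetical 12-bit capture are identical when displayed directly at 8 bits, yet a shadow boost applied to the 12-bit data separates them.

```python
# Two nearby 12-bit shadow levels (values chosen for illustration).
tone_a = 130 / 4095
tone_b = 133 / 4095

direct_a = round(tone_a * 255)   # straight 8-bit display
direct_b = round(tone_b * 255)

boost = lambda v: min(v * 8, 1.0)  # crude 3-stop shadow push on the 12-bit data
boosted_a = round(boost(tone_a) * 255)
boosted_b = round(boost(tone_b) * 255)

print(direct_a == direct_b)      # identical on screen as shot
print(boosted_a == boosted_b)    # distinct after processing
```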
--
David Dyer-Bennet, <(E-Mail Removed)>, <www.dd-b.net/dd-b/>
RKBA: <noguns-nomoney.com> <www.dd-b.net/carry/>
Photos: <dd-b.lighthunters.net> Snapshots: <www.dd-b.net/dd-b/SnapshotAlbum/>
Dragaera mailing lists: <dragaera.info/>
 
Flux
Guest
Posts: n/a
 
      09-02-2003
Larry Caldwell wrote:
> (E-Mail Removed) (Rafe B.) writes:
>
>>On Mon, 1 Sep 2003 14:40:03 -0400, "Joseph Brown"
>><(E-Mail Removed)> wrote:
>>
>>
>>>>48-bit image capture is "the key" to good color.
>>>>What's odd is that even world-class experts disagree
>>>
>>>Why would somebody disagree that 48-bit capture provides better color than a
>>>24-bit?

>
>>Because it simply isn't true.

>
> 24 bit color is only 8 bit (256 colors) per channel.


256 shades of one colour is quite a lot. For display purposes, it is enough:
if you paint two shaded areas of colour with the equivalent of one bit of
difference from a 24-bit palette, most people won't be able to see the
difference between the two shades.

16 bits per channel will put 256 shades of colour between those two 8-bit
shades that most people already can't differentiate anyway.

> In combination, the three channels give you millions of colors, most of which do not occur in
> nature and are totally useless.


Not occurring in nature doesn't make a colour useless.

> Going to 16 bits per channel gives much better results.


Ah, so you can have even more colours that don't occur in nature?

As already posted by some, the only reason a higher number of bits per
channel is desirable is for processing. In digital photography, if you could
capture an image with perfect exposure and balance every time, and then had
no need for any further processing, 8 bits per channel should be
enough for capture.

But of course, for most of us who make less-than-perfect photographs and/or
put their images through heavy post-processing, a few more bits are quite
welcome.


Flux

 
TCS
Guest
Posts: n/a
 
      09-02-2003
On Tue, 02 Sep 2003 22:31:33 GMT, (E-Mail Removed) <(E-Mail Removed)> wrote:
> In message
><(E-Mail Removed) mf625.kaosol.net>,
> TCS <(E-Mail Removed)> wrote:
>
>>On Tue, 02 Sep 2003 08:28:15 +0200, Terje Mathisen <(E-Mail Removed)> wrote:

>
>>>If it is a photo with a _lot_ of finely graduated blues from a big sky,
>>>then you might very well need more than 8 bits in the blue channel to
>>>avoid any banding.

>
>>Sheesh. Another idiot.

>
>>You really shouldn't post when you haven't the slightest clue what you're
>>babbling about.

>
> You shouldn't call someone an idiot without giving an explanation.


There are shades of blue that aren't 100% saturated.


 
JPS@no.komm
Guest
Posts: n/a
 
      09-02-2003
In message <Uj05b.34962$(E-Mail Removed)>,
(E-Mail Removed) (wally) wrote:

>"Better" is subject to artistic interpretation and intent, but that's the
>point, with 16-bits you have the most to work with to get the results you
>want. What makes showing that 16-bits/pixel is "better" difficult is that all
>monitors and printers in common use can only handle 8-bit/pixel data so you
>can only see the results in 24-bit color anyways. Most output devices don't
>have even the full 24-bit gamut.


The biggest benefit comes from manipulation of the data;
8-bits-per-channel dissolves very quickly with any kind of manipulation.
The ugly experience of over-manipulating 8-bit images sets up electric
fences in people's minds, where they begin to believe that you shouldn't
really be able to manipulate data levels to any great extent, and then
pat themselves on the back, saying, "See! 8-bit is all we need."

Digital pictures have not had a big banding problem in the past, because
sensors have been very noisy until recently. I don't know how the other
cameras fare, but the Canon 10D in ISO 100 mode has noise so low that
you can get solid 8-bit-truncated colors over large expanses in
out-of-focus areas.
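How quickly 8-bit data "dissolves" can be sketched with a single round trip, assuming idealized rounding at each edit (the edit itself, a halve-then-double levels move, is invented for illustration):

```python
# Darken by half and brighten back, rounding to the working bit depth at
# each step. A 16-bit pipeline survives the round trip intact, while 8 bits
# permanently discards roughly half the tonal levels (the histogram "comb").

def roundtrip(bits):
    levels = (1 << bits) - 1
    survivors = set()
    for code in range(256):                             # every 8-bit input level
        v = code / 255
        v = round(v * 0.5 * levels) / levels            # darken, rounded to depth
        v = min(round(v * 2.0 * levels) / levels, 1.0)  # brighten back, rounded
        survivors.add(round(v * 255))                   # final 8-bit output
    return len(survivors)

print(roundtrip(8), roundtrip(16))
```

Every further edit in the 8-bit pipeline compounds the loss; the 16-bit pipeline keeps absorbing it.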
--

<>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
John P Sheehy <(E-Mail Removed)>
><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><

 
Roger N. Clark
Guest
Posts: n/a
 
      09-03-2003
Robert Lynch wrote:

> "Roger N. Clark" <(E-Mail Removed)> wrote in message
> news:(E-Mail Removed)...
> > Well, a 6-megapixel camera only needs 6 million colors!
> > So having 16-million colors means 10 million aren't used.

>
> This is one of the dumbest statements that I have seen here in a while.


OK, I challenge anyone to PROVE that a 6 million pixel
image can show more than 6 million colors.

A pixel is a red-green-blue representation of a single
color (or CMYK, etc.). Each pixel has only one
color, whether it be pure red, cyan, flesh, brown, sea-green,
etc. By definition, the number of distinct colors in an image
can at most equal the number of pixels. In practice, many
pixels share the same color, or colors so close they can't
be distinguished by humans, so the effective number
of colors is almost always less than the number of pixels
in the image.
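Roger's counting argument is trivially checkable in code; a toy 100x100 "image" of random 24-bit colors stands in for a real frame:

```python
# An image with N pixels can contain at most N distinct colors, and in
# practice usually fewer (here a few collisions occur even at random).
import random

random.seed(1)
pixels = [(random.randrange(256), random.randrange(256), random.randrange(256))
          for _ in range(100 * 100)]

distinct = len(set(pixels))
print(distinct <= len(pixels))  # always true, by the pigeonhole principle
```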

Roger

 
Bart van der Wolf
Guest
Posts: n/a
 
      09-03-2003

"Roger N. Clark" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
SNIP
> OK, I challenge anyone to PROVE that a 6 million pixel
> image can show more than 6 million colors.


You might want to change "show" into "capture at the same time".

You are correct that the number of pixels limits the 'final' number of
potentially different colors, BUT as a choice from a larger palette. The
larger palette is possibly (2^12)^3 (= 68.7 billion) colors after
demosaicing, but a maximum of 6 million selected ones before. Gamma
adjustment of the linear-gamma capture will compress some colors in the
highlights to the same values.
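Bart's palette arithmetic, spelled out:

```python
# A 12-bit-per-channel capture can address (2^12)^3 possible colors after
# demosaicing, even though a 6-megapixel frame selects only 6 million of
# them at once.
palette = (2 ** 12) ** 3
print(palette)               # 68_719_476_736, i.e. ~68.7 billion
print(palette / 6_000_000)   # the palette outnumbers the pixels ~11,000:1
```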

Bart


 
z
Guest
Posts: n/a
 
      09-03-2003
On Mon, 01 Sep 2003 20:06:56 -0600, "Roger N. Clark"
<(E-Mail Removed)> wrote:

>Larry Caldwell wrote:
>
>
>Many years ago (1970s) when image processing systems (these were the
>high end tens of thousands of dollars for 24-bit) and X-windows
>was just coming out (8-bit color), a group of us did
>some experimenting and at 10 to 12 bits people couldn't
>see the difference between that and full 24bit. Where you
>need more (e.g. 12 bit color, or 4096 colors) is when you
>have smooth gradients, like sky. In complex scenes, your
>eye/brain gets confused and fewer colors are needed.
>For a while, we went with 8-bit systems to save a lot of money.
>Of course now days, that $100,000 24-bit imaging system had less
>capability than the low end PC that someone threw in the trash!
>
>Roger


Roger on that.
In 1970, memory was $30+ per byte, and "fast" memory had a 25-microsecond
cycle time. Just the video memory (a very modest 4 MB) would cost over 100
million dollars, with interleaved addressing to get the speed. And that is
just for the memory. It would take 2000 chassis, or 250 full-size
6-foot racks, just to hold the memory. Now it is just part of one
card that gets thrown away. The underlying manufacturing
technology has changed a lot. A rendering of a picture takes the
blink of an eye (literally). It used to be a four-day batch job for
just one picture. <no bloatware in those days>
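z's back-of-the-envelope, reproduced (the $30/byte figure is z's, not independently verified here):

```python
# 4 MB of video memory at z's quoted 1970 price of roughly $30 per byte.
bytes_needed = 4 * 1024 * 1024     # a modest 4 MB framebuffer
cost = bytes_needed * 30           # dollars
print(cost)                        # ~$126 million: "over 100 million dollars"
```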


 
zuuum
Guest
Posts: n/a
 
      09-04-2003
I think one of the biggest blind spots of people comparing digital imaging
quality to film and film scans is not considering the color depth and
fidelity differences between triple-CCD and interpolated-color (single-CCD)
capture devices. Anyone who has compared a 3-CCD vid cam to a single-CCD
(prosumer) one knows that color saturation and fidelity is one of the first
things you notice.

An 1800x1200 pixel full-color image would need an 1800x1200 RED scan +
1800x1200 GREEN scan + 1800x1200 BLUE scan, times the bit depth of each
color channel. So you can see how the data required for even just 1,000
shades of each color channel adds up rapidly.
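Making that arithmetic concrete, under the assumption of 10 bits per channel (roughly the "1,000 shades" mentioned):

```python
# An 1800x1200 full-color image needs a separate red, green, and blue sample
# per pixel, each at the chosen bit depth.
width, height, channels = 1800, 1200, 3
bits_per_channel = 10                      # ~1,024 shades per channel

total_bits = width * height * channels * bits_per_channel
print(total_bits / 8 / 1024 / 1024)        # ~7.7 MB of raw data per frame
```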

Ever notice that down-sampled digicam images always look better than
full-resolution ones? Probably because interpolated color makes for soft
edge definition.

"DTJ" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
> On Tue, 02 Sep 2003 08:28:15 +0200, Terje Mathisen
> <(E-Mail Removed)> wrote:
>
> >Robert Lynch wrote:
> >
> >> "Roger N. Clark" <(E-Mail Removed)> wrote in message
> >> news:(E-Mail Removed)...
> >>
> >>>Well, a 6-megapixel camera only needs 6 million colors!
> >>>So having 16-million colors means 10 million aren't used.
> >>
> >>
> >> This is one of the dumbest statements that I have seen here in a while.

> >
> >Besides being obviously true, you mean?

>
> How?
>
> You could have a picture taken with a 6-megapixel camera that only
> used a single color. It could use 10, 100, 1000, 10000. It is not
> logical to claim what the OP claimed. It is simply wrong.
>
> However, had he said that AT MOST it could use approximately 6 million
> colors, it would have made more sense.



 