Something new for the experts--Demosaicing and noise

 
 
jpc
      11-06-2004
I'm seeing perhaps a factor of 2 more noise in the individual RGB
channels than I'm seeing in the combined RGB image. This was measured
by running a line profile on images with clean blue sky (no traces of
clouds), checking that there wasn't a slope to the profile plot, and
then calculating the standard deviation.
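For anyone who wants to repeat the measurement, a rough numpy sketch of that procedure looks something like this (the image array, row number and column range below are placeholders, not values from my data):

    import numpy as np

    # Rough sketch of the measurement: take a horizontal line profile through
    # clear sky, check that it has no slope, then compute the standard
    # deviation per channel and for the combined (channel-averaged) signal.
    # 'img', the row index and the column range are placeholders.
    img = np.random.default_rng(0).normal(120.0, 3.0, size=(480, 640, 3))

    row, c0, c1 = 100, 50, 550
    profile = img[row, c0:c1, :].astype(float)

    x = np.arange(profile.shape[0])
    for ch, name in enumerate("RGB"):
        slope = np.polyfit(x, profile[:, ch], 1)[0]
        print(f"{name}: slope {slope:+.4f} DN/px, std {profile[:, ch].std(ddof=1):.2f} DN")

    combined = profile.mean(axis=1)
    print(f"combined: std {combined.std(ddof=1):.2f} DN")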

This seems to indicate that the demosaicing firmware in my camera is
taking the four independent measurements of the RGGB sensors--or in my
case the CMYG sensors--and signal-averaging them to improve the noise
in the resulting image by a factor of two.

Does anyone know of a site that explains the theory behind this
effect? I can't remember seeing this discussed in RPD before and a
google search--keywords demosaic and noise--didn't turn up an
explanation.

jpc
 
 
 
 
 
usenet@imagenoir.com
      11-06-2004
Kibo informs me that jpc <> stated that:

>I'm seeing perhaps a factor of 2 more noise in the individual RGB
>channels than I'm seeing in the combined RGB image.


What do you mean by 'combined RGB image'? Monochrome perhaps?

>Does anyone know of a site that explains the theory behind this
>effect? I can't remember seeing this discussed in RPD before and a
>google search--keywords demosaic and noise--didn't turn up an
>explanation.


It has been discussed - briefly - here before, but I don't remember
enough detail to Google for it.
It was one of the very few informative posts in one of the Preddiot
troll threads.

--
W
. | ,. w , "Some people are alive only because
\|/ \|/ it is illegal to kill them." Perna condita delenda est
---^----^---------------------------------------------------------------
 
 
 
 
 
jpc
      11-06-2004
On Sun, 07 Nov 2004 00:49:07 +1100, (E-Mail Removed) wrote:

>Kibo informs me that jpc <> stated that:
>
>>I'm seeing perhaps a factor of 2 more noise in the individual RGB
>>channels than I'm seeing in the combined RGB image.

>
>What do you mean by 'combined RGB image'? Monochrome perhaps?


No. The color JPEG image straight from my camera--an Oly 3020--using
the default camera settings.
>
>>Does anyone know of a site that explains the theory behind this
>>effect? I can't remember seeing this discussed in RPD before and a
>>google search--keywords demosaic and noise--didn't turn up an
>>explanation.

>
>It has been discussed - briefly - here before, but I don't remember
>enough detail to Google for it.
>It was one of the very few informative posts in one of the Preddiot
>troll threads.


I've been more or less ignoring these posts for a while. Maybe someone
else will remember.

jpc

 
 
BillyJoeJimBob
      11-06-2004
jpc wrote:

>
> Does anyone know of a site that explains the theory behind this
> effect? I can't remember seeing this discussed in RPD before and a
> google search--keywords demosaic and noise--didn't turn up an
> explanation.


I can't point you to a site, but you might want to search for
things like uncorrelated noise, Poisson-distributed noise, and
additive noise. Things are likely to get a bit mathematically
intense for a proper treatment of what you're seeing. To put it
simply, for essentially random noise the signal-to-noise ratio
increases with the square root of the signal.

If you average four channels (or add together four separate
pictures of the same scene), you get a factor-of-two increase
in your signal-to-noise ratio; when you rescale the result to
fall within normal 8-bit RGB values, you see an apparent
factor-of-two drop in the noise. If you added sixteen images
together (or averaged a 4x4 area of pixels), the apparent
noise should drop by a factor of four.

Astro-imagers use this technique as part of their workflow to
get low noise images of stars, planets, etc. while avoiding
overfilling their sensors' electron wells.
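A quick numerical check of that square-root rule, if you want to convince yourself (numpy; the signal level and noise figure are made up):

    import numpy as np

    # Averaging N noisy copies of the same flat signal shrinks the standard
    # deviation by sqrt(N). Signal level and noise are invented for the demo.
    rng = np.random.default_rng(1)
    signal, sigma = 100.0, 10.0

    for n in (1, 4, 16):
        frames = signal + rng.normal(0.0, sigma, size=(n, 100_000))
        stacked = frames.mean(axis=0)
        print(f"N={n:2d}: measured std {stacked.std(ddof=1):5.2f}, "
              f"expected {sigma / np.sqrt(n):5.2f}")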

BJJB

 
 
jpc
      11-06-2004
On Sat, 06 Nov 2004 09:36:54 -0500, BillyJoeJimBob
<(E-Mail Removed)> wrote:

>jpc wrote:
>
>>
>> Does anyone know of a site that explains the theory behind this
>> effect? I can't remember seeing this discussed in RPD before and a
>> google search--keywords demosaic and noise--didn't turn up an
>> explanation.

>
>I can't point you to a site, but you might want to search for
>things like uncorrelated noise, Poisson-distributed noise,
>additive noise. Things are likely to get a bit mathematically
>intense for a proper treatment of what you're seeing. To
>put it simply, for essentially random noise the signal to
>noise ratio increases with the square root of the signal.
>
>You average four channels (or add together four separate
>pictures of the same scene), you get a factor of two
>increase in your signal to noise ratio... when you rescale
>your signal to fall within normal 8-bit RGB values, you see
>an apparent factor of two drop in the noise. If you added
>sixteen images together (or averaged a 4x4 area of pixels
>together), the apparent noise should drop by a factor of four.
>
>Astro-imagers use this technique as part of their workflow to
>get low noise images of stars, planets, etc. while avoiding
>overfilling their sensors' electron wells.
>
>BJJB



I agree with everything you are saying but let me expand on what I'm
doing so I can narrow down and explain my question/problem a bit more
clearly.

Recently, and for the third time over the last couple of years, I've
tried to work out a simple method of determining the well depth of my
camera. This time I created what Kodak, in one of their app notes,
calls a photon response curve. This is a plot of noise vs. illumination
on a bare CCD--Kodak's version--or noise vs. corrected A/D units in the
raw data out of my camera--my version.

While there is some complexity, and possibility for error, in
determining exactly how to measure and then correct the A/D units, I
did come up with a plot that looked right. Because of Poisson noise,
the noise peaks just before the sensor saturates and then falls off
as expected until the readout noise begins to dominate in the
shadows. So far so good.
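In outline, the analysis looks something like this (numpy; the gain and exposure levels are invented, and this is just my shorthand for the method, not Kodak's exact recipe):

    import numpy as np

    # Sketch of a photon-response (photon-transfer) analysis: for uniform
    # patches at several exposures, compare noise with signal. In the
    # photon-noise-limited region, variance (in A/D units) is proportional
    # to the mean, and gain (electrons per A/D unit) ~ mean / variance.
    # The gain and exposure levels here are invented for illustration.
    rng = np.random.default_rng(2)
    gain = 4.0  # e- per A/D unit, made up
    for dn in (50, 200, 800, 3200):
        patch = rng.poisson(lam=dn * gain, size=(100, 100)) / gain
        mean, var = patch.mean(), patch.var(ddof=1)
        print(f"mean {mean:7.1f} ADU, noise {np.sqrt(var):6.2f} ADU, "
              f"gain estimate {mean / var:4.2f} e-/ADU")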

From what little info is available in CCD data sheets, I came up with
a rule of thumb that a very good sensor can hold 1250 photoelectrons
per sq micron of silicon. With my camera that converts to a
respectable and reasonable 16,000 photoelectrons per sensor element.

The problem comes when I try to calculate the well depth--which is
approximately equal to the square of the S/N at saturation--from my
data. There I come up with a well depth of 63,000 photoelectrons,
four times greater than expected.
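For anyone following along, the square-of-S/N relation and the factor-of-four discrepancy work out like this (the 126 figure is simply the S/N that would correspond to 16,000 electrons, not a measurement):

    # Why well depth ~ (S/N at saturation)^2, assuming pure photon (Poisson)
    # noise: at full well the signal is N_full electrons and the noise is
    # sqrt(N_full), so S/N = N_full / sqrt(N_full) = sqrt(N_full).
    # An S/N of ~126 at saturation therefore implies ~16,000 electrons;
    # a factor-of-two lower apparent noise doubles the S/N and inflates the
    # inferred well depth by a factor of four (~63,000 electrons).
    snr_at_saturation = 126.0      # hypothetical value, not a measurement
    print(snr_at_saturation ** 2)  # ~16,000 e-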

So either I have a very unusual sensor in my four-year-old
prosumer--unlikely--or there is another layer of complexity in my
experiment that I didn't consider.

So if demosaicing is really a fancy form of signal averaging that
creates the equivalent noise of a virtual sensor the size of a color
filter square, then everything falls together nicely.

I can close the notebook with a smile and go off to find another
experiment that will cause my kids to roll their eyes skyward whenever
I back them into a corner and force them to listen to what daddy is
doing.

jpc







 
 
Bart van der Wolf
      11-06-2004

<jpc> wrote in message
news:(E-Mail Removed)...
> I'm seeing perhaps a factor of 2 more noise in the individual
> RGB channels than I'm seeing in the combined RGB image.
> This was measured by running a line profile on images with
> clean blue sky (no traces of clouds), checking that there wasn't
> a slope to the profile plot, and then calculating the standard
> deviation.


Adding several noisy channels will, assuming the noise is uncorrelated
and random (approximately Gaussian), partially cancel the noise. The
Red, Green and Blue channels will have different amounts (standard
deviations) of noise, and the luminance contribution (to which our eyes
are most sensitive) is very roughly 30/60/10 for the R/G/B channels. So
luminance noise is a weighted combination of the channel variances, and
chrominance noise has different visibility depending on the color.
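As a rough sketch of that weighted combination (numpy; the 0.3/0.6/0.1 weights are only the approximation above, and the per-channel noise figures are invented):

    import numpy as np

    # Propagating uncorrelated channel noise into a luminance-like channel
    # Y = 0.3*R + 0.6*G + 0.1*B: the output variance is the sum of the
    # squared weights times the channel variances. Channel sigmas are made up.
    w = np.array([0.3, 0.6, 0.1])
    sigma_rgb = np.array([4.0, 3.0, 6.0])   # per-channel noise in ADU, invented

    sigma_y = np.sqrt(np.sum((w * sigma_rgb) ** 2))
    print(f"luminance noise ~ {sigma_y:.2f} ADU, "
          f"vs per-channel noise of {sigma_rgb}")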

> This seems to indicate that the demosaicing firmware in my
> camera is taking the four independent measurements of RGGB
> sensors--or in my case the CMYG sensors--and signal averaging
> them to improve the noise in the resulting image by a factor of two.


No, that's not how demosaicing is performed. There is no such thing as
RGGB averaging of four sensors. The data for each single color-filtered
sensor gets complemented for the missing colors by sophisticated
weighting of many surrounding sensor measurements. This results in a
matrix with RGB values for each output pixel, of which one channel
value was sampled and the other two are reconstructed.
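To make "weighting of many surrounding sensor measurements" concrete, here is the simplest possible scheme, plain bilinear interpolation of an RGGB mosaic (numpy/scipy). It is only an illustration; real firmware uses far more elaborate, usually edge-directed, weighting:

    import numpy as np
    from scipy.ndimage import convolve

    # Deliberately simple bilinear demosaic of an RGGB Bayer mosaic: each
    # missing color at a pixel is filled in from the nearest sensors of that
    # color. Real demosaicing algorithms use smarter, edge-aware weights.
    def demosaic_bilinear(mosaic):
        h, w = mosaic.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
        g_mask = 1.0 - r_mask - b_mask

        k_rb = np.array([[0.25, 0.5, 0.25],
                         [0.50, 1.0, 0.50],
                         [0.25, 0.5, 0.25]])
        k_g = np.array([[0.00, 0.25, 0.00],
                        [0.25, 1.00, 0.25],
                        [0.00, 0.25, 0.00]])

        r = convolve(mosaic * r_mask, k_rb, mode='mirror')
        g = convolve(mosaic * g_mask, k_g, mode='mirror')
        b = convolve(mosaic * b_mask, k_rb, mode='mirror')
        return np.dstack([r, g, b])

    # Smoke test: a flat grey mosaic comes back as flat grey in all channels.
    print(demosaic_bilinear(np.full((8, 8), 128.0)).mean(axis=(0, 1)))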

> Does anyone know of a site that explains the theory behind this
> effect?


It's just statistics. Averaging three random Gaussian noise samples of
equal weight will reduce the noise by a factor of SQRT(1/3). If the
weights are different, the calculation becomes a bit more complicated.

Bart

 
 
BillyJoeJimBob
      11-06-2004
jpc wrote:
>
> I agree with everything you are saying but let me expand on what I'm
> doing so I can narrow down and explain my question/problem a bit more
> clearly.
>
> Recently and for the third time over the last couple years I've tried
> to work out a simple method of determining the well depth of my
> camera. This time, I created what Kodak calls a photon response curve
> in one of their app notes. This is a plot of noise vs illumination on
> a bare CCD--kodak's version--or noise vs corrected A/D units in the
> raw data out of my camera--my version.


Okay, you mention "raw data" from your camera. Does your camera
provide a RAW type format that simply dumps the individual sensor
A/D counts, or are you talking about something where each pixel in
the "raw data" has already been demosaiced?

[snip of disparity between predicted and experimental results]

> So if demosaicing is really a fancy form of signal averaging that will
> create the equivelent noise of a virtual sensor the size of a color
> filter square, then everything fall together nicely.


I did a search for "bayer", "demosaic", and "procedure", and I
got the following .pdf file as a hit:

http://ipserv.cse.yzu.edu.tw/iplab/m...emosaicing.pdf

It's a bit technical, but it has additional references listed
which might further explain things. While it doesn't directly
address effects on noise, it seems that you are indeed looking at
a sophisticated form of signal averaging, with some additional post
processing. If you want to put the time and neurons in, you could
probably derive how the noise propagates through the presented
demosaicing procedure.

BJJB
 
 
jpc
      11-07-2004
On Sat, 06 Nov 2004 17:20:00 -0500, BillyJoeJimBob
<(E-Mail Removed)> wrote:

>jpc wrote:
>>
>> I agree with everything you are saying but let me expand on what I'm
>> doing so I can narrow down and explain my question/problem a bit more
>> clearly.
>>
>> Recently and for the third time over the last couple years I've tried
>> to work out a simple method of determining the well depth of my
>> camera. This time, I created what Kodak calls a photon response curve
>> in one of their app notes. This is a plot of noise vs illumination on
>> a bare CCD--kodak's version--or noise vs corrected A/D units in the
>> raw data out of my camera--my version.

>
>Okay, you mention "raw data" from your camera. Does your camera
>provide a RAW type format that simply dumps the individual sensor
>A/D counts, or are you talking about something where each pixel in
>the "raw data" has already been demosaiced?


Here's where things become complicated. The camera--an Oly
3020--doesn't have an official raw mode. However, the sensor, A/D and
control chip are identical to the ones used in the Nikon Coolpix 990.
Moreover, a Russian hacker has come up with a procedure to create raw
files that works on both cameras.

To further complicate matters, the cameras have CMYG color filters. So
the hacker also wrote a DOS program that takes the CMYG data and turns
it into a Nikon NEF file, which can be processed in Photoshop. As you
can see, any of these steps may have had some effect on my noise
numbers.

As for my photon response curve, I photographed an evenly illuminated
background through a Kodak step tablet #2--a strip of 21 neutral
density filters, if you aren't familiar with the product. After using
Photoshop CS on the NEF file, I converted the image to LAB mode and
then did my data reduction in ImageJ on an 8-bit greyscale image of
the L channel.

What I intend to do next is take some slightly better data, process
it several different ways, and see what I come up with. Hence this
post: an attempt to learn more about the theory so I have a better
idea of what to look for.


>
>[snip of disparity between predicted and experimental results]
>
>> So if demosaicing is really a fancy form of signal averaging that will
>> create the equivelent noise of a virtual sensor the size of a color
>> filter square, then everything fall together nicely.

>
>I did a search for "bayer", "demosaic", and "procedure", and I
>got the following .pdf file as a hit:
>
>http://ipserv.cse.yzu.edu.tw/iplab/m...emosaicing.pdf
>
>It's a bit technical, but it has additional references listed
>which might further explain things. While it doesn't directly
>address effects on noise, it seems that you are indeed looking at
>a sophisticated form of signal averaging, with some additional post
>processing. If you want to put the time and neurons in, you could
>probably derive how the noise propagates through the presented
>demosaicing procedure.


Thanks. I took a quick look and fear that paper might be too much for
my math-challenged brain. But I do have access to a university library
and will use the references to see if I can find something slightly
less challenging.

jpc

>
>BJJB


 
 
jpc
      11-07-2004
On Sat, 6 Nov 2004 19:53:16 +0100, "Bart van der Wolf"
<(E-Mail Removed)> wrote:

Thanks for the post. It set me thinking. If you could, please review
what I've come up with and comment on whether you agree or disagree.

><jpc> wrote in message
>news:(E-Mail Removed).. .
>> I'm seeing perhaps a factor of 2 more noise in the individual
>> RGB channels than I'm seeing in the combined RGB image.
>> This was measured by running a line profile on images with
>> clean blue sky (no traces of clouds), checking that there wasn't
>> a slope to the profile plot, and then calculating the standard
>> deviation.

>
>Adding several noisy channels will, assuming the noise is uncorrelated
>and random (approx. Gaussian), cancel out somewhat. The Red, Green and
>Blue channels will have different amounts (standard deviation) of
>noise, and the Luminance (for which our eyes are most sensitive)
>contribution is very roughly 30/60/10 for the RGB channels. So
>Luminance noise is a weighted average of channel variance, and
>chrominance noise has different visibility depending on the color.


Since the camera I'm using is clearly photon-noise limited from, say,
neutral grey to saturation, the noise in an individual channel should
be proportional to the square root of the illumination received.

>
>> This seems to indicate that the demosaicing firmware in my
>> camera is taking the four independent measurements of RGGB
>> sensors--or in my case the CMYG sensors--and signal averaging
>> them to improve the noise in the resulting image by a factor of two.

>
>No, that's not how demosaicing is performed. There is no such thing as
>RGGB averaging of four sensors.


My poor choice of wording. I didn't mean to say there was RGGB
averaging, but rather that the effect of demosaicing may be similar
to RGGB averaging.


>The data for each single color
>filtered sensor, gets complemented for the missing colors by
>sophisticated weighting of many surrounding sensor measurements. This
>will result in a matrix with RGB values for each output pixel, of
>which one channel value was sampled and the other two are
>reconstructed.


This is where I may be totally off base. I will assume that the
"sophisticated weighting of many surrounding sensor measurements"
involves a series of additions and subtractions similar to the
conversion of my CMYG data into RGB data, and perhaps to the
downsampling algorithms used in Photoshop and other photo editors.
Furthermore, since camera companies are so tight-lipped about the
technical details, there is not much chance of finding out exactly
what is going on short of a firmware dump of the camera control chips
and some serious reverse engineering.

Both downsampling and CMYG-to-RGB conversion have an effect on noise.
In Photoshop, for instance, nearest-neighbor downsampling makes blue-sky
noise worse, with the more sophisticated downsampling techniques doing
a better job. As for the CMYG-to-RGB conversion, I've seen it argued
both ways on the astronomical web sites--that the conversion halves the
noise, or that it doesn't do much, or that it may even make the noise
slightly worse. Anybody know if there is now a consensus?
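For what it's worth, here is how uncorrelated noise propagates through any linear CMYG-to-RGB style conversion (numpy). The matrix is just the textbook complementary-to-primary relation, not whatever the firmware or the NEF converter actually does, and the channel noise figures are invented:

    import numpy as np

    # For a linear conversion rgb = M @ cmyg with uncorrelated channel noise,
    # the output variances are (M**2) @ (input variances). The matrix below
    # is the naive complementary-to-primary relation (C=G+B, M=R+B, Y=R+G,
    # G measured directly), NOT the real firmware/NEF-converter math.
    #              C     M     Y     G
    M = np.array([[-0.5,  0.5,  0.5,  0.0],   # R = (M + Y - C) / 2
                  [ 0.0,  0.0,  0.0,  1.0],   # G taken straight from G sensor
                  [ 0.5,  0.5, -0.5,  0.0]])  # B = (C + M - Y) / 2

    sigma_cmyg = np.array([3.0, 3.0, 3.0, 3.0])   # per-channel noise, invented
    sigma_rgb = np.sqrt(M**2 @ sigma_cmyg**2)
    print(sigma_rgb)   # how much the conversion alone changes the noise

I suspect part of the reason the argument goes both ways is that the complementary filters also pass roughly twice the light of primary filters, so the S/N going into the conversion is different to begin with.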

In any case, since I've seen the effect in my own blue-sky pictures,
something is going on. For the moment, my working hypothesis is that
the noise reduction from demosaicing is strongest on high-intensity
single colors and may fall off when the colors are more mixed.

Maybe that's totally wrong, but once I get my hands on a color target
it will make for some interesting experiments.

Any comments?

jpc






>
>> Does anyone know of a site that explains the theory behind this
>> effect?

>
>It's just statistics. Adding three random Gaussian noise samples of
>equal weight, will reduce the noise by a factor of SQRT(1/3). If the
>weights are different, the calculation becomes a bit more complicated.
>
> Bart


 
 
E. Magnuson
      11-07-2004
On 2004-11-07, jpc <> wrote:
> Here's where things become complicated. The camera--a Oly
> 3020--doesn't have an official raw mode. However the sensor, A/D and
> control chip are identical to the ones used on the Nikon coolpic 990.
> [...]
> To further complicate matters the cameras have CMYG color filters.


That's odd. Sites like dpreview list the 3020 as using RGB
(http://www.dpreview.com/reviews/spec...oly_c3020z.asp)
Sony makes both RGB (ICX252AQ) and CYMG (ICX252AK) versions of that
sensor. While Nikon used the CYMG versions, I thought that Oly always
used the RGB versions (and the same applied to older cameras like the 950
vs. 2020).

--
Erik
 