Something new for the experts--Demosaicing and noise

Discussion in 'Digital Photography' started by jpc, Nov 6, 2004.

  1. jpc

    jpc Guest

    I'm seeing perhaps a factor of 2 more noise in the individual RGB
    channels than I'm seeing in the combined RGB image. This was measured
    by running a line profile on images with clean blue sky (no traces of
    clouds), checking that there wasn't a slope to the profile plot, and
    then calculating the standard deviation.
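
    In case it helps, here is roughly what that measurement amounts to,
    as a minimal Python/numpy sketch rather than my actual script (the
    file name and patch coordinates are placeholders, and Pillow is
    assumed for loading the JPEG):

      import numpy as np
      from PIL import Image   # Pillow assumed; any image loader will do

      def noise_of(chan):
          """Std dev of a 2-D patch after removing a linear left-right
          trend -- the same idea as checking the line profile for a
          slope before trusting the number."""
          x = np.arange(chan.shape[1])
          slope, offset = np.polyfit(x, chan.mean(axis=0), 1)
          return (chan - (slope * x + offset)).std()

      # Placeholder file name and coordinates: use a featureless
      # rectangle of clean blue sky from your own camera JPEG.
      img = np.asarray(Image.open("blue_sky.jpg"), dtype=np.float64)
      patch = img[100:150, 200:600, :]            # rows, cols, RGB

      for name, chan in zip("RGB", np.moveaxis(patch, 2, 0)):
          print(name, "noise:", round(noise_of(chan), 2))

      # The "combined RGB image" figure is the same statistic on an
      # equal-weight grey mix of the three channels.
      print("combined:", round(noise_of(patch.mean(axis=2)), 2))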

    This seems to indicate that the demosaicing firmware in my camera is
    taking the four independent measurements of the RGGB sensors--or in
    my case the CMYG sensors--and signal averaging them to improve the
    noise in the resulting image by a factor of two.

    Does anyone know of a site that explains the theory behind this
    effect? I can't remember seeing this discussed in RPD before and a
    google search--keywords demosaic and noise--didn't turn up an
    explanation.

    jpc
     
    jpc, Nov 6, 2004
    #1

  2. jpc

    Guest

    Kibo informs me that jpc <> stated that:

    >I'm seeing perhaps a factor of 2 more noise in the individual RGB
    >channels than I'm seeing in the combined RGB image.


    What do you mean by 'combined RGB image'? Monochrome perhaps?

    >Does anyone know of a site that explains the theory behind this
    >effect? I can't remember seeing this discussed in RPD before and a
    >google search--keywords demosaic and noise--didn't turn up an
    >explanation.


    It has been discussed - briefly - here before, but I don't remember
    enough detail to Google for it.
    It was one of the very few informative posts in one of the Preddiot
    troll threads.

    --
    W
    . | ,. w , "Some people are alive only because
    \|/ \|/ it is illegal to kill them." Perna condita delenda est
    ---^----^---------------------------------------------------------------
     
    , Nov 6, 2004
    #2

  3. jpc

    jpc Guest

    On Sun, 07 Nov 2004 00:49:07 +1100, wrote:

    >Kibo informs me that jpc <> stated that:
    >
    >>I'm seeing perhaps a factor of 2 more noise in the individual RGB
    >>channels than I'm seeing in the combined RGB image.

    >
    >What do you mean by 'combined RGB image'? Monochrome perhaps?


    No. The colored JPEG image straight from my camera--an Oly 3020--using
    the default camera settings.
    >
    >>Does anyone know of a site that explains the theory behind this
    >>effect? I can't remember seeing this discussed in RPD before and a
    >>google search--keywords demosaic and noise--didn't turn up an
    >>explanation.

    >
    >It has been discussed - briefly - here before, but I don't remember
    >enough detail to Google for it.
    >It was one of the very few informative posts in one of the Preddiot
    >troll threads.


    I've been more or less ignoring these posts for a while. Maybe someone
    else will remember.

    jpc
     
    jpc, Nov 6, 2004
    #3
  4. jpc wrote:

    >
    > Does anyone know of a site that explains the theory behind this
    > effect? I can't remember seeing this discussed in RPD before and a
    > google search--keywords demosaic and noise--didn't turn up an
    > explanation.


    I can't point you to a site, but you might want to search for
    things like uncorrelated noise, Poisson-distributed noise,
    additive noise. Things are likely to get a bit mathematically
    intense for a proper treatment of what you're seeing. To
    put it simply, for essentially random noise the signal to
    noise ratio increases with the square root of the signal.

    You average four channels (or add together four separate
    pictures of the same scene), and you get a factor of two
    increase in your signal to noise ratio... when you rescale
    your signal to fall within normal 8-bit RGB values, you see
    an apparent factor of two drop in the noise. If you added
    sixteen images together (or averaged a 4x4 area of pixels
    together), the apparent noise should drop by a factor of four.

    Astro-imagers use this technique as part of their workflow to
    get low noise images of stars, planets, etc. while avoiding
    overfilling their sensors' electron wells.
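
    If you want to see the square-root behavior without the math, a toy
    simulation shows it. This is just an illustrative Python/numpy
    sketch with made-up numbers:

      import numpy as np

      rng = np.random.default_rng(0)
      signal, sigma = 100.0, 10.0        # made-up mean level and noise

      def noise_after_averaging(n_frames, n_pixels=100_000):
          """Noise left after averaging n_frames independent frames."""
          frames = signal + sigma * rng.standard_normal((n_frames, n_pixels))
          return frames.mean(axis=0).std()

      for n in (1, 4, 16):
          print(f"{n:2d} frames: noise ~ {noise_after_averaging(n):5.2f}"
                f"  (expected {sigma / np.sqrt(n):5.2f})")
      # Four frames cut the noise roughly in half, sixteen by a factor
      # of four, matching the square-root rule.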

    BJJB
     
    BillyJoeJimBob, Nov 6, 2004
    #4
  5. jpc

    jpc Guest

    On Sat, 06 Nov 2004 09:36:54 -0500, BillyJoeJimBob
    <> wrote:

    >jpc wrote:
    >
    >>
    >> Does anyone know of a site that explains the theory behind this
    >> effect? I can't remember seeing this discussed in RPD before and a
    >> google search--keywords demosaic and noise--didn't turn up an
    >> explaination.

    >
    >I can't point you to a site, but you might want to search for
    >things like uncorrelated noise, Poisson-distributed noise,
    >additive noise. Things are likely to get a bit mathematically
    >intense for a proper treatment of what you're seeing. To
    >put it simply, for essentially random noise the signal to
    >noise ratio increases with the square root of the signal.
    >
    >You average four channels (or add together four separate
    >pictures of the same scene), and you get a factor of two
    >increase in your signal to noise ratio... when you rescale
    >your signal to fall within normal 8-bit RGB values, you see
    >an apparent factor of two drop in the noise. If you added
    >sixteen images together (or averaged a 4x4 area of pixels
    >together), the apparent noise should drop by a factor of four.
    >
    >Astro-imagers use this technique as part of their workflow to
    >get low noise images of stars, planets, etc. while avoiding
    >overfilling their sensors' electron wells.
    >
    >BJJB



    I agree with everything you are saying but let me expand on what I'm
    doing so I can narrow down and explain my question/problem a bit more
    clearly.

    Recently and for the third time over the last couple years I've tried
    to work out a simple method of determining the well depth of my
    camera. This time, I created what Kodak calls a photon response curve
    in one of their app notes. This is a plot of noise vs illumination on
    a bare CCD--kodak's version--or noise vs corrected A/D units in the
    raw data out of my camera--my version.

    While there is some complexity, and possibility for error, in
    determining exactly how to measure and then correct the A/D units, I
    did come up with a plot that looked right. Because of Poisson noise,
    the noise peaks just before the sensor becomes saturated and then
    falls off as expected until the readout noise begins to dominate in
    the shadows. So far, so good.

    From what little info is available in CCD data sheets, I came up with
    a rule of thumb that a very good sensor can hold 1,250 photoelectrons
    per square micron of silicon. With my camera that converts to a
    respectable and reasonable 16,000 photoelectrons per sensor element.

    The problem is when I try to calculate the well depth--which is
    approximately equal to the square of the S/N at saturation--from my
    data. There I come up with a well depth of 63,000 photoelectrons,
    four times greater than expected.
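
    For what it's worth, the arithmetic behind that estimate is nothing
    more than this (a Python sketch with invented numbers standing in
    for my real measurements):

      import numpy as np

      # Invented example numbers, not my actual data.
      signal_adu = 230.0    # mean level just below saturation (A/D units)
      noise_adu  = 1.8      # std dev at that level, shot-noise limited

      # For pure photon (Poisson) noise, S/N = sqrt(N_electrons), so the
      # number of electrons collected is simply the square of the S/N.
      snr = signal_adu / noise_adu
      print(f"S/N ~ {snr:.0f}, well depth ~ {snr**2:.0f} electrons")

      # Equivalent route via the photon-transfer gain (e- per A/D unit):
      gain = signal_adu / noise_adu**2    # since variance = signal / gain
      print(f"gain ~ {gain:.2f} e-/ADU, "
            f"well depth ~ {gain * signal_adu:.0f} electrons")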

    So either I have a very unusual sensor in my four year old
    prosumer--unlikely--or there is another layer of complexity in my
    experiment that I didn't consider.

    So if demosaicing is really a fancy form of signal averaging that will
    create the equivalent noise of a virtual sensor the size of a color
    filter square, then everything falls together nicely.

    I can close the notebook with a smile and go off to find another
    experiment that will cause my kids to roll their eyes skyward whenever
    I back them into a corner and force them to listen to what daddy is
    doing.

    jpc
     
    jpc, Nov 6, 2004
    #5
  6. <jpc> wrote in message
    news:...
    > I'm seeing perhaps a factor of 2 more noise in the individual
    > RGB channels than I'm seeing in the combined RGB image.
    > This was measured by running a line profile on images with
    > clean blue sky (no traces of clouds), checking that there wasn't
    > a slope to the profile plot, and then calculating the standard
    > deviation.


    Adding several noisy channels will, assuming the noise is uncorrelated
    and random (approx. Gaussian), make the noise partially cancel. The
    Red, Green and Blue channels will have different amounts (standard
    deviation) of noise, and the Luminance (to which our eyes are most
    sensitive) contribution is very roughly 30/60/10 for the RGB channels.
    So Luminance noise is a weighted combination of the channel variances,
    and chrominance noise has different visibility depending on the color.
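
    As a numerical illustration (a Python sketch with assumed per-channel
    noise figures, not measured ones), for uncorrelated channels the
    weighted variances simply add:

      import numpy as np

      # Rough luminance weights (Rec. 601) and assumed channel noise.
      weights = np.array([0.30, 0.59, 0.11])     # R, G, B
      sigma   = np.array([3.0, 2.0, 5.0])        # invented std devs

      # Uncorrelated channels: variances add with squared weights.
      sigma_y = np.sqrt(np.sum((weights * sigma) ** 2))
      print(f"luminance noise ~ {sigma_y:.2f}")

      # Note that the blue channel, although the noisiest here, adds the
      # least to the luminance noise because of its small weight.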

    > This seems to indicate that the demosaicing firmware in my
    > camera is taking the four independent measurements of the RGGB
    > sensors--or in my case the CMYG sensors--and signal averaging
    > them to improve the noise in the resulting image by a factor of two.


    No, that's not how demosaicing is performed. There is no such thing as
    RGGB averaging of four sensors. The data from each single-color
    filtered sensor gets complemented for the missing colors by
    sophisticated weighting of many surrounding sensor measurements. This
    will result in a matrix with RGB values for each output pixel, of
    which one channel value was sampled and the other two are
    reconstructed.
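
    The simplest textbook form of that reconstruction is plain bilinear
    interpolation over the Bayer grid. Real firmware uses far more
    sophisticated, edge-aware weighting, but a rough sketch shows the
    principle (Python, with an RGGB pattern and scipy assumed):

      import numpy as np
      from scipy.ndimage import convolve

      def bilinear_demosaic(raw):
          """Naive bilinear demosaic of an RGGB Bayer mosaic. Each missing
          color is the average of the nearest same-colored neighbours."""
          h, w = raw.shape
          r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
          b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
          g_mask = 1 - r_mask - b_mask

          k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
          k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0

          out = np.empty((h, w, 3))
          out[..., 0] = convolve(raw * r_mask, k_rb)   # red plane
          out[..., 1] = convolve(raw * g_mask, k_g)    # green plane
          out[..., 2] = convolve(raw * b_mask, k_rb)   # blue plane
          return out

      # A flat grey mosaic comes back as a flat grey RGB image.
      rgb = bilinear_demosaic(np.full((8, 8), 100.0))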

    > Does anyone know of a site that explains the theory behind this
    > effect?


    It's just statistics. Averaging three random Gaussian noise samples of
    equal weight will reduce the noise by a factor of SQRT(1/3). If the
    weights are different, the calculation becomes a bit more complicated.

    Bart
     
    Bart van der Wolf, Nov 6, 2004
    #6
  7. jpc wrote:
    >
    > I agree with everything you are saying but let me expand on what I'm
    > doing so I can narrow down and explain my question/problem a bit more
    > clearly.
    >
    > Recently and for the third time over the last couple years I've tried
    > to work out a simple method of determining the well depth of my
    > camera. This time, I created what Kodak calls a photon response curve
    > in one of their app notes. This is a plot of noise vs illumination on
    > a bare CCD--kodak's version--or noise vs corrected A/D units in the
    > raw data out of my camera--my version.


    Okay, you mention "raw data" from your camera. Does your camera
    provide a RAW type format that simply dumps the individual sensor
    A/D counts, or are you talking about something where each pixel in
    the "raw data" has already been demosaiced?

    [snip of disparity between predicted and experimental results]

    > So if demosaicing is really a fancy form of signal averaging that will
    > create the equivalent noise of a virtual sensor the size of a color
    > filter square, then everything falls together nicely.


    I did a search for "bayer", "demosaic", and "procedure", and I
    got the following .pdf file as a hit:

    http://ipserv.cse.yzu.edu.tw/iplab/meeting_paper/911/Demosaicing.pdf

    It's a bit technical, but it has additional references listed
    which might further explain things. While it doesn't directly
    address effects on noise, it seems that you are indeed looking at
    a sophisticated form of signal averaging, with some additional post
    processing. If you want to put the time and neurons in, you could
    probably derive how the noise propagates through the presented
    demosaicing procedure.

    BJJB
     
    BillyJoeJimBob, Nov 6, 2004
    #7
  8. jpc

    jpc Guest

    On Sat, 06 Nov 2004 17:20:00 -0500, BillyJoeJimBob
    <> wrote:

    >jpc wrote:
    >>
    >> I agree with everything you are saying but let me expand on what I'm
    >> doing so I can narrow down and explain my question/problem a bit more
    >> clearly.
    >>
    >> Recently and for the third time over the last couple years I've tried
    >> to work out a simple method of determining the well depth of my
    >> camera. This time, I created what Kodak calls a photon response curve
    >> in one of their app notes. This is a plot of noise vs illumination on
    >> a bare CCD--kodak's version--or noise vs corrected A/D units in the
    >> raw data out of my camera--my version.

    >
    >Okay, you mention "raw data" from your camera. Does your camera
    >provide a RAW type format that simply dumps the individual sensor
    >A/D counts, or are you talking about something where each pixel in
    >the "raw data" has already been demosaiced?


    Here's where things become complicated. The camera--an Oly
    3020--doesn't have an official raw mode. However, the sensor, A/D and
    control chip are identical to the ones used on the Nikon Coolpix 990.
    Moreover, a Russian hacker has come up with a procedure to create raw
    files that works on both cameras.

    To further complicate matters the cameras have CMYG color filters. So
    the hacker also wrote a DOS program that takes the CMYG data and turns
    it into a Nikon NEF file, which can be processed in Photoshop. So, as
    you can see, any of these steps may have had some effect on my noise
    numbers.

    As for my photon response curve, I photographed an evenly illuminated
    background through a Kodak step tablet #2--a strip of 21 neutral
    density filters, if you aren't familiar with the product. After using
    Photoshop CS on the NEF file, I converted the image to LAB mode and
    then did my data reduction in ImageJ on an 8-bit greyscale image of
    the L channel.
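
    The data reduction itself is nothing fancy: for each step of the
    tablet I take the mean and the standard deviation of a small patch.
    I actually did that part in ImageJ, but in Python/numpy terms it
    amounts to something like this (the patch rectangles below are
    placeholders):

      import numpy as np

      def photon_response_points(lum, step_boxes):
          """Mean signal and noise for each step of the wedge. lum is the
          2-D luminance image; step_boxes are (r0, r1, c0, c1) rectangles
          placed inside each density step."""
          points = []
          for r0, r1, c0, c1 in step_boxes:
              patch = lum[r0:r1, c0:c1].astype(np.float64)
              points.append((patch.mean(), patch.std()))
          return points   # plot noise against signal to get the curve

      # e.g. boxes = [(10, 40, 20 + 50*i, 60 + 50*i) for i in range(21)]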

    What I intend to do next is to take some slightly better data, process
    it several different ways and see what I come up with. Hence this post,
    an attempt to learn more about the theory so I have a better idea of
    what to look for.


    >
    >[snip of disparity between predicted and experimental results]
    >
    >> So if demosaicing is really a fancy form of signal averaging that will
    >> create the equivalent noise of a virtual sensor the size of a color
    >> filter square, then everything falls together nicely.

    >
    >I did a search for "bayer", "demosaic", and "procedure", and I
    >got the following .pdf file as a hit:
    >
    >http://ipserv.cse.yzu.edu.tw/iplab/meeting_paper/911/Demosaicing.pdf
    >
    >It's a bit technical, but it has additional references listed
    >which might further explain things. While it doesn't directly
    >address effects on noise, it seems that you are indeed looking at
    >a sophisticated form of signal averaging, with some additional post
    >processing. If you want to put the time and neurons in, you could
    >probably derive how the noise propagates through the presented
    >demosaicing procedure.


    Thanks. I took a quick look and fear that paper might be too much for
    my math-challenged brain. But I do have access to a university library
    and will use the references to see if I can find something slightly
    less challenging.

    jpc

    >
    >BJJB
     
    jpc, Nov 7, 2004
    #8
  9. jpc

    jpc Guest

    On Sat, 6 Nov 2004 19:53:16 +0100, "Bart van der Wolf"
    <> wrote:

    Thanks for the post. It set me thinking. If you could, review what
    I've come up with and comment on whether you agree or disagree.

    ><jpc> wrote in message
    >news:...
    >> I'm seeing perhaps a factor of 2 more noise in the individual
    >> RGB channels than I'm seeing in the combined RGB image.
    >> This was measured by running a line profile on images with
    >> clean blue sky (no traces of clouds), checking that there wasn't
    >> a slope to the profile plot, and then calculating the standard
    >> deviation.

    >
    >Adding several noisy channels will, assuming the noise is uncorrelated
    >and random (approx. Gaussian), make the noise partially cancel. The
    >Red, Green and Blue channels will have different amounts (standard
    >deviation) of noise, and the Luminance (to which our eyes are most
    >sensitive) contribution is very roughly 30/60/10 for the RGB channels.
    >So Luminance noise is a weighted combination of the channel variances,
    >and chrominance noise has different visibility depending on the color.


    Since the camera I'm using is clearly photon-noise limited from, say,
    neutral grey to saturation, the noise in an individual channel should
    be proportional to the square root of the illumination received.
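
    That square-root behavior is easy to check numerically; a throwaway
    Python sketch with made-up exposure levels behaves exactly as
    expected:

      import numpy as np

      rng = np.random.default_rng(1)

      # Simulated photon-noise-limited exposures at made-up mean
      # electron counts, from mid-grey up toward saturation.
      for electrons in (1000, 4000, 16000):
          counts = rng.poisson(electrons, size=100_000)
          print(f"mean {electrons:6d} e-: noise {counts.std():6.1f}"
                f"  (sqrt of mean = {np.sqrt(electrons):6.1f})")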

    >
    >> This seems to indicate that the demosaicing firmware in my
    >> camera is taking the four independent measurements of the RGGB
    >> sensors--or in my case the CMYG sensors--and signal averaging
    >> them to improve the noise in the resulting image by a factor of two.

    >
    >No, that's not how demosaicing is performed. There is no such thing as
    >RGGB averaging of four sensors.


    My poor choice of wording. I didn't mean to say there was RGGB
    averaging, but rather that the effect of demosaicing may be
    similar to RGGB averaging.


    >The data from each single-color
    >filtered sensor gets complemented for the missing colors by
    >sophisticated weighting of many surrounding sensor measurements. This
    >will result in a matrix with RGB values for each output pixel, of
    >which one channel value was sampled and the other two are
    >reconstructed.


    This is where I may be totally off base. I will assume that the
    "sophisticated weighting of many surrounding sensor measurements"
    involves a series of additions and subtractions similar to the
    conversion of my CMYG data into RGB data, and perhaps to the
    downsampling algorithms used in Photoshop and other photo editors.
    Furthermore, since camera companies are so tight-lipped about the
    technical details, there is not much chance of finding out exactly
    what is going on short of a firmware dump of the camera control chips
    and some serious reverse engineering.

    Both downsampling and CMYG-to-RGB conversion have an effect on noise.
    In Photoshop, for instance, nearest-neighbor downsampling makes
    blue-sky noise worse, with the more sophisticated downsampling
    techniques doing a better job. As for the CMYG-to-RGB conversion,
    I've seen it argued both ways on the astronomical web sites--that the
    conversion halves the noise, or that it doesn't do much, or that it
    may even make the noise slightly worse. Anybody know if there is now
    a consensus?
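
    One test I have in mind is to push synthetic noise through a color
    conversion and see what comes out, roughly like the Python sketch
    below. The matrix is a generic complementary-to-primary
    approximation, not whatever Olympus actually uses:

      import numpy as np

      rng = np.random.default_rng(2)

      # A flat mid-grey scene expressed in linear RGB.
      R, G, B = 100.0, 100.0, 100.0
      true = np.array([G + B, R + B, R + G, G])   # ideal Cy, Mg, Ye, G

      # Assumed additive noise, equal on every filtered photosite.
      sigma = 2.0
      cmyg = true + sigma * rng.standard_normal((100_000, 4))

      # Generic conversion: R = (Mg+Ye-Cy)/2, B = (Cy+Mg-Ye)/2, and the
      # derived G' = (Cy+Ye-Mg)/2 averaged with the measured G.
      M = np.array([[-0.50,  0.50,  0.50, 0.0],   # R
                    [ 0.25, -0.25,  0.25, 0.5],   # G
                    [ 0.50,  0.50, -0.50, 0.0]])  # B
      rgb = cmyg @ M.T

      print("input channel noise:", sigma)
      print("output RGB noise   :", rgb.std(axis=0).round(2))
      # Uncorrelated noise propagates as sqrt(sum of squared weights)
      # per row, so R and B come out around 1.73 and G around 1.32 here,
      # but the outputs are also correlated with each other, which makes
      # the comparison with a native RGB sensor less than clear-cut.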

    In any case, since I've seen the effect in my own blue-sky pictures,
    something is going on. For the moment, my working hypothesis is that
    the noise reduction from demosaicing is strongest on high-intensity
    single colors and may fall off when the colors are more mixed.

    Maybe totally wrong, but once I get my hands on a color target it will
    make for some interesting experiments.

    Any comments?

    jpc






    >
    >> Does anyone know of a site that explains the theory behind this
    >> effect?

    >
    >It's just statistics. Averaging three random Gaussian noise samples of
    >equal weight will reduce the noise by a factor of SQRT(1/3). If the
    >weights are different, the calculation becomes a bit more complicated.
    >
    > Bart
     
    jpc, Nov 7, 2004
    #9
  10. jpc

    E. Magnuson Guest

    On 2004-11-07, jpc <> wrote:
    > Here's where things become complicated. The camera--an Oly
    > 3020--doesn't have an official raw mode. However, the sensor, A/D and
    > control chip are identical to the ones used on the Nikon Coolpix 990.
    > [...]
    > To further complicate matters the cameras have CMYG color filters.


    That's odd. Sites like dpreview list the 3020 as using RGB
    (http://www.dpreview.com/reviews/specs/Olympus/oly_c3020z.asp).
    Sony makes both RGB (ICX252AQ) and CYMG (ICX252AK) versions of that
    sensor. While Nikon used the CYMG versions, I thought that Oly always
    used the RGB versions (and the same applied to older cameras like the
    950 vs. 2020).

    --
    Erik
     
    E. Magnuson, Nov 7, 2004
    #10
  11. jpc

    jpc Guest

    On Sun, 07 Nov 2004 17:03:34 GMT, "E. Magnuson" <>
    wrote:

    >On 2004-11-07, jpc <> wrote:
    >> Here's where things become complicated. The camera--an Oly
    >> 3020--doesn't have an official raw mode. However, the sensor, A/D and
    >> control chip are identical to the ones used on the Nikon Coolpix 990.
    >> [...]
    >> To further complicate matters the cameras have CMYG color filters.

    >
    >That's odd. Sites like dpreview list the 3020 as using RGB
    >(http://www.dpreview.com/reviews/specs/Olympus/oly_c3020z.asp).
    >Sony makes both RGB (ICX252AQ) and CYMG (ICX252AK) versions of that
    >sensor. While Nikon used the CYMG versions, I thought that Oly always
    >used the RGB versions (and the same applied to older cameras like the
    >950 vs. 2020).


    Thanks for identifying the chips used. I've been looking for that
    information for a while.

    <mini rant> As for what is reported on the review sites--like
    everything else on the internet, the info can be wrong. Take
    dpreview's noise figures, for instance. Anyone who thinks a camera has
    "a" noise spec, and that a noise number taken from a neutral grey card
    will infallibly predict how a camera will perform under extreme
    low-light conditions, doesn't understand the difference between
    readout noise (more or less constant) and photon noise (far less
    constant). They also don't understand how the noise-reduction firmware
    on some noisy and expensive high-pixel-count cameras has been tailored
    to make grey-card noise look better than it is. <end mini rant>

    Anyway, the 3020 does have a CYMG sensor, as do the Oly 4040, 4000,
    2020 and maybe others. If you have any of these cameras and want to
    know how to use their unofficial raw mode, the information is posted
    on the Yahoo 405080 camera group's new web site--www.myolympus.org.

    jpc
     
    jpc, Nov 7, 2004
    #11
  12. <jpc> wrote in message
    news:...
    > On Sat, 6 Nov 2004 19:53:16 +0100, "Bart van der Wolf"

    SNIP
    > Since the camera I'm using is clearly photon-noise limited
    > from, say, neutral grey to saturation, the noise in an individual
    > channel should be proportional to the square root of the
    > illumination received.


    Yes, but it is also mixed with sensor, electronics, and quantization
    noise, which can add to or subtract from the photon noise for each
    individual sensor. But at the higher luminance levels and at a low ISO
    setting, the photon shot noise will dominate in absolute terms (not as
    a percentage).
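
    A compact way to picture that mix (an illustrative Python sketch
    with invented read-noise and gain values, ignoring quantization):

      import numpy as np

      read_noise = 8.0    # electrons, invented
      gain       = 4.0    # electrons per A/D unit, invented

      def total_noise_adu(signal_adu):
          """Read noise and photon shot noise add in quadrature."""
          shot_e = np.sqrt(signal_adu * gain)     # Poisson noise in e-
          return np.sqrt(read_noise**2 + shot_e**2) / gain

      for s in (5, 50, 500, 4000):
          print(f"signal {s:5d} ADU: noise {total_noise_adu(s):6.2f} ADU")
      # At low signal the constant read noise dominates; at high signal
      # the shot noise dominates in absolute terms.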

    SNIP
    > Both downsampling and CMYG-to-RGB conversion have an
    > effect on noise. In Photoshop, for instance, nearest-neighbor
    > downsampling makes blue-sky noise worse, with the more
    > sophisticated downsampling techniques doing a better job.


    Yes, poorly implemented down-sampling methods produce aliasing
    artifacts. Photoshop's better methods are still not very good at
    avoiding that:
    http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm
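
    For the noise side of it, a quick comparison of nearest-neighbor
    decimation against simple 2x2 block averaging makes the point
    (a Python sketch on a synthetic flat "sky"):

      import numpy as np

      rng = np.random.default_rng(3)
      sky = 180.0 + 4.0 * rng.standard_normal((512, 512))  # flat noisy sky

      # Nearest neighbor: keep every second pixel, noise is untouched.
      nearest = sky[::2, ::2]

      # Block average: average each 2x2 block, noise drops by about half.
      block = sky.reshape(256, 2, 256, 2).mean(axis=(1, 3))

      print("original          noise:", round(float(sky.std()), 2))
      print("nearest-neighbor  noise:", round(float(nearest.std()), 2))
      print("2x2 block average noise:", round(float(block.std()), 2))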

    Bart
     
    Bart van der Wolf, Nov 13, 2004
    #12
