maximum dynamic range using RAW

Discussion in 'Digital Photography' started by Frank van der Pol, Dec 21, 2003.

  1. Hi all,

    I wonder if there is any information available about the exact color
    temperature to be used to make maximum use of the full dynamic range
    of the CCD of a digital camera.

    Digital cameras do not have what video cameras do have, which is a
    filter wheel to prefilter the light behind the lens, before it reaches
    the CCD. Afaik a video camera CCD without a filter is adjusted to work
    without filtering at 3200K and a yellow color correction filter (using
    the filter wheel) is used for 5000 K light. (which costs about 2/3 of
    a stop but that doesn't matter much since light level is usually
    higher in daylight).

    In film based photography the use of color correction filters to adapt
    the light to the color temperature of the film is quite common.

    In digital photography however, every correction seems to be done
    after the exposure, using either auto white balance or custom white
    balance.

    I have a feeling that this approach results in not using the full
    dynamic range of each of the sets of cells in the Bayer pattern of the
    CCD.

    Suppose a 5000 K light source would result in filling each of the CCD
    cells exactly to their edge, so to speak. Then you would make use of
    the maximum dynamic range of the CCD. However, if you used 3200 K
    tungsten lighting, you would probably overexpose the red-filtered
    elements of the CCD and therefore not make use of the full
    dynamic range of the green and blue elements. Consequently, the gamut
    of the resulting image will probably be limited, since one of the
    three channels will already be clipping while the other two may not
    even be at 3/4 of their maximum.
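    Frank's clipping scenario can be sketched numerically. The per-channel
    responses below are invented illustrative figures for a red-heavy 3200 K
    source, not measurements of any real sensor:

```python
# Illustrative sketch (hypothetical channel responses, not measured data):
# under warm 3200 K light the red channel fills first, so exposure must
# stop before green and blue ever reach full scale.

FULL_SCALE = 4095  # 12-bit ADC

# assumed relative per-channel response to a white card under tungsten light
response_3200k = {"R": 1.00, "G": 0.62, "B": 0.38}

# expose so the hottest channel just reaches full scale
hottest = max(response_3200k.values())
codes = {ch: round(FULL_SCALE * r / hottest) for ch, r in response_3200k.items()}

for ch, v in codes.items():
    print(f"{ch}: {v:4d} / {FULL_SCALE}  ({v / FULL_SCALE:.0%} of range)")
```

    With these made-up numbers, red clips at 4095 while blue only ever reaches
    about 38% of the converter's range.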

    So what is the color temperature to be used to be sure a CCD is being
    used to its full dynamic range? And would this be different for the
    various CCDs used by camera manufacturers?

    If my theory is correct, one probably should also measure the green
    component in the light used.

    Frank van der Pol, Dec 21, 2003

  2. Frank van der Pol

    Mark Johnson Guest

    Yes, but if you get the pixels, at some intensity, then you can work
    with them. Obviously the CCD has a far greater range than the human
    eye. In bringing the shadows out, and keeping the highlights down, you
    modify the color as suits you. You don't even need to correct wb and
    color when reading in the RAW. Do that in PS. You can get IR shots
    with a wratten 70, supposedly, on a C5050 as I have (I just can't find
    the wratten 70 anywhere). And the C5050 is supposed to have internal
    filters against IR. So. If you can get any information, however in
    shadow, you can work with it. I've seen some excellent IR examples,
    without the 'white tree effect', from just such a C5050. Just as long
    as the RAW records some information that isn't just completely lost to
    noise, you might be able to get a good photo.
    Mark Johnson, Dec 21, 2003

  3. Frank van der Pol

    Jeremy Nixon Guest

    If you adjust tungsten light with white balance, you are boosting the
    blue channel, increasing its contribution to the overall picture. Since
    the blue channel is usually the one with the most noise, look what you've
    just done -- boosted your noise.

    I never thought about the aspect you bring up, but it sounds valid
    enough.

    As for the actual answer to your question, I suspect none exists -- you
    would have to test your particular camera to figure out the "best" white
    balance for it, and your results would probably not apply to others.
    Jeremy Nixon, Dec 21, 2003
  4. Frank van der Pol

    MikeWhy Guest

    I'm guessing Canon's are slanted toward green, by virtue of the extra
    pixels. That's a guess, and not an edumacated one. Actually, not even a good
    guess, as the extra green pixels only mean it will be more sensitive in
    greenish light, but not capture any more highlight or shadow detail at each
    pixel site.

    I suppose you could find out empirically. Set up an adjustable color light
    and measure the results. :) (Sorry; but do let us know when you find out.)
    MikeWhy, Dec 21, 2003
  5. Mark Johnson wrote:
    But what the CCD sees is not exactly what is put into the RAW file. It
    is a digitized version of what the CCD sees and therefore limited by
    (the quality i.e. bit depth of) the A/D converter used.
    True, but you will have fewer color values available if you don't
    expose each of the channels/colors to their maximum.
    I'm not looking for IR nor do I own a C5050. I'm looking for a correct
    exposure and a light source that matches the color sensitivity of the
    CCD as closely as possible in order to get as much values into the RAW
    file as possible.
    That would be true if the RAW format held the analog data of the
    CCD, but the RAW file format is a digital format and thus holds
    digital values that are the result of A/D conversion. When the red-
    sensitive pixels are not exposed to their maximum, the dynamic range
    of the A/D converter is not used to its limit. The result is simply
    fewer digital values available in the channel that holds the data for
    the red-sensitive pixels.

    Besides, I'm not looking for 'a good photo' only. I'm looking for 'the
    best exposure' to ensure that the RAW data file is holding as much of
    the dynamic range of the CCD as possible. And imho this can only be
    achieved by making sure that each of the pixels reaches its maximum
    value when exposed with white light in order to make use of the
    maximum capacity of the A/D converter. The subsequent question is:
    which light source makes this happen?
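    Frank's "fewer digital values" argument can be made concrete: on a
    hypothetical 12-bit converter, a channel whose white-card peak lands at
    only a fraction of full scale uses correspondingly fewer output codes.
    A rough sketch, ignoring sensor noise:

```python
import math

BITS = 12
full_scale = 2**BITS - 1  # 4095 on a 12-bit converter

# hypothetical peak level of an underexposed channel, as a fraction of full scale
for fraction in (1.0, 0.5, 0.25, 0.125):
    codes_used = int(full_scale * fraction) + 1  # distinct output codes reachable
    effective_bits = math.log2(codes_used)
    print(f"peak at {fraction:6.1%}: {codes_used:5d} codes ~ {effective_bits:.1f} bits")
```

    Each halving of the peak level costs one effective bit, so a channel
    peaking at 1/8 of full scale behaves like a 9-bit capture.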

    Frank van der Pol, Dec 21, 2003
  6. Not only that. You are extending a limited range of values extracted
    from the blue-sensitive pixels to the full scale, which in the end
    results in fewer (color) values than possible.

    In audio recording terms, a mismatch in color simply means that you
    lose 'headroom', because one or two channels are already clipping
    while the third is not yet recorded/exposed at its maximum.
    That is also true.
    If the point is valid, I would expect camera manufacturers to specify
    the light source that makes their camera perform at its best.

    Frank van der Pol, Dec 21, 2003
  7. Frank van der Pol

    Lionel Guest

    Kibo informs me that Frank van der Pol
    I suspect that you're probably correct in this, & it's an extremely
    interesting question you've raised.
    There are a number of ways to think about this - for one, silicon
    photosensors are most sensitive (IIRC) to the green end of the spectrum,
    which might lead one to assume that it'd be better to bias one's light
    source to red. OTOH, I would expect the sensor engineers to bias the
    Bayer colour filters to give a tailored response curve, balancing
    optimum sensitivity with best overall performance under the most common
    shooting conditions, so that doesn't necessarily prove anything.

    Hm. Someone recently posted a link to an article about scientific
    imaging that compared the Canon EOS D60 to the Sigma SD9 & included
    spectral response curves for both cameras. <sound of me rummaging
    through my archive> Ah! - Here you go:
    So, in the case of the D60 & SD9, the answer to your question is that
    the sensor is slightly biased towards red (with a 'daylight' WB
    setting).

    With a calibrated-colour light source & software that can read exact A
    to D values directly from the RAW input file, it should be possible to
    precisely map the spectral sensitivity of any given digital camera,
    but I don't have the resources to do this myself. If anyone else out
    there can determine this, I'd be fascinated to see the results, as it
    might provide useful information for optimising light sources (as
    you've said), or for choosing appropriate filters for use under extreme
    lighting conditions.
    Lionel, Dec 22, 2003
  8. Frank van der Pol

    Lionel Guest

    But it doesn't matter, because you've increased the signal level as
    well. In general, (as Frank is assuming) you'll get least noise from a
    sensor by maximising the signal going into it, as (all else being equal)
    transducer noise is usually a constant, rather than being directly
    related to signal amplitude. It's a maxim of signal processing that you
    strive for the highest possible signal before processing it, in order to
    lift it as far as possible above the noise floor before converting it
    to digital.

    Bringing it back to digital cameras; bear in mind that when you WB a
    digital photo, you're skewing the spectral response by multiplying the
    R, G & B data by a set of numbers that correspond to the *inverse* of
    the colour temperature you want the scene to *appear* to be illuminated
    by (typically 5600K or thereabouts), resulting in amplification of both
    signal data *and* noise.

    For example, if your scene was lit by halogens at (say) 3000K, your blue
    channel starts off dark (a range of values far lower than the dynamic
    range of the sensor's blue channel), & is then multiplied by a WB matrix
    to bring it up to 5600K (or whatever), resulting in much more noise in
    the final blue channel.
    OTOH, if you illuminated the scene (or used an appropriate filter on the
    lens), such that a white card would give you a signal from each channel
    on the image sensor slightly below the clipping level, no correction
    would be required, resulting in the minimum possible noise from
    that sensor.
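    Lionel's point about white-balance gain amplifying signal and noise
    together can be sketched with a toy model. The constant read-noise
    figure and the 2.8x gain below are invented for illustration; real
    sensors also have signal-dependent shot noise:

```python
import random
import statistics

random.seed(1)

READ_NOISE = 4.0  # hypothetical constant read noise, in ADC counts
N = 10000

def capture(signal):
    # signal plus additive Gaussian read noise, per sample
    return [signal + random.gauss(0, READ_NOISE) for _ in range(N)]

# blue channel exposed to near full scale: no WB gain needed
well_exposed = capture(3800.0)

# blue channel under warm light: much darker, then multiplied up by a WB gain
GAIN = 2.8
gained_up = [GAIN * v for v in capture(3800.0 / GAIN)]

for name, data in (("well exposed", well_exposed), (f"gained x{GAIN}", gained_up)):
    snr = statistics.mean(data) / statistics.pstdev(data)
    print(f"{name}: mean {statistics.mean(data):6.0f}, SNR ~ {snr:.0f}")
```

    Both channels end up at the same mean level, but the gained-up channel
    carries roughly 2.8x the noise, exactly as the headroom argument predicts.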
    Yes, that's precisely what I'd expect, too.
    Lionel, Dec 22, 2003
  9. Frank van der Pol

    Jeremy Nixon Guest

    Without having actually done the tests, I would think that it *would*
    matter. If we assume (as is generally the case) that the blue channel has
    the most noise; and that we are boosting the blue channel to correct the
    light's color temperature, then we are increasing both signal and noise in
    blue, but we are not increasing signal in the other channels. Thus the
    extra noise from the blue channel would increase in its "contribution" to
    the final picture, in comparison to the non-noise portion of the data,
    and the result would be more noise than either not boosting the blue
    channel, or boosting one of the other channels.

    Thus, if you need to compensate for warm artificial light, better results
    will be had by locking a daylight white balance into the camera and using
    a filter to adjust color temperature the old fashioned way -- assuming that
    the sensor's "native" white balance is daylight rather than tungsten, which
    is just that, an assumption. Even if that's *not* the case, though, the
    fact that the blue channel has the most noise probably means that those
    results would still be better in terms of final noise content, though not
    necessarily in dynamic range.

    The original poster's question, of course, was what *is* that "native" white
    balance? A most interesting question. I had assumed it would be daylight,
    but that of course isn't based on anything at all, and therefore it's silly
    to blindly assume that.
    Jeremy Nixon, Dec 22, 2003
  10. Frank van der Pol

    Mark Johnson Guest

    Close enough. The electronics are in there trying, somehow, to chase a
    signal, using this Bayer design on top of that. It's amazing they get
    what they do.
    If you have anything to work with, it'll be cleaner in the RAW. And
    then you can 'colorize', if you prefer, at your leisure.

    It sees what it sees. And you'll have to do stuff to get it to look
    'right'. The thing is, you sometimes don't want it to look like it did
    in the LCD. Wide range of values, and if you let the camera 'fix it',
    it might fix it to where you can't unfix it. Yes, it does require
    extra work in Photoshop, or whatever RAW program you might use
    separately or as a plugin. You want the best photo you can get,
    whether it's the perfect record of the scene or not - a good photo
    generally is not.

    What is it you think you're missing? Are you hoping for a better CCD
    to reduce confusion/noise? Are you hoping for less of that noise in
    the blackest shadows, so you can pull it out, later?
    Mark Johnson, Dec 22, 2003
  11. In digital photography however, every correction seems to be done
    No, silicon is most sensitive to near-IR.
    As I see it, there are two issues:

    - as the ADCs are typically 10, 12 or 14-bit, and the eye can only see 7
    bits, the slight lack of dynamic range doesn't matter.

    - if you have a "daylight" sensor, and wish to correct for using
    "artificial" light, then either you increase the blue gain, increasing the
    noise, or you filter optically cutting down both the overall light and in
    particular the red light. Less light - more noise. I don't know which is
    best, but I would suggest that if you are using artificial light, you are
    working at low light levels, and therefore noise rather than any lack of
    dynamic range would be a more important issue.

    David J Taylor, Dec 22, 2003
  12. Frank van der Pol

    Lionel Guest

    Kibo informs me that "David J Taylor"
    Yes, you're quite correct. I was limiting my comments to the part of the
    spectrum that's usually used for photography.
    Well, if you get a perfect exposure evey time & never need to gamma
    correct or otherwise alter the tonal balance of the image, sure, but
    it's always good to have as much headroom as possible. (Which is why I
    edit in 16 bit colour, rather than 8 bit, whenever possible.)
    If all else remains equal, yes. In practice, you'd either turn up the
    lights, slow the shutter, or open up the aperture to get the best
    exposure. Any of these options would result in the optimum use of the
    dynamic range of the image sensor.
    In general, more input signal = less output noise.
    Lionel, Dec 22, 2003
  13. Frank van der Pol

    gsum Guest

    The human eye has many times the range of any CCD
    out there. That is why this problem arises in the
    first place! Moreover, the eye/brain can adjust for
    range, brightness and colour temperature in real time
    to a resolution way beyond that seen in the latest CCDs.

    gsum, Dec 22, 2003
  14. Yes, and No!

    Within a single image, the eye only has a range of about 7 bits.

    Between different parts of a scene, or between day and night of course, it
    is several orders of magnitude.

    David J Taylor, Dec 22, 2003
  15. Frank van der Pol

    Lionel Guest

    Eh? I'm confused - that's pretty much exactly what I've been saying:
    That increasing the *source* signal in the weak channel (ie; increasing
    the amount of blue in your illumination, or filtering out some of the
    red/yellow) to balance it vs the other channels will improve your signal
    to noise ratio.
    Yep, exactly. From an engineering point of view, I would assume that it
    would start off as something close to daylight, & would be tweaked
    closer to that in the Bayer filter array, but as with you, it's purely
    an educated guess.
    Lionel, Dec 22, 2003
  16. Frank van der Pol

    Mark Johnson Guest

    No, the eye can't pick up subtle shadows below a certain threshold and
    the like. But information might be there, anyway, which you can see by
    lightening the section.
    Mark Johnson, Dec 22, 2003
  17. Frank van der Pol

    DJ Guest

    I could be wrong about this, but the following experiment comes to mind:

    1. Shoot a white card in different types of light (daylight, tungsten etc)

    2. Examine the RAW files' histograms using a program that plots each colour
    channel separately, like Capture One.

    3. See which shot has the 3 curves most nearly overlapping each other.
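    A sketch of steps 2-3 on synthetic data. A real run would decode the
    RAW file first; the channel values below are invented to mimic a
    white-card shot under warm light:

```python
import random

random.seed(0)
FULL_SCALE = 4095  # 12-bit converter

# hypothetical per-channel samples from a white-card shot under warm light
channels = {
    "R": [random.gauss(4000, 60) for _ in range(5000)],
    "G": [random.gauss(2600, 60) for _ in range(5000)],
    "B": [random.gauss(1500, 60) for _ in range(5000)],
}

for name, values in channels.items():
    clipped = sum(v >= FULL_SCALE for v in values)  # samples at/over saturation
    mean = sum(values) / len(values)
    print(f"{name}: mean {mean:6.0f} / {FULL_SCALE}, clipped samples: {clipped}")
```

    The light whose shot puts the three channel means closest together,
    with none of them clipped, would be the one closest to that sensor's
    native balance.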

    Apropos of this, I noticed today, processing a shot of a flower, that the shaded
    interior of the flower "trumpet" had a strange "flatness" to it. It turned out
    that the blue channel is reading 00 in those parts. The picture is at (230K file)
    Similarly some of the brighter parts of other shots were saturated in one colour
    channel but not others. These were taken in RAW mode.

    The "bad" news is that the in-camera histogram only seems to display a
    saturation warning at 255,255,255, so contrary to what I've been assuming it is
    not a 100% reliable exposure control indicator.

    DJ, Dec 22, 2003
  18. Mark Johnson wrote:
    No. The CCD sees R, G and B channels in an analog form, having an
    infinite number of levels. The A/D converter limits this to, say, 10, 12
    or 14 bits. If one of the three channels is not used to its full
    extent, you might end up using only 8, 7 or even 6 of the bits
    available. That is the audio equivalent of making a digital recording
    with the PPM prior to A/D conversion set to -40 dB.
    It might be amazing, but it's not the maximum the camera can deliver.
    You simply lose a number of values in one of the separate channels.
    I have a feeling you're missing the point in my question.
    What I'm trying to figure out is how to provide the sensor with the
    light and color that will make full use of the available dynamic range
    of the AD converter, resulting in a maximum quality RAW file.
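    Frank's audio analogy translates directly into code-value terms: peaks
    at -40 dBFS on a 16-bit converter reach only a fraction of the available
    codes. The arithmetic below is purely illustrative:

```python
import math

# Recording with peaks at -40 dBFS on a 16-bit converter: how many of the
# 16 bits actually get used?
BITS = 16
peak_dbfs = -40.0

amplitude_fraction = 10 ** (peak_dbfs / 20)        # -40 dB = 1/100 of full scale
codes_used = int((2**BITS - 1) * amplitude_fraction) + 1
effective_bits = math.log2(codes_used)
print(f"peak at {peak_dbfs} dBFS uses ~{effective_bits:.1f} of {BITS} bits")
```

    Roughly 9 effective bits out of 16, which is the same kind of loss an
    underexposed colour channel suffers at the camera's A/D converter.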

    To say that it sees what it sees and you'll have to do stuff to get it
    to look 'right' is about the same as using daylight film with tungsten
    light and have the lab filter out the red/yellow cast or shooting
    fluorescent lighting using tungsten film and have the lab filter out
    the green cast. This simply isn't the correct approach and will not
    yield the best color fidelity. Using the camera (i.e. the CCD and the
    subsequent A/D converter) to the maximum of its capabilities means
    feeding the sensor a light source that is perfectly balanced. And this
    may or may not be 5000 K, 5500 K, 6000 K or even 3200 K, but may even be
    light that shifts towards either green or magenta.
    No. I'm hoping for the best light source to capture an image in order
    to reach the maximum quality RAW file.

    If the light source doesn't matter and correcting using the RAW file
    does not involve quality loss, you could also use an infrared light
    or even a sodium vapour light source. And that isn't true either.

    Frank van der Pol, Dec 22, 2003
  19. David J Taylor wrote:

    But imho it does. If you do not have a correctly colored light
    source, you may end up with as little as 7 bits and thus 128 values at
    the output side of the A/D converter. This simply results in a lower
    number of color values.
    If that were true, there would be no advantage in using 10, 12 or
    14 bit A/D converters. If the only reason for using 14 bit A/D
    conversion is to be able to convert one channel to its maximum of 14
    bits and the other two to only 8 or even 7 bits, one could just as
    well use the optimum color balance and use 8 bit A/D converters.

    Frank van der Pol, Dec 22, 2003
  20. But what the CCD sees is not exactly what is put into the RAW file.
    This is only a limitation if the blue channel is way lower than the other
    channels. In any case, under artificial light conditions I think that the
    sensor noise (quantum limited?) will be more important than quantisation
    in the conversion process.

    Getting the most light on the sensor is important, i.e. applying no
    external filter.

    Only required when you have plenty of light, though. What situations are
    you envisaging? Well lit daylight scenes or poorly lit night ones?

    David J Taylor, Dec 22, 2003
