flawed megapixel experiment

Discussion in 'Digital Photography' started by Bucky, Nov 22, 2006.

  1. I guess it is all relative, especially to your intended use.
    Thomas T. Veldhouse, Nov 27, 2006

  2. Bucky

    Scott W Guest

    Absolutely I will get a better photo without the flash.

    It is a very rare condition where a photo taken with a flash at ISO 100
is going to look better than a photo taken without at ISO 800, assuming
    there is enough light for the ISO 800 shot.

    Sure there are going to be cases where there is not even enough light
    for an ISO 800 shot and in those cases I will use a flash.

    Again it would really help for you to post an example of when a flash
    at ISO 100 was the better choice over ISO 800 and available light.

    Scott W, Nov 27, 2006

  3. Bucky

    Scott W Guest

    You keep saying that, but you have yet to give an example where going
    to a higher ISO and avoiding the flash results in the poorer photo.

    Scott W, Nov 27, 2006
It was somebody here, who clearly claims to be a photographer, who indicated
they would use a flash "only as the last resort".
    Thomas T. Veldhouse, Nov 27, 2006
  5. Indeed not ... and you would be a fool to use a flash if that is what you are
    trying to photograph.
    Why do you insist on making generalizations as if my comment applies to all
    situations and all photographers? I have already said, "Some photographers
    will automatically boost their ISO where a flash will better suit the
    situation". It is equally valid to say that "some photographers will use
    flash where they should boost ISO and use ambient lighting". It is not
    mutually exclusive and yet you INSIST on making it so to bash my comment.
    Thomas T. Veldhouse, Nov 27, 2006
  6. Bucky

    J. Clarke Guest

    And for the work he does that might very well be the correct viewpoint.
    J. Clarke, Nov 28, 2006
  7. That would be true if the sensor could capture arbitrarily large
    exposures without loss. But all sensors have a maximum exposure beyond
    which they discard additional light or (worse) have the photoelectrons
    spill into adjacent pixels. This is called "full well capacity" on
    CCDs, but CMOS sensors have an exposure limit too.

    So a reasonable definition of "native" ISO rating is one that gives 3
    or 4 stops of headroom above mid-grey before the sensor becomes
    saturated. You can *increase* the effective ISO above that by using
    less than the full sensor exposure range and adding some extra gain,
    preferably between sensor and A/D converter. But you can't *decrease*
    effective ISO and compensate by decreased amplifier gain, because the
    sensor has saturated and damaged the highlights.

    Film is more tolerant of "pull" processing (overexpose/underdevelop)
    because film saturates gradually due to the "shoulder" in the
characteristic curve. CCDs don't have a "shoulder"; they clip hard when
the well is full.

    Dave Martindale, Nov 30, 2006
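Dave's hard-clip vs. film-shoulder distinction can be sketched numerically. This is a toy model with made-up response curves (the exponential "shoulder" and the unit full-well value are illustrative, not measurements of any real sensor or film stock):

```python
import numpy as np

def ccd_response(exposure, full_well=1.0):
    # A CCD clips hard: photoelectrons beyond full-well capacity are simply lost.
    return np.minimum(exposure, full_well)

def film_response(exposure, shoulder=1.0):
    # Film rolls off gradually: a simple saturating exponential stands in
    # for the "shoulder" of the characteristic curve.
    return shoulder * (1.0 - np.exp(-exposure / shoulder))

exposure = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
print(ccd_response(exposure))   # flat at 1.0 once saturated: highlights gone
print(film_response(exposure))  # keeps creeping upward, never quite reaches 1.0
```

Overexposing the CCD by two stops destroys all tonal distinction in the highlights, while the film curve still separates them slightly, which is why "pull" processing can recover them.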
  8. [A complimentary Cc of this posting was sent to
    Ray Fischer
    Yeah, right!
I'm afraid that your understanding of aliasing is severely
limited. The effect you describe (convolution with a "bump" function)
is fully invertible in postprocessing (with a VERY minor S/N ratio penalty).

    Aliasing, by definition, can't be reduced by postprocessing. This is
    why camera manufacturers agree to significantly blur the image so that
    the aliasing is decreased.

    Hope this helps,
    Ilya Zakharevich, Dec 1, 2006
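Ilya's point that aliasing cannot be undone after capture can be shown with a one-dimensional toy: two different input frequencies that produce identical samples. The rates and frequencies here are arbitrary illustrative choices:

```python
import numpy as np

fs = 100.0                    # sampling rate (samples per unit length)
t = np.arange(0, 1, 1 / fs)

f_low, f_high = 10.0, 110.0   # f_high = f_low + fs, so it aliases onto f_low
s_low = np.sin(2 * np.pi * f_low * t)
s_high = np.sin(2 * np.pi * f_high * t)

# The two signals are indistinguishable after sampling: once captured,
# no amount of postprocessing can tell which one was really there.
print(np.max(np.abs(s_low - s_high)))  # ~0, up to floating-point error
```

That is exactly why the filtering must happen *before* the sensor: once the 110-cycle pattern has masqueraded as a 10-cycle one, the information needed to separate them no longer exists.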
  9. You are still wrong here. When I and others compare the 5D against medium
    format film, what we find is that the 5D randomly smears the fine detail.
What's happening is that (due to its overly weak AA filter) it's pretending
to resolve detail that it really can't, so for text in signs that the film
camera resolves correctly, the 5D gets some of it and mushes up some of it.
It's an ugly, disturbing effect, and means that you can't enlarge to the point
    where that detail is visible. So there's absolutely zero practical gain from
    the bogus sharpness, and the user would be better off with apparently softer
    images (sharpened to taste in postprocessing instead of damaged permanently
    during capture) that only look soft, not deranged, when overenlarged.


    David J. Littleboy
    Tokyo, Japan
    David J. Littleboy, Dec 1, 2006
  10. Bucky

    Ray Fischer Guest

    And summarily trashed. I don't accept spam.
    You agree?
    Better than yours
I didn't describe any function at all, much less an invertible one.
    I'm referring to the actual, physical sensor.
    That's right. Once the image has been captured it is too late to do
    any anti-aliasing.
    Bzzzt! Wrong.

    Blurring has nothing at all to do with anti-aliasing.
    Ray Fischer, Dec 1, 2006

  11. Not like me to defend Ilya.

And it's not like Ilya to use a 5 cent word when a
25 cent word will do -- but isn't "blurring" really
the common term for lowpass filtering in this context?

    And isn't LPF exactly what an anti-alias filter does?

    If I've got that right, then it appears Ilya is right,
    for once.

    rafe b
    Raphael Bustin, Dec 1, 2006
  12. Bucky

    Scott W Guest

It is a complicated issue, and neither David nor Ilya is completely
wrong. If we were talking about audio then David would be completely
right: not removing the frequencies that are going to alias would make
no sense, and trying to digitize them would just add artifacts to the
resulting digital file. But things are more complicated with images.
We can't use a hard-edged filter to remove the higher frequencies
like we can with an audio signal; if we did we would get ringing, which
shows up as halos. So the AA filter has to be somewhat smooth in
how it rolls off the higher frequencies. The result of this smooth
roll-off is that some of the lower frequencies are reduced in amplitude,
and this is sadly the same as blurring.

Now I have to take strong issue with Ilya wanting to call the
anti-alias filtering a defect; he should know better than this, and it
is a silly thing to try and say. The anti-alias filtering is simply
limiting the information presented to the digitizing system, the sensor
in our case, to information that it can handle. This is done in almost
all digitizing systems, and no one calls the filter a defect or a flaw.

    Scott W, Dec 1, 2006
  13. Bucky

    acl Guest

    What do you mean? Why would this ringing not appear in audio? Or do you
    mean that it is not important? If so, why not? The characteristics of
    the human ear (and brain)?
    acl, Dec 1, 2006
  14. Bucky

    Scott W Guest

Sorry, I should have been more clear: the reason it does not matter in
audio is that the cut-off filter is placed above the range of human
hearing, and so you can't hear the ringing.

    Scott W, Dec 1, 2006
  15. Bucky

    acl Guest

    Ah ok, thanks.
    acl, Dec 1, 2006
  16. That's exactly how it should be done with images as well.

    Sampling an audio recording at 40 kHz is a problem because you need a
    brickwall filter at 20 kHz. Note that just like using a low quality lens,
    most real world sounds do not have a lot of energy above Nyquist so
    you can get away with some aliasing.

    However, you can also sample audio at 96 kHz or 192 kHz. This provides
    much more room for the brickwall filter, and avoids damaging frequencies
    below 20 kHz.

For photography you can do the same thing: use a sensor that has a much
higher resolution than you are going to use in print. For example, take
pictures with a 10 Mpixel sensor and print at 4x6. That allows strong
filters that kill all aliasing and at the same time do not affect the 4x6
print.

Unfortunately, oversampling images is much too expensive. So we have to
live with AA filters that are too weak (I don't think too many people
would pay for a sensor with an AA filter strong enough to kill all
aliasing).
    Philip Homburg, Dec 1, 2006
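Philip's oversampling argument can be sketched for the audio case. The sample rates match his example, but the crude moving-average filter is an illustrative stand-in for a real decimation filter:

```python
import numpy as np

f_sig = 30_000.0                 # ultrasonic tone, above the 20 kHz audible range
dur, fs_hi, fs_lo = 0.01, 192_000, 48_000

# Oversample at 192 kHz: 30 kHz is comfortably below Nyquist (96 kHz),
# so it is captured without aliasing.
t_hi = np.arange(0, dur, 1 / fs_hi)
x_hi = np.sin(2 * np.pi * f_sig * t_hi)

# A gentle digital low-pass (a simple moving average, purely illustrative)
# now has room to suppress the tone before decimating down to 48 kHz.
kernel = np.ones(16) / 16
x_filt = np.convolve(x_hi, kernel, mode="same")
x_48 = x_filt[::4]               # decimate 192 kHz -> 48 kHz

# Naive direct sampling at 48 kHz instead folds 30 kHz down to 18 kHz:
# a full-strength, fully audible alias.
t_lo = np.arange(0, dur, 1 / fs_lo)
x_alias = np.sin(2 * np.pi * f_sig * t_lo)

print(np.max(np.abs(x_48)))     # small residue: the tone is mostly removed
print(np.max(np.abs(x_alias)))  # ~1.0: the alias arrives at full amplitude
```

The image analogue is the 10 Mpixel sensor printed at 4x6: capture well above the output Nyquist, filter digitally where you can afford a good filter, then downsample.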
  17. Bucky

    Scott W Guest

Yup, samples/second are cheap; samples/mm, not so much.

The truth is, some aliasing in a photo is not a real problem; it is only
when the frequencies get close to twice the Nyquist limit that it
becomes a real problem, mostly. There are some odd patterns that show
up in some of my photos that are clearly from aliasing of frequencies
just above Nyquist, but these are pretty rare.

What would be nicest would be an adjustable AA filter. I can sort
of do this by shooting at a really high f/number, but the use of this is
very limited.

    Scott W, Dec 2, 2006
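The folding Scott describes can be made concrete: a frequency just above Nyquist aliases to one just below it (still fine detail), while one near twice Nyquist folds down to a coarse, highly visible pattern. The sample rate here is an arbitrary illustrative value:

```python
def alias_frequency(f, fs):
    # Fold a frequency f into the baseband [0, fs/2] of a sampler at rate fs.
    f = f % fs
    return fs - f if f > fs / 2 else f

fs = 100.0                        # e.g. 100 samples/mm -> Nyquist at 50 cycles/mm
print(alias_frequency(55, fs))    # just above Nyquist -> 45.0, still fine detail
print(alias_frequency(98, fs))    # near 2x Nyquist -> 2.0, a coarse visible pattern
```

A fabric weave at 98 cycles/mm would masquerade as a slow 2 cycles/mm banding (classic moire), whereas the 55 -> 45 case just swaps one barely-resolvable texture for another.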
  18. [A complimentary Cc of this posting was sent to
    David J. Littleboy
    Well, at least your argument does not show this.
There is no "bogus sharpness" without AAF. What there is is an
honest-to-goodness blurriness *with* AAF.

    *IF* the image in the focal plane has aliasing features, then having
    AAF has some positive effects (in addition to obvious negative effect
of blurriness). However, if there are no such features (as happens
with many lens settings, shake, focus errors, etc.), then AAF has ONLY
    negative effects.
    One should know their tools. For a camera with weak or no AAF, one
    should be careful indeed when using "sharp" lens settings (at least
    with certain subjects). It would be nice if the camera would avoid
    these settings when on auto-everything mode...

    Hope this helps,
    Ilya Zakharevich, Dec 2, 2006
  19. Bucky

    Ray Fischer Guest

Blurring may be a low pass filter, but that has nothing to do with
anti-aliasing.
    No. A properly designed anti-alias filter is not a low-pass filter.
    They're very different functions.

An example. Here's an ASCII representation of the light falling on
just one sensor in the array. I've divided the light up into 9 parts,
but in reality the number of discrete colors is limited
only by the lens. The letters represent the obvious colors.


If the light-sensitive area of the chip is just in the middle, then the
    color it will detect is green. The other colors will be completely
    invisible. Repeat that over many sensors and you get aliasing.

A proper anti-aliasing "filter" (typically something like a
microlens) focuses all of the light falling on that square onto the
sensor, eliminating the aliasing effect and giving the correct color of,
um, orangish. It's not blurring because it captures the image at the
    maximum resolution of the sensor array.

    A proper anti-aliasing filter does not blur across adjacent sensor
    elements. It eliminates the noise caused by aliasing.
    Ray Fischer, Dec 2, 2006
  20. [A complimentary Cc of this posting was sent to
    Scott W
    Sorry, but this argument is almost completely bogus. It *would* be
    applicable in the following situation:

    a camera has a fixed lens with a fixed f-stop (and you always
    shoot flat stationary strongly textured objects with a tripod and
a perfect focus).

    But in real life, this is not how most photographic equipment is used.

    Quite often a shot has nothing to cause aliasing at all (e.g., shoot
    at f/22).

    Hope this helps,
    Ilya Zakharevich, Dec 2, 2006
