Putting the SD9 "yellow myth" to bed, a follow along RAW demo

Discussion in 'Digital Photography' started by George Preddy, Dec 24, 2003.

  1. Scan a single color rectangle at 4000 dpi and see how much information you
    have.
     
    George Preddy, Dec 30, 2003

  2. A truly digital (binary) image 1 or 0 ;)
     
    Darrell Larose, Dec 30, 2003

  3. The SD9 has much larger pixels than the 1Ds, so obviously sensor size isn't
    the only variable. Even so, noise isn't the issue.

    You can't have it both ways. To say that the ~11MP 1Ds has similar optical
    resolution to the ~11MP SD9 because you can zoom in, is to say that the SD9
    has a similar FOV as the 1Ds, because you can zoom out.
    As long as you first stipulate that there is no FOV advantage, fine.
    When measuring FOV, should you put a wider angle lens on the SD9?
    Just the opposite, to pretend FOV costs nothing in terms of optical
    resolution is simple denial. The SD9 resolves about double the optical
    detail of a 1Ds (http://www.outbackphoto.com/artofraw/raw_05/essay.html),
    and the 1Ds has a wider FOV.

    Either can be changed with lensing: the 1Ds can concentrate its sensors to
    increase optical resolution and lose its FOV advantage, or the SD9 can
    increase its FOV at the expense of its optical resolution advantage.

    To determine the optical resolution advantage of the much higher sensor
    density SD9, the same lens must be used. To determine the FOV advantage of
    the much lower sensor density 1Ds, the same lens must be used.
     
    George Preddy, Dec 30, 2003
  4. You said "there's no advantage to full-frame sensors". I pointed out
    that there are several advantages, but FOV is not normally considered one
    of them. The fact that the SD9 has noise problems with large pixels
    seems to be a problem unique to the Foveon sensor, but among Bayer
    sensors larger pixels generally means lower noise.
    That's wrong in several respects. First, the SD9 is a 3.4 MP camera; it
    has only that many pixel locations. Second, the resolution of the 1Ds
    greatly exceeds the SD9 without "zooming in" - just making the field of
    view equal. In the outbackphoto test, the Sigma was effectively "zoomed
    in" to cover the smaller field.

    And of course the SD9 *can* have the same FOV as the 1Ds, with suitable
    choice of lens and subject distance. This is how resolution tests are
    supposed to be shot - but they weren't in this test. That's why the test
    is flawed.
    There should be no FOV advantage in a resolution test, since that's how
    resolution tests are supposed to be done. There *is* a FOV advantage if
    you are limited in your choice of lens and you want the widest possible
    field, but that's irrelevant to resolution testing.
    You should put a *shorter focal length* lens on the SD9, if you're going
    to use the same subject distance, in order to get the same angle (not a
    wider angle). Alternately, use the same lens, but move the SD9 closer to
    the target, until the target fills the image height. Do the same for all
    of the other cameras too. This is basic setup for shooting resolution tests.
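    The focal-length matching described here reduces to one division. A minimal
    sketch (the 50 mm lens and the 1.73 crop factor are illustrative
    assumptions, not figures from the test):

```python
# Minimal sketch of matched-FOV focal-length scaling (assumed figures).

def matched_focal_length(f_fullframe_mm, crop):
    """Focal length a crop-sensor camera needs to match a full-frame
    camera's field of view from the same subject distance."""
    return f_fullframe_mm / crop

crop = 1.73  # approximate SD9 crop factor relative to full frame
# If the full-frame body shoots the chart with a 50 mm lens, the SD9
# needs roughly a 29 mm lens to frame the same chart:
print(f"{matched_focal_length(50.0, crop):.1f} mm")  # 28.9 mm
```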
    You have real problems with comprehension. I did *not* say that FOV
    costs nothing in resolution. Just the opposite - because a wider FOV
    reduces the subject-space resolution, it's important to have all
    cameras capture the *same* FOV to eliminate this factor from
    influencing the results. The outbackphoto test didn't do this, the
    Canon camera was capturing a wider FOV, and that makes its resolution
    appear worse than a fair test.

    In addition, this particular test used saturated red/blue test patterns
    that are designed to show a Bayer sensor at its worst, conditions that
    would never be found in nature. Normal resolution figures are not
    based on saturated red/blue tests, they're based on black/white tests,
    since the latter are much more representative of how our eyes see.
    (The eye's colour resolution is 10X worse than its B/W; a Bayer sensor's
    is only 2X worse).
    This is a resolution test, not a "FOV advantage" test. Further,
    photographers don't care directly whether a camera uses a high or low
    pixel density on the sensor, just what resolution the final camera/lens
    combination achieves. To get *that* information, you must equalize
    field of view during resolution tests. In terms of "optical resolution
    at the sensor plane", any cheap small-sensor P&S camera will beat any
    SLR, precisely because the pixel density is high.

    What matters is lines per picture height, or line pairs per picture
    height. Line pairs per mm is irrelevant and misleading when comparing
    different size sensors, unless you multiply by sensor size to get total
    resolution per picture height.
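    The difference between the two metrics can be shown with a toy comparison;
    the lp/mm figures and sensor heights below are illustrative assumptions
    only:

```python
# Toy comparison (assumed numbers): lp/mm flatters small, dense sensors,
# while line pairs per picture height measures detail in the final image.

def lp_per_picture_height(lp_per_mm, sensor_height_mm):
    return lp_per_mm * sensor_height_mm

compact   = lp_per_picture_height(120.0, 5.3)   # dense little P&S sensor
fullframe = lp_per_picture_height(55.0, 24.0)   # 24 mm tall full-frame sensor

# The compact "wins" in lp/mm (120 vs 55) but loses in real detail:
print(compact, fullframe)  # 636.0 1320.0
```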

    Dave
     
    Dave Martindale, Dec 30, 2003

  5. I've been lurking for a while, enjoying the display of a human dumb
    enough to prove his lack of photographic education with every post.
    I'm a college graduate in photography, and was a custom darkroom
    printer, so you can imagine how funny I find your posts.

    So, about that link you gave above. Check the EXIF... you do know
    what that is, right? Look at the exposure time. Now look at the
    image title, "Lily Light Painting 2". Hmm, light painting, long
    exposure.... maybe there's a clue here why the image is less than
    perfectly sharp, and why the skin tone is so warm.

    You must be an extremely insecure person to need to bolster your ego
    by glorifying your equipment and demonising all other cameras. This
    would be more effective, however, if you used logical arguments and
    used examples that did not reveal your lack of any photographic
    knowledge. Just a helpful hint: you will get more ego-building if you
    do not humiliate yourself.
     
    The Black Sheep, Dec 30, 2003
  6. Kabang! Shot yourself right in the mouth right before you even got started.
     
    George Preddy, Dec 30, 2003
  7. Says who?
    It has lower noise than Canon (Popular Photography, Nov 2003); who cares
    why, but I'm sure all that extra room between sensors is a part of it.
    Unrelated to optical resolution. See...
    http://www.outbackphoto.com/artofraw/raw_05/essay.html
    It's not even close; in fact, the blurry 10D has about the same resolution
    as the 1Ds if the lens is held constant. See the link above again.
    Of course they are, before shooting a resolution test, you are supposed to
    put a 2000mm zoom on the Canon and put the lens cap on the SD9, then hope for
    the best.

    Even when Phil Askey of dpreview erroneously moved the SD9 way far away to
    compensate for the extra "magnification" of the 1.7X vs 1.6X "crop" factor,
    the SD9 blew the D60 out of the water onto the beach...

    http://www.dpreview.com/reviews/sigmasd9/page19.asp

    Great, so there you have it, the 1Ds enjoys no FOV advantage over the SD9.
    Like I said, as long as you mount an astronomical telescope to the front of
    the 1Ds and put an 8mm fisheye on the competition, it does ok in head to head
    tests resolving lines on a test chart from the same distance (well, as long
    as they're B&W anyway). It's only fair. Or, alternatively, you could move
    the Canon 1Ds to just 6" away from the test chart, and put the SD9 in
    Mexico, then the resolution test is equally fair, even with the same lens.

    'Cause we all know that cropping an image optically magnifies the detail of
    what's left. In fact, if you take a picture of the Grand Canyon with the
    moon in it, and crop away the Canyon, you'll very clearly see the American
    Flag and the Lunar landing site. It's blatantly obvious.
     
    George Preddy, Dec 30, 2003
  8. What's wrong, couldn't make it to higher learning, so you resent that
    others could? Well, that won't distract me from noticing the complete
    lack of response to the sample photo you inaccurately critiqued. Did
    you forget, or are you trying to cover your mistake?
     
    The Black Sheep, Dec 30, 2003
  9. That's pretty much how he handles anything that disproves
    him, or shows he is wrong. He just acts like it doesn't
    exist.
     
    Larry Lynch, Dec 30, 2003
  10. The number of pixels gives an absolute upper limit to the resolution of
    the image. It *is* relevant. The 3.4 MP SD9 cannot possibly exceed the
    resolution of a 3.4 MP film scan or a 3.4 MP flatbed scan. Calling it a
    10.3 MP camera is simply wrong and misleading, since it gives
    expectations of resolution that the SD9 cannot possibly meet.

    And you have a lot of nerve quoting the "artofraw" essay in a discussion
    thread that is about whether this particular essay is valid or not.
    Have you ever heard of circular reasoning? If you want to argue that
    document A is correct, you need to quote references from outside the
    document!

    Finally, what do you mean by "optical resolution" anyway? Do you mean
    the resolution limit in lp/mm in subject space (i.e. on the test chart)?
    Or resolution in lp/mm on the image plane? Or resolution in
    cycles/pixel in the final digital image (and if so, what size digital
    image?) Or angular resolution in cycles per degree at the lens? The
    phrase "optical resolution" could mean any of these, and they change
    value as sensor size and field of view change.

    The resolution figure that's most useful to a photographer is lines per
    picture height, or line pairs per picture height, because that says how
    much detail there is actually in the captured image with all of the
    effects of field of view, sensor size, and pixel pitch removed. No
    matter what the cameras being compared, a higher figure for lines per
    picture height means more detail in a print of the same size.
    Again, the link above is in dispute because the tests were not valid.
    Repeating yourself doesn't make it any more believable.

    On the other hand, the resolution tests done by dpreview.com, which were
    done correctly (black/white target, target filling the field as it was
    designed to do), show that the 10D and 1Ds have higher resolution than
    the SD9.
    No, you're supposed to fill the field of view of the camera with the
    target, that's all. That's what dpreview did, and what the outbackphoto
    test failed to do.
    Again, matching field of view is the correct way to test resolution.
    It's not an error.

    That page shows that the SD9 output had more sharpening applied, and
    that it appears to have more detail at the cost of getting the contents
    of the image wrong. Look at the missing tines in the "picket fence" on
    the riverboat in the 6th closeup, for example. That's the effect of not
    including an antialiasing filter, not any higher real resolution.

    However, the SD9 is obsolete, and its replacement the SD10 includes
    microlenses that will reduce this false detail - so the SD10 will be
    more faithful to the real image, but not appear as sharp. The "super
    sharp" SD9 is dead.
    Yes, during resolution tests that are properly performed, the sensor's
    physical size does not matter; the field of view of all cameras are
    matched. I don't know why you say "so there you have it", as if you
    just proved a point, since the only person who seems at all confused
    about this is you.
    Exaggerate all you want. The simple fact is that when both cameras are
    set up to image the same test chart, the SD9 captures less real detail.
    What's obvious is that you're completely out in left field. Just match
    the field of view, like the test chart instructions tell you to, before
    shooting the test if you want proper test results. Why is this so
    difficult?

    No comment about this part, I see.

    Dave
     
    Dave Martindale, Dec 30, 2003
  11. Absurd.

    But ok, the SD9 is 14MP.
    You mean 14MP.
    I know, it's crazy to compare pictures of color resolution charts; everyone
    knows the world is B&W.
    As I said. The world is in B&W, so obviously the (undisputed) fact that the
    SD9 outresolves the Canon 1Ds in color by a factor of 2.3X doesn't matter at
    all.
    Like I said, you have to test the 1Ds with 1.7X the zoom of the SD9; at
    least then it only gets beaten severely, instead of to death.
    At least you claim that full frame has zero value. I disagree.
    Obviously not, since it beat the 1Ds by its full 2.3X theoretical advantage.
    Take your pick, the SD9 kills Bayer in any color. B&W is required to
    compete with Foveon.
     
    George Preddy, Dec 31, 2003
  12. Perhaps you should look up some basic sampling theory.

    But here's a simple explanation: The SD9 and SD10 have 1512 rows of
    pixels on their CMOS sensors. Thus, they cannot possibly resolve more
    than 1512 changes from dark to light and vice versa. So the absolute
    upper limit on resolution for this camera is 1512 lines per picture
    height, or 756 line pairs per picture height. It doesn't matter how
    many colours are sensed at each pixel, since all colours are sampled in
    the same place in space. It doesn't matter if the SPP raw converter
    interpolates the image up to 3000 pixels in the vertical direction. The
    sensor has only 1512 rows, and the resolution is limited to that.

    In comparison, the 10D and 300D have 2048 image-forming rows on the
    sensor, and the 1Ds has 2704 rows. Their resolution is similarly
    limited to these numbers, but they are significantly larger than the
    SD9/SD10.
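    The ceiling described above is simple enough to tabulate (row counts as
    quoted in the post; real-world resolution will be lower once lens and
    filter losses are included):

```python
# Sampling ceiling from pixel-row count alone: N rows can record at most
# N alternating light/dark lines, i.e. N/2 line pairs, per picture height.

def resolution_ceiling(rows):
    return rows, rows // 2  # (lines per picture height, line pairs)

for camera, rows in [("SD9/SD10", 1512), ("10D/300D", 2048), ("1Ds", 2704)]:
    lines, pairs = resolution_ceiling(rows)
    print(f"{camera}: at most {lines} lines, {pairs} lp/ph")
# SD9/SD10: at most 1512 lines, 756 lp/ph
# 10D/300D: at most 2048 lines, 1024 lp/ph
# 1Ds: at most 2704 lines, 1352 lp/ph
```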
    No way. It has only 3.4 million sensing locations, as you well know.
    The 14 MP images are produced in SPP, via empty magnification. The
    output image cannot contain any real detail that wasn't in the 3.4 MP
    data file produced by the camera.
    Almost everything in a real world scene has both colour and luminance
    difference between adjacent areas. Our eyes see 10X finer detail when
    the luminance changes than when only colour changes, so it's luminance
    resolution that primarily determines how sharp an image is to our eyes.
    That's also why Bayer sensors, with colour-only resolution that's 2X
    worse than their luminance resolution, still produce images that look
    fine. And that's why resolution tests are done with B&W targets.

    What the outbackphoto tests show is that Bayer colour-only resolution is
    worse than luminance resolution - hardly surprising. But unless you're
    shooting images that contain nothing but saturated red and blue, the
    results aren't relevant for pictorial photography.

    So it's not that "the world is B&W". It's that "the world always has
    some luminance changes; nobody shoots saturated red/blue test targets".
    This test doesn't establish even that, since the target magnifications
    were not set properly.
    You're not reading. I didn't say full frame has zero value. I said
    that it does not have an advantage or disadvantage during properly-shot
    resolution tests. You seem incapable of understanding the difference.
    In an improperly-shot test. With a properly-shot test, the SD9 is worse
    than both the D60 and 1Ds, as expected.
    The SD9 resolution probably does exceed the D60 and 1Ds when shooting
    saturated red/blue test targets in a properly-shot test. But (a) this
    test doesn't demonstrate this, since it wasn't properly done, and (b)
    the red/blue resolution isn't important to general photography.

    What matters is that both the D60 and 1Ds capture more real detail from
    real-world scenes, where there are luminance differences.

    (Also, note that all of the above is true for any 3-colour 3.4 MP
    sensor, including flatbed and slide scanners and a 3CCD camera. The SD9
    and SD10 have additional problems not related to spatial resolution).

    Dave
     
    Dave Martindale, Dec 31, 2003
  13. There are varying wavelengths of light in the real world, but the B&W
    tests are generally a better measure of real photographic performance
    than your artificial outbackphoto test, aimed precisely at the main node
    in bayer perception, which has no relationship to most real-world color
    scenarios. Real-world colors tend to be less saturated, and even when
    they _are_ highly saturated, it is not usually fine alternating lines of
    two extremely saturated colors with no luminance difference.

    B&W tests on a color system will show many color reproduction problems,
    if they exist. You need good color to avoid color fringing and other
    chromatic artifacts. Your B&W 60-spoke wheel from the SD9 has more
    color fringing in it than the one from my 10D does. I go around yours
    with the info tool, and it reports saturation levels in the lower teens
    (up to 13%). With the 10D, I get saturation levels in the lower single
    digits (up to 4%).
    --
     
    JPS, Dec 31, 2003
  14. Could you explain why it is absurd? Are you saying that a pixel can show a
    line pair (one black, one white line) that is only one pixel wide? Or if
    you wish, a line pair where one line is red and one is blue where the total
    width of the two lines is one pixel?
     
    Gherry Bender, Dec 31, 2003
  15. Ok great, the SD9 is 14MP.
    They have that many 1/3rd sensors. Big difference.
    Choose any color, the SD9 has

    230% the Canon 10D's red sensor count
    230% the Canon 10D's blue sensor count
    120% the Canon 10D's green sensor count
    At only 1.5MP full color, the Canon 10D isn't worth discussing in a pro
    context. At 2.7MP full color, the 1Ds is certainly competitive after major
    downsizing, but it has lost every published color test I've seen posted.
    Which is only two. Seems no one wants to embarrass their big money
    sponsor's $8000 flagship...

    http://www.outbackphoto.com/artofraw/raw_05/essay.html
    http://61.206.42.242/images/fujieda/sd9/sd10vs1ds_1.jpg
     
    George Preddy, Jan 1, 2004
  16. Right, because black is no wavelength.
     
    George Preddy, Jan 1, 2004
  17. I'm saying a 1/3rd spectrum sensor can turn black all by itself, but it
    can't turn any color. This is why B&W tests are required by Bayer
    manufacturers; it inflates their full color optical resolution by a factor
    of 3-4X.

    The best case for Bayer is a picture of the back of a lens cap. 6M pixels
    all 100% accurate. Kills a 10.3MP Foveon which "dumbly" combines 3 black
    sensors to form the color black. The resolution of the Bayer is almost 74%
    higher, right? Sure, as long as there is no light. If there is light, a
    Bayer requires 3-4 sensors to form each full color, dropping its color
    resolution to 1.5MP, and its luminance resolution to just 2MP. Foveon is
    3.43MP color, 3.43MP luminance.

    But only in light.
     
    George Preddy, Jan 1, 2004
  18. Except resolution is defined as alternating white and black lines although
    in reality for lenses it is measured at a single wavelength which usually
    happens to be green. Lenses have a lower resolution for white light than
    they do for a single monochromatic light source. Green was selected long
    before digital cameras existed, either because it was in the center of the
    spectrum or, more likely, because it is easily generated with a sodium light
    source and filter. But his point was that even if the sensor is perfect, it
    can't have any more resolution in a particular direction than the number of
    rows or columns of pixels in that direction.

    I still don't understand what is absurd about that statement. You can
    certainly decide if it is a meaningful test to you, but it is accurate and
    well within the standard definition of resolution. (Actually that is not
    quite true, since it probably should be done with a sine wave chart and not a
    square wave chart, but that would favor a bayer arrangement even more .)
     
    Gherry Bender, Jan 1, 2004
  19. It is not, because it has only 3.4 million sampling locations. There's
    no way it can conceivably be called a 14 MP camera. Even with 3
    measurements per pixel, there are only 10 million numbers produced by
    one exposure. And it's not a 10 MP camera either.
    They have that many one-colour sensors, but each is located at a unique
    location in the image plane. For resolving luminance detail, the number
    of sensing locations is very important, but the number of colours
    measured at each location is of little importance. Bayer sensor
    resolution results are almost the same as a B&W sensor of the same
    size.
    I say "probably" because we can only guess based on the results of this
    test. The test was done wrong, so it doesn't prove anything
    conclusively.
    In the first place, all these extra colour measurements only help colour
    resolution. Even a 3-colour scanner or 3CCD camera has little or no
    luminance resolution advantage over a Bayer sensor. Second, the Foveon
    sensor doesn't actually measure red, green, or blue - it measures 3
    numbers only vaguely related to RGB, and requires a lot of processing to
    produce anything even vaguely plausible. Properly implemented 3-colour
    sensors should have better colour reproduction than any Bayer sensor,
    but the Foveon is often worse.
    It's a 6 megapixel camera. Dividing that count by 4 is not supported by
    anything other than your imagination. Resolution tests show the
    resolution one would expect from a 6 MP camera, exceeding the Sigma
    cameras, and far exceeding what an actual 1.5 MP camera would produce.
    Your "full color" pixel math is nonsense; it doesn't predict actual
    performance.
    Again, it's 11 MP - your math is nonsense.

    Are these your two "color tests"? The first one was done improperly and
    doesn't apply to real-world photos, as this thread has been discussing.
    You can't use it as a reference that proves itself!

    The second is a pair of photos with little information about how they
    were shot. They were both shot with a 28 mm lens, so the SD10 has the
    advantage of 1.73X greater magnification on the sensor (smaller field
    of view). To match field of view, if the SD10 used a 28 mm lens the
    1Ds should have used 48.5 mm. That would have been possible since the
    test was done with a zoom lens on the Canon - why didn't the tester do
    this?

    Because of the greatly different field of view, the test is comparing
    the full SD10 frame with only about the centre 1/3 of the 1Ds pixels -
    it's discarding 2/3 of the 1Ds image area! The 3.4 MP Sigma full sensor
    is being compared to about 3.7 MP in the centre of the 1Ds sensor,
    discarding the other 7.3 MP of 1Ds data! You call this a fair test?
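    The arithmetic behind that objection is short (the 1.73 crop factor and
    the 11.1 MP count are the approximate figures used in this thread):

```python
# Same 28 mm lens on both bodies: only the centre crop of the full-frame
# image covers the SD10's narrower field of view.

crop = 1.73            # approximate SD10 crop factor
lens_mm = 28.0
pixels_1ds = 11.1e6    # approximate 1Ds pixel count

# Focal length the 1Ds would have needed for an equal field of view:
print(f"{lens_mm * crop:.1f} mm")  # 48.4 mm

# Fraction of the 1Ds frame the comparison actually used:
used_mp = pixels_1ds / crop**2 / 1e6
print(f"{used_mp:.1f} MP used, {pixels_1ds/1e6 - used_mp:.1f} MP discarded")
# 3.7 MP used, 7.4 MP discarded
```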

    What photographer would shoot with a 28 mm lens, then crop away most of
    the image, when they really want the field of view of a 50 mm lens?
    This test makes no more sense than that.

    Despite that, I don't see any real detail in the SD10 images that isn't
    also in the 1Ds image. The SD10 looks crisper, but we know that the
    SPP software does quite a bit of sharpening by default while the Canon
    software does not. And I do see unpleasant aliasing artifacts in the
    vertical brick near the centre of the image, and in the stairway
    railings, in the SD10 image but not the 1Ds image.

    Given the two images as shot, I'd avoid the SD10 and buy the 1Ds - even
    if this was an example of the image quality over the same field of
    view. But it's not. If the photographer had zoomed the lens on the
    1Ds to cover the same FOV as the SD10, instead of shooting a 1.73 times
    wider FOV and then cropping and enlarging the centre, the 1Ds image
    would have 1.73 times more real detail than the example shows.

    Even when handicapped by throwing away 2/3 of the image area, the 1Ds
    beats the SD10. It's just astonishing that you'd refer to this test as
    demonstrating the SD10's superiority, when it's really more evidence
    that the SD10 is not even close to the 1Ds.

    Dave
     
    Dave Martindale, Jan 1, 2004
  20. In a digital context, "optical resolution" is almost always defined as the
    ability to distinguish between two objects without digital interpolation
    (guessing). Digital scanners have been dealing with the problem a lot
    longer, and most people have figured out that scam; Bayer users are just now
    starting to realize they are being ripped off...

    Excerpt from: http://www.scantips.com/basics07.html
    ----------
    For Line art mode, yes, interpolation might be useful, provided we have some
    use for the large image. A 2400 dpi line art scan might be appropriate for a
    real 2400 dpi printer (more likely an imagesetter for newspaper/magazine
    reproduction). The pixels are smaller, and the jaggies on the line edges are
    much reduced. Line art is a special case. If you have a 1200 dpi printer,
    you could scan Line art at 1200 dpi (could, not necessarily should, 300 dpi
    often looks fine). Correspondingly the same for 600 dpi, 300 dpi, and 180
    dpi printers, scan line art at those numbers.

    For Color or Gray Scale modes, no, these interpolated resolutions are NOT
    generally useful at all, the results are too poor. Some sample scans are
    provided (112K) to show why high values of interpolated resolution are NOT
    useful for color photos.
    ----------

    "Line art?" B&W resolution charts? Anything here sound familiar?
    Why don't you think it is absurd to require only B&W "line art" to test full
    color digital sensors? Because you know it can't see color well? That's
    not testing, that's rationalization.

    Certainly Foveon wasn't the intended target of the absurd requirement to use
    only B&W "line art" to test heavily interpolated Bayer color sensors. The
    original idea was to inflate digital's (at the time digital = Bayer)
    performance compared to color film, which like Foveon, is a full color
    sensor.

    Sony will start complaining soon too, their new Bayer sensor will have a
    full color resolution advantage over old Bayer sensors. Sooner rather than
    later, photographers are going to figure out that full color tests do in
    fact make perfect sense when testing full color sensors. Even if
    manufacturers using old Bayer designs wind up taking their ball and stomping
    home.
     
    George Preddy, Jan 1, 2004
