CCD quality

Discussion in 'Digital Photography' started by Charles Douglas Wehner, Nov 10, 2003.

  1. Alfred Molon had asked what the maximum definition of "Prosumer"
    cameras is.

    "Prosumer" presumably means "Professional consumer" - that is, people
    such as wedding photographers who buy a camera in the shops - not one
    built to special order.

    The maximum definition of current CCDs appears to be about 16
    MEGAPIXELS - yes, sixteen million. But that may be only for
    professional astronomers &c.

    Sony and Kodak are the top players:

    http://www.kodak.com/global/en/digital/ccd/products/fullframe/fullframeFamilyMain.jhtml

    The current consumer range seems to be about 6 megapixels maximum.

    To understand the technical limits, consider that a lens cannot
    resolve better than about 50 line-pairs (or 100 pixels) per
    millimetre. That makes each dot (pixel) about ten microns wide.

    The Oscar Barnack Leica format was based on TWO academy-format 18mm by
    24mm (35mm) cinema frames laid together. That is, two 1800 by 2400
    pixel frames. This gives 36mm by 24mm.
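The arithmetic above can be checked in a few lines of Python (an editorial sketch, not part of the original post):

```python
# Checking the resolution arithmetic from the text: 50 line-pairs/mm means
# ~100 pixels/mm, i.e. a 10-micron pixel, and a 36mm x 24mm Barnack frame
# at that density holds 3600 x 2400 = 8.64 million pixels.
lp_per_mm = 50
px_per_mm = 2 * lp_per_mm           # one line-pair = one light + one dark pixel
pixel_pitch_um = 1000 / px_per_mm   # microns per pixel

width_px = 36 * px_per_mm
height_px = 24 * px_per_mm
print(pixel_pitch_um, width_px, height_px, width_px * height_px)
# 10.0 3600 2400 8640000
```

The 8,640,000 figure is the same 8.64 megapixels quoted later in the post as the digital equivalent of the Barnack Leica format.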

    Kodachrome managed to resolve 108 lines per millimetre. The Leitz
    Summicron-R was the world's sharpest 50mm lens with 114 lines per
    millimetre, with the Nikkor H-Auto 105mm portrait lens closely
    following with about 108.

    So if you put the best film and the best lens together, the combined
    fall-off would be 3 dB at about 100 lines per millimetre.

    Crude CCDs use perhaps 20 microns per pixel - mainly for astronomy
    where the large pixel area is needed in order to achieve high
    sensitivity. Non-astronomical CCDs tend to have pixel pitches of nine
    or ten microns.

    However, the structure of each pixel seems to be finer.

    The Bayer pattern lays out FOUR sensitive areas per pixel. So we have
    a row of red-green-red-green, and underneath it we have
    green-blue-green-blue.

    Camera makers use all kinds of "tricks" to advertise their products.
    For example, if a CCD has 320 by 288 pixels they may not advertise it
    as such. They may promise you 640 by 576.

    What does this mean? Simply that they are mixing up the terms
    "sensitive area" and "pixel" - deliberately. A Bayer pixel would be

    RG
    GB

    but they pretend that R is a pixel, G is a pixel and B is a pixel.

    Again, we have the problem of how the CCD is read out into memory.
    Consider the following patterns:

    RGR R R G
    GBG B G G
    RGR R R G

    Here we have a 3-by-3 extract of the Bayer pattern. The B sensitive
    spot is unique to its position, but the R is the COMPOSITE of four
    neighbouring spots in a square array, whilst the G is the COMPOSITE of
    four in a diamond.

    So the blue is sharp whilst red and green are blurred.

    Move along one sensitive area, and we have a sharp green, a square of
    red and a diamond of blue. Here only the green is sharp.

    Keep going to the end of the row, then travel through the CCD row by
    row, and you can boast a unique "pixel" for every sensitive spot on
    the surface.
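The square-and-diamond averaging described above can be sketched as simple bilinear demosaicing (a hypothetical illustration with made-up numbers; real cameras use fancier, often proprietary, algorithms). In an RGGB mosaic, a blue site takes red from its four diagonal ("square") neighbours and green from its four edge ("diamond") neighbours:

```python
# Raw sensor readings; the layout is R G R G / G B G B / R G R G / G B G B,
# so position (1, 1) is a blue-sensitive spot.
values = [
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [90, 100, 110, 120],
    [130, 140, 150, 160],
]

def demosaic_at_blue(v, y, x):
    """Reconstruct full RGB at an interior blue site (y, x)."""
    blue = v[y][x]                                                      # measured
    red = (v[y-1][x-1] + v[y-1][x+1] + v[y+1][x-1] + v[y+1][x+1]) / 4  # square
    green = (v[y-1][x] + v[y+1][x] + v[y][x-1] + v[y][x+1]) / 4        # diamond
    return red, green, blue

print(demosaic_at_blue(values, 1, 1))  # (60.0, 60.0, 60)
```

Only the blue value is a real measurement at that point; the red and green are composites of neighbours, exactly the blurring the post describes.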

    The give-away is the FOCAL LENGTH of the lens. An 8.64 megapixel CCD
    would replicate the definition of the Barnack Leica format, and as
    with 35mm would need a 50mm standard lens or at least a 35mm
    wide-angle.

    I bought a 3.1 mega"pixel" camera with an 8.7 mm lens. It boasted 2048
    by 1600 dots. It was immediately obvious that (1) there were exactly 3
    MEGA pixels where MEGA is 1024 by 1024 - advertised as 3.1
    DECIMAL-mega (2) they were sensitive spots in the array, not pixels.

    An 8.7 mm lens might cover a 10mm by 8mm array - giving 800k pixels.
    Multiply this by FOUR, and you get the fake "pixel" count.

    I was not complaining - I was happy to buy the camera at its low price
    - but I was amused by the pretensions of the vendor.

    The camera saves the image as JPEG. What does one do?

    ACCEPT the lie, as if it were true. To get the best out of the camera,
    save the images as the BIGGEST files you can.

    Take the example of an IMAGINARY manufacturer who has a 320 pixel-wide
    camera and advertises it as 2560. He writes software that goes along
    each row, repeating each pixel eight times : 111111112222222233333333

    That software also repeats the rows eight times:
    11111111222222223333333344444444
    11111111222222223333333344444444
    11111111222222223333333344444444
    11111111222222223333333344444444
    11111111222222223333333344444444
    11111111222222223333333344444444
    11111111222222223333333344444444
    11111111222222223333333344444444
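The imaginary manufacturer's 8× replication can be sketched directly (my illustration of the scheme the post describes, not any vendor's actual firmware):

```python
# Repeat each pixel eight times along the row, then repeat each row
# eight times down, turning a 320-wide image into a "2560-wide" one.
def upscale_8x(image):
    wide_rows = [[px for px in row for _ in range(8)] for row in image]
    return [row for row in wide_rows for _ in range(8)]

tiny = [[1, 2, 3, 4]]          # one row of four real pixels
big = upscale_8x(tiny)
print(len(big), len(big[0]))   # 8 32 -- every 8x8 block is one flat colour
print(big[0][:16])             # [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2]
```

The result carries no more information than the original; it just aligns each real pixel with one 8×8 JPEG block.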

    Then JPEG takes groups of eight by eight:

    11111111
    11111111
    11111111
    11111111
    11111111
    11111111
    11111111
    11111111

    JPEG finds the AVERAGE colour, and then seeks its discrete cosines.
    The block is FEATURELESS, so it finds NO cosine data.

    JPEG passes the spectrum through a filter, which "smears" the cosines.
    As there are none, it does not matter.
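The claim that a featureless block yields no cosine data can be verified with the forward 8×8 DCT that JPEG uses (a sketch of the transform step only; real JPEG also quantizes and entropy-codes the coefficients):

```python
import math

# 2D DCT-II coefficient (u, v) of an 8x8 block, as in the JPEG standard.
def dct2_coeff(block, u, v):
    n = 8
    cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
    cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
    return cu * cv * sum(
        block[y][x]
        * math.cos((2 * x + 1) * u * math.pi / (2 * n))
        * math.cos((2 * y + 1) * v * math.pi / (2 * n))
        for y in range(n) for x in range(n))

flat = [[100] * 8 for _ in range(8)]           # a featureless block
dc = dct2_coeff(flat, 0, 0)                    # the average-colour term
ac = [abs(dct2_coeff(flat, u, v))
      for u in range(8) for v in range(8) if (u, v) != (0, 0)]
print(round(dc), max(ac) < 1e-9)               # 800 True
```

Only the DC (average) term survives; all 63 AC coefficients are zero, so the replicated image compresses without further loss.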

    JPEG packages the result and puts it onto your flash card.

    When you now fetch the image and unpack it, it is presented to
    Photoshop or whatever as a 2560-wide image. You can safely shrink it
    to its proper 320-wide size, and save it as BMP.

    Had you been too clever, and set the camera to its 320-wide setting,
    JPEG would have smeared the few good pixels together and so reduced
    the quality.

    So even if the pixels are partially interpolated, you still need as
    many as possible if you are stuck with JPEG.

    Charles Douglas Wehner
     
    Charles Douglas Wehner, Nov 10, 2003
    #1
    1. Advertising

  2. You really need an SD-9.

    "Charles Douglas Wehner" <> wrote in message
    news:...
    > [SNIP - full quote of post #1]
     
    George Preddy, Nov 10, 2003
    #2

  3. Or, silver halide crystals on acetate (sometimes called "film")
     
    Charles Schuler, Nov 10, 2003
    #3
  4. "Charles Schuler" <> wrote in message
    news:...
    > Or, silver halide crystals on acetate (sometimes called "film")


    It sounds like you'd enjoy the SD-9's non-interpolated image quality,
    especially when developed using a traditional silver halide process.

    It only costs about $5-10 these days for an 8x10 Durst Lambda or Light
    Jet photograph from digital (sizes up to 4 x 8 feet are commonly
    available, up to 20-some feet with some services). Bayer cameras need
    not apply for true non-interpolated 68-billion-color photographs, as
    they are color-interpolated from the outset, as you noted.

    At Photokina, Foveon demonstrated the SD-9's near 56mm medium format film
    parity when developed on a 36-bit non-interpolated Durst or Light Jet, but
    only when enlarged to less than 40 inches.
    http://www.foveon.com/faq_technology.html#FAQ_tech_13

    Some say the 14n approaches 35mm film; others say it still falls short.
    Personally I don't think there is any way a Bayer-interpolated image can
    be compared to film; as you pointed out, the artifacting and up-scaling
    really prohibit parity. To demonstrate how artifact-prone any Bayer
    image is, set the WB manually by plucking a pixel, then pluck the one
    next to it, and the one next to that, and so on: you get really wild
    variations even over smooth gradients. That's not film nor Foveon
    quality.
     
    George Preddy, Nov 11, 2003
    #4
  5. "George Preddy" <> wrote in message
    news:bop8ql$ck3$...
    SNIP
    > At Photokina, Foveon demonstrated the SD-9's near 56mm medium format film
    > parity when developed on a 36-bit non-interpolated Durst or Light Jet, but
    > only when enlarged to less than 40 inches.


    I've seen the SD9 TIFFs and I've seen the Photokina displays. The displayed
    images were not made from the SD9 TIFFs, you fell for another of their
    marketing deceptions.

    Bart
     
    Bart van der Wolf, Nov 11, 2003
    #5
  6. "George Preddy" <> wrote in message news:<bop8ql$ck3$>...
    > "Charles Schuler" <> wrote in message
    > news:...
    > > Or, silver halide crystals on acetate (sometimes called "film")

    >
    > It sounds like you'd enjoy the SD-9's non-interpolated image quality,
    > especially when developed using a traditional silver halide process.
    >


    Because conventional lenses cannot do better than 100 pixels per
    millimetre, if the red-, green- and blue- sensitive areas are less
    than this, it would be difficult for a "houndstooth" or "plaid" jacket
    to produce rainbow effects.

    Because the photoionisation in a CCD gives one electron per photon, we
    start out with a linear system. With film, a "mist" of silver might
    remove 10% of the light, and two "mists" would allow 81% through -
    removing 19%. Three "mists" transmit 72.9% - removing 27.1%. The
    quantity of "silver mist" in the gelatine is proportional to the
    original light, but the absorption of light by the negative is
    NON-LINEAR (10, 19, 27.1). It is related to the LOGARITHM, hence the
    computation of film density as "gamma".
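The "mist" arithmetic above works out as follows (an editorial check of the figures in the post): each layer of silver transmits 90% of the light that reaches it, so transmission falls as 0.9 to the power of the number of layers, absorption is non-linear, and density (the negative log of transmission) grows linearly with the amount of silver:

```python
import math

# One "mist" removes 10%, two remove 19%, three remove 27.1% -- but the
# density, -log10(transmission), rises in equal steps per mist.
for n in (1, 2, 3):
    t = 0.9 ** n
    absorbed = round((1 - t) * 100, 1)   # 10.0, 19.0, 27.1 percent
    density = -math.log10(t)             # proportional to n
    print(n, absorbed, round(density, 4))
```

This linear-density behaviour is why film response is characterised logarithmically, via "gamma", while the CCD's one-electron-per-photon response is linear from the start.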

    Because the CCD module is assembled in a clean-room there is no dust
    on the array. If dust falls on the lens, it is out of focus - just
    spoiling the contrast a little.

    The CCD, at one electron per photon, is about twenty times more
    sensitive than a halide grain - which needs twenty or so photons to
    trigger the development process.

    A cheap CCD might be small, but works with a short-focus lens.
    Everything is sharp from a few inches to infinity. To stop the sun
    burning the CCD, the cheap camera maker has given you a small aperture
    - and this enhances the depth of field.

    So THEORETICALLY everything is BETTER. Of course, the cheapest of the
    cheap will spoil things.

    Sometimes, however, we want to use DIFFERENTIAL FOCUS to separate the
    foreground from the background. A cheap 1.1 megapixel camera gives
    stunning depth-of-field even when you DON'T want it - so you have to
    swap over to large format.

    At a wedding, a cheap CCD will give sharp pictures up to European A4
    /wholeplate /US Foolscap. But what if you want to blow up just ONE
    face in the crowd?

    You get what you pay for.

    Charles Douglas Wehner
     
    Charles Douglas Wehner, Nov 11, 2003
    #6
  7. "Bart van der Wolf" <> wrote in message
    news:3fb0ef02$0$58706$4all.nl...
    >
    > "George Preddy" <> wrote in message
    > news:bop8ql$ck3$...
    > SNIP
    > > At Photokina, Foveon demonstrated the SD-9's near 56mm medium format film
    > > parity when developed on a 36-bit non-interpolated Durst or Light Jet, but
    > > only when enlarged to less than 40 inches.
    >
    > I've seen the SD9 TIFFs and I've seen the Photokina displays. The displayed
    > images were not made from the SD9 TIFFs, you fell for another of their
    > marketing deceptions.

    Yes they were, printed on a Durst Lambda. Many still wonder why a Light Jet
    wasn't used.
     
    George Preddy, Nov 11, 2003
    #7
  8. Charles Douglas Wehner

    Guest


    >
    > Yes they were, printed on a Durst Lambda. Many still wonder why a Light Jet
    > wasn't used.
    >

    Did you see them printed with your own eyes?

    I believe my eyes normally - and my eyes tell me that Sigma SD9 images have
    about the same resolution as a 6MP Bayer, but have aliasing problems
    sometimes, and you have to process every single image on your computer.
     
    Guest, Nov 11, 2003
    #8
  9. Charles Douglas Wehner

    Guest

    Did you see it printed with your own eyes?


    "George Preddy" <> wrote in message
    news:bor0om$22p$...
    >
    > "Bart van der Wolf" <> wrote in message
    > news:3fb0ef02$0$58706$4all.nl...
    > >
    > > "George Preddy" <> wrote in message
    > > news:bop8ql$ck3$...
    > > SNIP
    > > > At Photokina, Foveon demonstrated the SD-9's near 56mm medium format film
    > > > parity when developed on a 36-bit non-interpolated Durst or Light Jet, but
    > > > only when enlarged to less than 40 inches.
    > >
    > > I've seen the SD9 TIFFs and I've seen the Photokina displays. The displayed
    > > images were not made from the SD9 TIFFs, you fell for another of their
    > > marketing deceptions.
    >
    > Yes they were, printed on a Durst Lambda. Many still wonder why a Light Jet
    > wasn't used.
    >
     
    Guest, Nov 12, 2003
    #9
  10. "George Preddy" <> wrote in message news:<bootrn$36v$>...
    > You really need an SD-9.
    >


    GI'US one, then - go on, gi'us one!

    The Foveon sensor is indeed a major advance. It was always a problem
    that red penetrated deeper than green, which was deeper than blue in
    the CCD.

    The Foveon clearly takes advantage of this, and produces a three-layer
    system just like three-layer wet-process film.

    There is a good illustration of the advantages on
    http://www.foveon.com/X3_sharper.html . However, as I have already
    pointed out, few lenses can exceed ten microns. So, conventional CCDs
    with poor lenses do not show the colour fringes as dramatically as
    this. The "poor quality" is used to ADVANTAGE as a "blur filter".

    The SD-9 (http://www.dpreview.com/reviews/sigmasd9/ ) is conspicuously
    a good camera. A major use for the Foveon is in television studios,
    however.

    It is usual on professional equipment to separate out the primaries
    with prisms and colour filters. As the Foveon detects colour by DEPTH
    INTO THE CCD, no prisms are needed.

    Cheap cameras giving 640 by 480 VGA resolution often offer 320 by 240
    QVGA. This is done by "leaping" over a pixel - instead of reading BOTH
    and AVERAGING. Then they leap over a row of pixels.

    Thus a white cigarette-end on the pavement has only a 25% chance of
    landing on the active pixel. If the camera shakes, the cigarette-end
    "twinkles" - and thoroughly distracts the audience. Similarly, ivy
    leaves on the side of a building twinkle.
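The "leaping" versus averaging distinction above can be sketched in one dimension (my illustration with made-up values): a single bright spot, the cigarette end, survives only if it happens to land on a sampled pixel, whereas averaging always records some of it:

```python
def leap(row):
    """Downsample by skipping every other pixel, as cheap QVGA modes do."""
    return row[::2]

def average(row):
    """Downsample by averaging each pair of pixels instead."""
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]

dark = [0, 0, 0, 255, 0, 0, 0, 0]   # one bright spot on a skipped index
print(leap(dark))     # [0, 0, 0, 0] -- the spot vanished entirely
print(average(dark))  # [0.0, 127.5, 0.0, 0.0] -- the spot is preserved
```

As the camera shakes, the spot drifts on and off the sampled positions, which is exactly the "twinkling" described.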

    Foveon point out that they can merge pixels into clusters
    (http://www.foveon.com/X3_vps.html ), which to me seems to be a
    logical extension to CCD technology. An insulated "GATE" - just like
    the gate of a field-effect transistor - sits on top of a "well" where
    the pixel-charge is. If you control your gates correctly, you can
    merge four, nine or sixteen wells into a composite well. Nice touch,
    Foveon designers!

    So yes! I - as a completely impartial observer - can indeed see the
    advantages.

    It is also true that they are at a disadvantage when they quote
    "PIXELS". If a pixel is a two-dimensional area, they have LESS.
    However, in the special case of Foveon one has to multiply the pixels
    by THREE to obtain the number of sensitive three-dimensional areas.

    My website http://www.wehner.org/frames/index.htm shows what happens
    when rasters interact. Move the scroll-bar on the right, and you will
    see the moiré effect. With the Foveon, that is no longer a problem
    with full-size pixels - and I am pleased that they took the idea
    further with "well-clustering" to destroy the raster on smaller
    images.

    Charles Douglas Wehner
     
    Charles Douglas Wehner, Nov 12, 2003
    #10
  11. Charles Douglas Wehner

    Guest

    In message <>,
    (Charles Douglas Wehner) wrote:

    >My website http://www.wehner.org/frames/index.htm shows what happens
    >when rasters interact. Move the scroll-bar on the right, and you will
    >see the moiré effect.


    Unfortunately, this demonstrates nothing about the actual cameras in
    question. Bayers 6mp and up do not have any significant moire problems.
    3.43mp foveons *do* moire due to over-sampling with a sharp lens.

    >With the Foveon, that is no longer a problem
    >with full-size pixels - and I am pleased that they took the idea
    >further with "well-clustering" to destroy the raster on smaller
    >images.


    --

    <>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
    John P Sheehy <>
    ><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
     
    , Nov 13, 2003
    #11
  12. Charles Douglas Wehner

    Ray Fischer Guest

    Re: CCD quality (disinformation)

    Charles Douglas Wehner <> wrote:
    >The maximum definition of current CCDs appears to be about 16
    >MEGAPIXELS - yes, sixteen million. But that may be only for
    >professional astronomers &c.


    Or anyone else who wants to spend $5000+

    [...]
    >The Bayer pattern lays out FOUR sensitive areas per pixel.


    Incorrect. It uses four sensors for four pixels and interpolates
    color data.

    [...]
    >Camera makers use all kinds to "tricks" to advertise their products.
    >For example, if a CCD has 320 by 288 pixels they may not advertise it
    >as such. They may promise you 640 by 576.
    >
    >What does this mean? Simply that they are mixing up the terms
    >"sensitive area" and "pixel" - deliberately. A Bayer pixel would be
    >
    > RG
    > GB
    >
    >but they pretend that R is a pixel, G is a pixel and B is a pixel.


    Because it is.

    >Again, we have the problem of how the CCD is read-out into memory.
    >Consider the following patterns:
    >
    > RGR R R G
    > GBG B G G
    > RGR R R G
    >
    >Here we have a 3-by-3 extract of the Bayer pattern. The B sensitive
    >spot is unique to its position, but the R is the COMPOSITE of four
    >neighbouring spots in a square array, whilst the G is the COMPOSITE of
    >four in a diamond.


    Since the algorithm for interpolating colors isn't proprietary and is
    readily findable on the net, I suggest that you refer to it and stop
    spreading your inaccurate bullshit.

    --
    Ray Fischer
     
    Ray Fischer, Nov 13, 2003
    #12
  13. wrote in message news:<>...
    > In message <>,
    > (Charles Douglas Wehner) wrote:
    >
    > >My website http://www.wehner.org/frames/index.htm shows what happens
    > >when rasters interact. Move the scroll-bar on the right, and you will
    > >see the moiré effect.

    >
    > Unfortunately, this demonstrates nothing about the actual cameras in
    > question. Bayers 6mp and up do not have any significant moire problems.
    > 3.43mp foveons *do* moire due to over-sampling with a sharp lens.


    I have not tried the cameras.

    Bayer-pattern CCDs will produce COLOURED moiré fringing if the lens is
    too sharp, but the effect will be subtle if there are huge numbers of
    pixels.

    I do not know what percentage-area of the Foveon CCD pixel is active.
    If it is small, such as 25%, it is possible for MONOCHROME moiré
    fringing to occur.

    Monochrome fringes are less disturbing - less garish - than colour
    ones.

    My website (above) shows the DELIBERATE production of MONOCHROME
    effects, but there are other pages (http://wehner.org/download.htm ,
    http://www.wehner.org/tools/ , http://www.wehner.org/3d/index.htm )
    where I use the effect for colour or 3D.

    The moment Foveon use their "well-merging" technology for
    smaller-format images, the monochrome fringes will vanish.

    Here then is a summary:

    I had some DUFAYCOLOUR film - film made by painting the plastic blue,
    scraping it at 45 degrees, filling in the grooves with green and then
    scraping again and filling in with red, before a B/W emulsion was
    coated on. You develop this film as a B/W reversal film - and it comes
    out in colour.

    That is just like a conventional CCD.

    After DUFAY and AUTOCHROME came the TRIPACK. Everybody knows that the
    tripack - such as Kodachrome, Kodacolor, Ektachrome, Agfacolor, Adox,
    Perutz, Gevaert, Orwo, Cibacolor, Cibachrome, Ferrania, Ansco, GAF,
    &c. &c. was better.

    That is why there were so many tripack makers.

    The Foveon is a tripack.

    Charles Douglas Wehner
     
    Charles Douglas Wehner, Nov 13, 2003
    #13
  14. Charles Douglas Wehner

    Chris Brown Guest

    In article <>,
    Charles Douglas Wehner <> wrote:
    >
    >After DUFAY and AUTOCHROME came the TRIPACK. Everybody knows that the
    >tripack - such as Kodachrome, Kodacolor, Ektachrome, Agfacolor, Adox,
    >Perutz, Gevaert, Orwo, Cibacolor, Cibachrome, Ferrania, Ansco, GAF,
    >&c. &c. was better.
    >
    >That is why there were so many tripack makers.
    >
    >The Foveon is a tripack.


    Foveon's problem with colour reproduction would seem to stem from the
    reliance on the absorption of silicon to "separate-out" the colours. My
    experiments with SD9 raw files suggest that this is not as good as we'd
    perhaps like it to be, however, and nowhere near as discriminating as the
    specially chosen pigments in mosaic sensors or film. As a result, the
    sampled data needs significant processing to extract RGB data. This step
    seems to introduce noise and artifacts into the image. Unprocessed SD9
    images are pretty much free of the noise and artifacts/posterisation that
    can appear in processed images, but the colours are very unsaturated.
    Significantly, the amount of desaturation appears to vary according to the
    hue of the thing being photographed, with blues coming out looking blue, and
    greens coming out looking almost grey.

    If they can resolve this issue, and actually have the colocated sensors
    produce something much closer to RGB to start with, then these sensors
    should be very impressive indeed. I think they still have some way to go,
    however, and may find the job harder than the manufacturers of mosaic
    sensors, who would seem to have rather more freedom about how they filter
    the light going in to each well for spectral response. The other thing you
    can do with mosaic sensors is increase the number of "primaries" you respond
    to - the new Sony RGBE sensor being a case in point. This will give you
    better colour reproduction, but possibly at the expense of resolved detail.
    However, if the mosaic manufacturers continue to ramp up the pixel count,
    this quickly becomes a non-issue.

    In a couple of years, we could well be looking at >15 megapixel mosaic
    sensors that respond to 4 or even more different colours. Such a sensor
    would give you excellent colour reproduction, but at the expense of
    pixel-level detail. You may well be able to match the resolving power of
    such a sensor with half the pixel count on a Foveon-type sensor, but even if
    such a thing did exist on the market, the colocated sensor would still be at
    a disadvantage in its ability to reproduce colour.
     
    Chris Brown, Nov 13, 2003
    #14
  15. I wrote:

    > >The Foveon is a tripack.

    >


    Chris Brown <_uce_please.com> wrote in message news:<>...
    > >

    > Foveon's problem with colour reproduction would seem to stem from the
    > reliance on the absorption of silicon to "separate-out" the colours. My
    > experiments with SD9 raw files suggests that this is not as good as we'd
    > perhaps like it to be, however, and nowhere near as discriminating as the
    > specially chosen pigments in mosaic sensors or film. As a result, the
    > sampled data needs significant processing to extract RGB data. This step
    > seems to introduce noise and artifacts into the image. Unprocessed SD9
    > images are pretty much free of the noise and artifacts/posterisation that
    > can appear in processed images, but the colours are very unsaturated.
    > Significantly, the amount of unsaturation appears to vary according to the
    > hue of the thing being photographed, with blues coming out looking blue, and
    > greens coming out looking almost grey.


    I looked up "Foveon" then "Color filters", using Google, and found
    http://www.eetimes.com/semi/news/OEG20020211S0075 which explains:

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    Eric Zarakov, vice-president of marketing, says the silicon in each
    layer is treated so that it absorbs different wavelengths of light at
    different depths. The sensor has a resolution of 3.5 million pixels
    for each color plane
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

    This is very reminiscent of the "sensitisers" used in silver-halide
    film.

    How do they do this? Perhaps they use a coloured glass. If so, they
    cannot just buy bulk silicon on the open market but must have it
    specially made.

    >
    > In a couple of years, we could well be looking at >15 megapixel mosaic
    > sensors that respond to 4 or even more different colours. Such a sensor
    > would give you excellent colour reproduction, but at the expense of
    > pixel-level detail. You may well be able to match the resolving power of
    > such a sensor with half the pixel count on a Foveon-type sensor, but even if
    > such a thing did exist on the market, the colocated sensor would still be at
    > a disadvantage in its ability to reproduce colour.


    If they don't improve their sensitizers, they may yet lose market
    share.

    Nevertheless, I feel that the Foveon is a great advance - at least on
    the basis of theory.

    Charles Douglas Wehner
     
    Charles Douglas Wehner, Nov 14, 2003
    #15
  16. Re: CCD quality (disinformation)

    (Ray Fischer) wrote in message news:<bp08ts$7e5$>...

    > Re: CCD quality (disinformation)
    > inaccurate bullshit


    The tone of this impudence speaks VOLUMES.

    > >The Bayer pattern lays out FOUR sensitive areas per pixel.

    >
    > Incorrect. It uses four sensors for four pixels and interpolates
    > color data.


    When computer graphics BEGAN - long, long before the CCD - there was
    just black and white (without grey).

    Accordingly, images had one BIT per PIXEL - and by tradition, a
    computer file containing a graphic image - uncompressed - became known
    as a BITMAP.

    So Microsoft created their BMP (bitmap) system - which continued even
    when there was a whole byte per pixel or even three or four.

    Pixel means PICTURE CELL (a contraction of those words), nothing more.

    A Microsoft BMP file - or any equivalent bitmap, such as on the Apple
    series - has ONE byte per pixel for a 256-colour system. Thus a
    1024-by-1024 pixel array (one megapixel) will have in a Microsoft
    bitmap a 54-byte header, a 1K lookup table and a MEGABYTE of data.

    Save that megapixel array as a 24-bit bitmap (16,777,216 colours),
    however, and we have NO lookup table, but THREE megabytes of picture
    data.
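
    The arithmetic in those two paragraphs can be sketched as follows. This is
    a minimal illustration, not a full BMP parser: the 54-byte header and the
    1K (256 entries of four bytes) palette match the figures in the post, and
    the 4-byte row alignment is a detail of the real uncompressed BMP format.

```python
def bmp_size(width, height, bits_per_pixel):
    """Approximate size in bytes of an uncompressed Windows BMP.

    54-byte header, plus a 4-byte-per-entry palette for the 8-bit
    indexed (256-colour) case, plus the pixel data itself, with each
    row padded up to a multiple of 4 bytes.
    """
    header = 54
    palette = 4 * 256 if bits_per_pixel == 8 else 0
    row = (width * bits_per_pixel // 8 + 3) // 4 * 4  # 4-byte row alignment
    return header + palette + row * height

# The 1024-by-1024 example from the post:
print(bmp_size(1024, 1024, 8))   # 256-colour: header + 1K palette + 1 MB of data
print(bmp_size(1024, 1024, 24))  # true colour: no palette, 3 MB of data
```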

    But STOP THE WORLD! Ray Fischer has changed the rules. He will become
    positively OFFENSIVE if we do not agree with him that it is now THREE
    megapixels in that 1K by 1K array.

    The Foveon X3 has THREE sensors at each point - distributed in DEPTH,
    and so by the Fischer definition of the word, it has three PIXELS at
    each point. If Foveon made a 1024 by 1024 array, it would have 3
    megaSENSORS and therefore 3 megaPIXELS.

    MEGA bullshit - to use Fischer language.

    > >What does this mean? Simply that they are mixing up the terms
    > >"sensitive area" and "pixel" - deliberately. A Bayer pixel would be
    > >
    > > RG
    > > GB
    > >
    > >but they pretend that R is a pixel, G is a pixel and B is a pixel.

    >
    > Because it is.


    RIDICULOUS. The diagram I made shows how one is AVERAGING over NINE
    areas. Only ONE dot is genuinely the colour at the point of
    measurement. The rest are INTERPOLATIONS of what the colour WOULD HAVE
    BEEN had one been able to take a measurement there.

    > >
    > > RGR R R G
    > > GBG B G G
    > > RGR R R G
    > >
    > >Here we have a 3-by-3 extract of the Bayer pattern. The B sensitive
    > >spot is unique to its position, but the R is the COMPOSITE of four
    > >neighbouring spots in a square array, whilst the G is the COMPOSITE of
    > >four in a diamond.


    In my forty years as a Technical Author, Design Engineer and Factory
    Manager in PHOTOELECTRICS and SAFETY EQUIPMENT, I have written many
    thousands of instruction manuals - and nobody has ever come to grief
    by following my instructions.

    It is true that ONE Concorde crashed - but that was a tyre-burst
    causing a wing fire, not the result of any errors in the radar manuals
    I was involved with.

    I have a reputation for accuracy.

    Now, it seems, some juvenile delinquent decides to spread his graffiti
    on the Internet at my expense. Sorry - but you have made NO clear,
    logical statement about anything other than about your senseless
    rancour.

    >
    > Since the algorithm for interpolating colors isn't proprietary and is
    > readily findable on the net, I suggest that you refer to it and stop
    > spreading your inaccurate bullshit.


    I see NO reason to follow a wild-goose chase chosen by YOU.

    The word "MISinformation" suggests a mistake. The word
    "DISinformation" - widely used in espionage - means DELIBERATE
    DECEPTION.

    It is highly PRESUMPTUOUS of you to attribute deliberate deception to
    somebody you do not know.

    There is no great mystery about the "algorithm for interpolating
    colors". It is as I described, except that what I call a "diamond",
    Foveon and others have described as a "square" - because if you turn a
    square through 45 degrees it becomes a diamond.
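
    The "square" and "diamond" neighbour sets can be sketched as a bilinear
    demosaic of one interior pixel of an RGGB mosaic. This is a generic
    textbook method, not any particular camera maker's algorithm, and the
    function name is illustrative only.

```python
def demosaic_pixel(raw, x, y):
    """Bilinear demosaic of one interior pixel of an RGGB Bayer mosaic.

    raw[y][x] holds one sensor value; the pattern is
        R G R G ...
        G B G B ...
    The two colours missing at each site are averaged from the nearest
    same-colour neighbours: the four diagonal neighbours form the
    "square", the four horizontal/vertical neighbours the "diamond".
    Returns an (R, G, B) triple.
    """
    def avg(points):
        return sum(raw[j][i] for i, j in points) / len(points)

    diag  = [(x - 1, y - 1), (x + 1, y - 1), (x - 1, y + 1), (x + 1, y + 1)]
    cross = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    horiz = [(x - 1, y), (x + 1, y)]
    vert  = [(x, y - 1), (x, y + 1)]

    if y % 2 == 0 and x % 2 == 0:            # red site
        return raw[y][x], avg(cross), avg(diag)
    if y % 2 == 1 and x % 2 == 1:            # blue site
        return avg(diag), avg(cross), raw[y][x]
    if y % 2 == 0:                           # green site on a red row
        return avg(horiz), raw[y][x], avg(vert)
    return avg(vert), raw[y][x], avg(horiz)  # green site on a blue row
```

    Whether one calls this "averaging" or "interpolation", the point of
    contention in the thread, the missing colour values are estimated from
    neighbouring sensors rather than measured at the site itself.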

    In academic circles, it is usual for the challenger to provide SOLID
    ACADEMIC PROOF.

    YOU must do the searching. YOU must find the COUNTER-EXAMPLE to what I
    say. I do not have to defend myself against your WRONG ASSUMPTIONS and
    your malice.

    If you cannot deliver a SERIOUS ACADEMIC REFERENCE that proves your
    point, you will have made yourself look ridiculous in a very public
    place.

    Charles Douglas Wehner
     
    Charles Douglas Wehner, Nov 16, 2003
    #16
  17. Re: CCD quality (disinformation)

    "Charles Douglas Wehner" <> wrote in message
    news:...
    > (Ray Fischer) wrote in message

    news:<bp08ts$7e5$>...
    >
    > Pixel means PICTURE CELL (a contraction of those words), nothing more.
    >

    Actually, "Pixel" means Picture Element.

    [snip...]
     
    Daniel W. Rouse Jr., Nov 16, 2003
    #17
  18. Charles Douglas Wehner

    Guest

    Re: CCD quality (disinformation)

    In message <>,
    (Charles Douglas Wehner) wrote:

    >Pixel means PICTURE CELL (a contraction of those words), nothing more.


    Pixel means "picture element".
    --

    <>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
    John P Sheehy <>
    ><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
     
    , Nov 16, 2003
    #18
  19. Charles Douglas Wehner

    Ray Fischer Guest

    Re: CCD quality (disinformation)

    Charles Douglas Wehner <> wrote:
    > (Ray Fischer) wrote in message news:<bp08ts$7e5$>...
    >
    >> Re: CCD quality (disinformation)
    >> inaccurate bullshit

    >
    >The tone of this impudence speaks VOLUMES.


    Ooooeee! "Impudence". Sow-ree, yer lordship.

    >> >The Bayer pattern lays out FOUR sensitive areas per pixel.

    >>
    >> Incorrect. It uses four sensors for four pixels and interpolates
    >> color data.

    >
    >When computer graphics BEGAN - long, long before the CCD - there was
    >just black and white (without grey).
    >
    >Accordingly, images had one BIT per PIXEL - and by tradition, a
    >computer file containing a graphic image - uncompressed - became known
    >as a BITMAP.


    No kidding? Of course there were image files containing more than one
    bit per pixel even in the earliest days, but if you weren't in the
    computer biz back then you wouldn't know about such things.

    >Pixel means PICTURE CELL (a contraction of those words), nothing more.


    Picture ELEMENT.

    >A Microsoft BMP file - or any equivalent bitmap, such as on the Apple
    >series - has ONE byte per pixel for a 256-colour system.


    That too is wrong. They used a color-lookup table. The byte isn't a
    color at all, but is an index into a table of RGB values.

    > Thus a
    >1024-by-1024 pixel array (one megapixel) will have in a Microsoft
    >bitmap a 54-byte header, a 1K lookup table and a MEGABYTE of data.


    Don't care. Everybody's long since moved on. Today's graphics file
    formats contain 8 to 16 bits per channel with one or more channels,
    depending upon how many colors and how many masks.

    >But STOP THe WORLD! Ray Fischer has changed the rules. He will become
    >positively OFFENSIVE if we do not agree with him that it now THREE
    >megapixels in that 1K by 1K array.


    Apparently you're another stupid asshole who doesn't realize that
    there is NOT a 1:1 mapping between pixels and file size. Hasn't
    been for ages.

    >> >What does this mean? Simply that they are mixing up the terms
    >> >"sensitive area" and "pixel" - deliberately. A Bayer pixel would be
    >> >
    >> > RG
    >> > GB
    >> >
    >> >but they pretend that R is a pixel, G is a pixel and B is a pixel.

    >>
    >> Because it is.

    >
    >RIDICULOUS.


    Impressive logic. You don't understand the difference between "pixel"
    and "sensor". You don't understand the difference between "average"
    and "interpolate".

    > The diagram I made shows how one is AVERAGING over NINE
    >areas.


    Your "diagram" doesn't say shit about the algorithm used to create the
    final image. It doesn't "average". It interpolates colors between
    known values. Green is interpolated between green sensors. Blue is
    interpolated between blue sensors. Red is interpolated between red
    sensors. There is no averaging. There is no "combining".

    [...]
    >I have a reputation for accuracy.


    Not any more. Now you have a reputation of ignorance and arrogant
    sloppiness. You don't know the etymology of "pixel". You don't
    understand how color-mapping works. You don't understand how color
    interpolation works. And you don't understand the difference between
    pixels and the data representation of the pixels.

    >> Since the algorithm for interpolating colors isn't proprietary and is
    >> readily findable on the net, I suggest that you refer to it and stop
    >> spreading your inaccurate bullshit.

    >
    >I see NO reason to follow a wild-goose chase chosen by YOU.


    In short, you're full of shit and you don't care what the truth is.

    --
    Ray Fischer
     
    Ray Fischer, Nov 16, 2003
    #19
  20. Charles Douglas Wehner

    Chris Brown Guest

    Re: CCD quality (disinformation)

    In article <bp91ke$37t$>,
    Ray Fischer <> wrote:
    >
    >Don't care. Everybody's long since moved on. Today's graphics file
    >formats contain 8 to 16 bits per channel with one or more channels,
    >depending upon how many colors and how many masks.


    Check out Cinepaint - floating point channels. Shame it's not that stable,
    otherwise it would make a nice companion to Photoshop.
     
    Chris Brown, Nov 17, 2003
    #20
