Highest Megapixels Possible in APS-Cs

Discussion in 'Digital Photography' started by lastico, Apr 27, 2009.

  1. lastico

    lastico Guest

    Hi,

    What's the highest megapixel count possible in APS-C
    DSLRs before noise makes the quality bad... 20
    megapixels? 40 megapixels? There will come
    a time when the pixel sizes will match the point
    & shoot department. Will DSLRs go back to 35mm
    lenses? What are the roadmaps for Nikon, Canon and Sony
    in the years and decades ahead? What new technology
    will produce a 50 megapixel DSLR with lightweight
    lenses like the EFs? Or will DSLRs reach a certain
    limit, like 30 megapixels, where the manufacturers
    no longer push above it but maintain it for
    decades or centuries to come? Or will new
    noise-resistant pixel technology produce 120 megapixels
    or even 1 gigapixel and beyond?

    lastico
    lastico, Apr 27, 2009
    #1

  2. ? "lastico" <> ?????? ??? ??????
    news:...
    > Hi,
    >
    > What's the highest megapixel count possible in APS-C
    > DSLRs before noise makes the quality bad... 20
    > megapixels? 40 megapixels? There will come
    > a time when the pixel sizes will match the point
    > & shoot department. Will DSLRs go back to 35mm
    > lenses? What are the roadmaps for Nikon, Canon and Sony
    > in the years and decades ahead? What new technology
    > will produce a 50 megapixel DSLR with lightweight
    > lenses like the EFs? Or will DSLRs reach a certain
    > limit, like 30 megapixels, where the manufacturers
    > no longer push above it but maintain it for
    > decades or centuries to come? Or will new
    > noise-resistant pixel technology produce 120 megapixels
    > or even 1 gigapixel and beyond?
    >

    No idea; nobody can predict the future. Maybe cameras will evolve in a
    totally different way that we today cannot even imagine. If you think
    about what people in the '80s believed the 21st century would be like,
    you will be amazed. Everybody thought that we would have flying cars,
    colonies on the moon, starships travelling to Jupiter... OTOH, we now
    have mobile phones, the Soviet bloc gone since 1989, the internet and
    the digital revolution in general.



    --
    Tzortzakakis Dimitrios
    major in electrical engineering
    mechanized infantry reservist
    hordad AT otenet DOT gr
    Tzortzakakis Dimitrios, Apr 27, 2009
    #2

  3. Don Stauffer

    Don Stauffer Guest

    Tzortzakakis Dimitrios wrote:
    > ? "lastico" <> ?????? ??? ??????
    > news:...
    >> Hi,
    >>
    >> What's the highest megapixels possible in APS-C
    >> DSLRs before noise makes the quality bad... 20
    >> Megapixels? 40 Megapixels? There will come
    >> a time when the pixel sizes will match the point
    >> & shoot department. Will DSLRs go back to 35mm
    >> lens? What's the roadmaps for Nikon, Canon, Sony
    >> in years, decades ahead?? What new technology
    >> will produce 50 megapixels DSLR with lightweight
    >> lens like the EFs. Or will DSLRs reach a certain
    >> limit like 30 megapixels where the manufacturers
    >> would no longer push it above but maintain it for
    >> decades or centuries to come?? Or will new
    >> pixel technology resistance to noise produce 120 Megapixels or even 1
    >> Gigapixels and beyond?
    >>

    > No idea; nobody can predict the future. Maybe cameras will evolve in a
    > totally different way that we today cannot even imagine. If you think
    > about what people in the '80s believed the 21st century would be like,
    > you will be amazed. Everybody thought that we would have flying cars,
    > colonies on the moon, starships travelling to Jupiter... OTOH, we now
    > have mobile phones, the Soviet bloc gone since 1989, the internet and
    > the digital revolution in general.

    I am not sure of the exact format size of APS-C, so I cannot compute it
    right now. However, even though "photo"lithography has moved to
    submicron feature size, I doubt if sensor pixels will go below 1 micron
    anytime soon. I see real problems with submicron pixels, although it is
    theoretically possible. So a fair benchmark would be an array of 1
    micron pixels.
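
    As a rough check of that benchmark - taking APS-C as about 23.6 x 15.6
    mm, an assumed figure since exact sizes vary by maker - a 1 micron
    array works out to roughly 370 megapixels:

    ```python
    # Back-of-envelope: pixel count of an APS-C sensor at a 1 micron
    # pitch. The 23.6 x 15.6 mm dimensions are an assumption; exact
    # APS-C sizes differ between manufacturers.

    width_mm, height_mm = 23.6, 15.6
    pitch_um = 1.0

    pixels_wide = width_mm * 1000 / pitch_um
    pixels_high = height_mm * 1000 / pitch_um
    print(f"{pixels_wide:.0f} x {pixels_high:.0f} = "
          f"{pixels_wide * pixels_high / 1e6:.0f} Mp")
    # -> 23600 x 15600 = 368 Mp
    ```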
    Don Stauffer, Apr 27, 2009
    #3
  4. Chris H

    Chris H Guest

    In message <>, lastico <> writes
    >Hi,
    >
    >What's the highest megapixel count possible in APS-C
    >DSLRs before noise makes the quality bad...


    Currently about 15MP?

    At least after that they all seem to go FX

    --
    \/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
    \/\/\/\/\ Chris Hills Staffs England /\/\/\/\/
    \/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
    Chris H, Apr 28, 2009
    #4
  5. Matt Ion

    Matt Ion Guest

    Don Stauffer wrote:
    > Tzortzakakis Dimitrios wrote:
    >> [snip]

    > I am not sure of the exact format size of APS-C, so I cannot compute it
    > right now. However, even though "photo"lithography has moved to
    > submicron feature size, I doubt if sensor pixels will go below 1 micron
    > anytime soon. I see real problems with submicron pixels, although it is
    > theoretically possible. So a fair benchmark would be an array of 1
    > micron pixels.


    Any calculations and assumed limitations would also assume makers stay
    with current technologies. A completely new and different sensor
    technology could be discovered tomorrow that would completely invalidate
    the very concept of "megapixels".
    Matt Ion, Apr 28, 2009
    #5
  6. In article <>, lastico <> writes
    >Hi,
    >
    >What's the highest megapixel count possible in APS-C
    >DSLRs before noise makes the quality bad... 20
    >megapixels? 40 megapixels?


    What makes you think there is a limit at all?

    In a conventional sensor, as was the case a couple of years ago, you
    could say that there was no point in making the pixel smaller than the
    diffraction limit of the optical system. In most cases that is around
    f/2, in a few cases it gets down to f/1.2, but few lenses meet this
    theoretical limit of resolution. For green 550nm light that makes the
    optical diffraction cut-off around 1500cy/mm, so there is no point in
    making pixels smaller than 0.33um, since it is impossible to resolve
    more than that with f/1.2. On a typical APS-C sensor of 23x15mm that
    works out at a shade over 3 gigapixels - the absolute limit.

    But who is going to go there, and why, when most lenses are diffraction
    limited at closer to f/4 or f/5.6? That results in a maximum useful
    pixel count of 145Mp, and even that doesn't take account of the pixel
    resolution itself, just the optical limits.
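
    For anyone wanting to check the arithmetic, these figures follow from
    the incoherent diffraction cut-off 1/(lambda x N) and Nyquist sampling
    at two pixels per cycle; a short sketch using the same 23x15mm sensor:

    ```python
    # Diffraction cut-off and the corresponding "maximum useful" pixel
    # count. Assumes the incoherent cut-off frequency 1/(lambda * N) and
    # Nyquist sampling (two pixels per cycle); sensor size 23 x 15 mm as
    # in the post above.

    LAMBDA_MM = 550e-6            # green light, 550 nm expressed in mm
    SENSOR_MM = (23.0, 15.0)

    def max_megapixels(f_number):
        cutoff_cy_per_mm = 1.0 / (LAMBDA_MM * f_number)
        pitch_mm = 1.0 / (2.0 * cutoff_cy_per_mm)     # Nyquist pitch
        return SENSOR_MM[0] * SENSOR_MM[1] / pitch_mm ** 2 / 1e6

    for n in (1.2, 4.0, 5.6):
        print(f"f/{n}: {1.0 / (LAMBDA_MM * n):.0f} cy/mm, "
              f"{max_megapixels(n):.0f} Mp")
    # f/1.2 -> ~1515 cy/mm and ~3170 Mp (the "shade over 3 gigapixels")
    # f/5.6 -> ~325 cy/mm and ~145 Mp (the 145 Mp figure)
    ```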

    However, long before that you will see a change in the sensors
    occurring; indeed, it has already started with some of the SRAW-type
    concepts, which trade resolution for lower noise.

    The idea here is similar to "bit-stream" audio, where a single-bit DAC
    running at 100MHz with digital filtering reproduces better audio than a
    16-bit DAC running at the 44.1kHz rate recorded on CD. Digital
    processing takes that low bandwidth, high dynamic range digital signal
    and converts it into a high bandwidth, low dynamic range stream for the
    DAC, so that it can be simply filtered in the analogue domain to a low
    bandwidth, high dynamic range output without the need for high
    precision analogue components - literally making digital audio as cheap
    as chips!
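
    That trick is easy to sketch. A minimal first-order delta-sigma loop,
    with an arbitrary oversampling ratio and test signal (illustrative
    only, not any particular converter):

    ```python
    import numpy as np

    # First-order delta-sigma modulation: encode a signal as a 1-bit
    # stream at a high oversampling ratio, then recover it with a crude
    # low-pass filter (block averaging). Illustrative sketch only.

    OSR = 256                                   # oversampling ratio
    t = np.linspace(0.0, 1.0, 64 * OSR)
    x = 0.5 * np.sin(2 * np.pi * 3 * t)         # input signal in [-1, 1]

    integrator = 0.0
    prev_bit = 0.0
    bits = np.empty_like(x)
    for i, sample in enumerate(x):
        integrator += sample - prev_bit          # feed back last output
        prev_bit = 1.0 if integrator >= 0 else -1.0
        bits[i] = prev_bit

    # Decimate: average each block of OSR one-bit samples.
    recovered = bits.reshape(-1, OSR).mean(axis=1)
    target = x.reshape(-1, OSR).mean(axis=1)
    print("RMS error:", np.sqrt(np.mean((recovered - target) ** 2)))
    ```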

    Why should the sensor just accumulate photons in pixel-sized buckets
    that require high dynamic range analogue and ADC components to get
    good quality images? Why not make the pixels, and the size of their
    individual buckets, smaller and then use digital processing based on
    the known lens parameters to compute just the information that the
    lens at that aperture and zoom position can actually resolve? For
    example, a 3Gp sensor with only a few hundred electrons of capacity
    per pixel could, with appropriate processing capacity, meet the
    resolution limits of all current optics with a signal to noise ratio
    that would outperform all of today's sensors.
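
    A toy simulation makes the counting argument concrete: a dense plane
    of 1-bit sites, block-averaged down to output pixels, recovers smooth
    tone from purely binary samples. The sizes and values below are made
    up for illustration:

    ```python
    import numpy as np

    # Toy model of an oversampled 1-bit sensor: each fine site holds at
    # most one photo-electron, and output pixels are formed by averaging
    # blocks of sites. All sizes here are made up for illustration.

    rng = np.random.default_rng(0)
    BLOCK = 64                        # fine sites per output pixel side
    scene = np.array([[0.2, 0.8],
                      [0.5, 0.1]])    # "true" intensities, 0..1

    # Each fine site fires with probability equal to the local intensity,
    # giving a plane of single-bit samples.
    fine = rng.random((2 * BLOCK, 2 * BLOCK)) < np.kron(
        scene, np.ones((BLOCK, BLOCK)))

    # Downsample: the mean of 64 x 64 = 4096 one-bit sites gives roughly
    # 12 bits of equivalent precision per output pixel.
    estimate = fine.reshape(2, BLOCK, 2, BLOCK).mean(axis=(1, 3))
    print(np.round(estimate, 3))      # close to `scene`
    ```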

    Take it another step and simply record the position at which each
    photon has induced a free electron on the focal plane, and process
    that data on-chip. Based on today's devices of, say, a 5um pixel with
    25,000 electrons per pixel, that would be 0.4Tp for APS-C - with 1
    electron each. The digital output off the chip could be 50Mp with
    12-bit dynamic range if the lens was at its sweet spot, or only 1Mp
    with 18-bit dynamic range if the lens was optically limited. You could
    have a variable pixel density and dynamic range - just like VBR MP3
    coding - across the image, optimally encoding the image to account for
    the central sweet spot and the aberration-limited corners.
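
    Those dynamic range figures check out, to within rounding, by sharing
    the 0.4T one-bit sites among the output pixels (my arithmetic):

    ```python
    import math

    # Tonal depth from binning 1-bit sites: 0.4e12 sites shared among N
    # output pixels gives about log2(sites per pixel) bits per pixel.

    SITES = 0.4e12
    for out_pixels in (50e6, 1e6):
        per_pixel = SITES / out_pixels
        print(f"{out_pixels / 1e6:.0f} Mp -> {per_pixel:,.0f} sites/pixel"
              f" = ~{math.log2(per_pixel):.0f} bits")
    # 50 Mp -> 8,000 sites/pixel   = ~13 bits (the "12-bit" figure)
    #  1 Mp -> 400,000 sites/pixel = ~19 bits (the "18-bit" figure)
    ```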

    No, you are nowhere near the practical limits with today's technology
    and a long way from the limits of theory.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, Apr 29, 2009
    #6
  7. jstuedle
    The limit will not be the pixel count of the sensor, but the ability of
    the lens to resolve that many pixels. About 55 to 60 lp/mm (line pairs
    per millimeter) is the maximum for the best quality glass currently
    available - and that is in the center, where quality is best; quality
    in lp/mm falls off as you approach the corners.

    Another limit will be the autofocus technology and how it is
    implemented. With current A.F. implementations, slight dithering
    occurs, and at maximum aperture on larger-aperture lenses most high
    resolution lenses look out of focus when viewed at actual pixels on a
    larger monitor. This will only become more of an issue as pixel count
    goes up.

    Most industry insiders believe 30-34 MP is the limit on full frame
    sensors with the current state of the art in pro lens design,
    production and autofocus technology. To surpass this benchmark will
    require a substantial investment by the working pro in glass and camera
    bodies, as breaking the barriers now in place will most likely require
    hardware not backward compatible with existing pro-level gear.
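
    For scale, a bare Nyquist conversion of that lens figure into a
    full-frame pixel count looks like this (the 36x24mm frame and the
    two-pixels-per-line-pair assumption are mine):

    ```python
    # Rough conversion of a lens resolution figure (lp/mm) into a sensor
    # pixel count, assuming strict Nyquist sampling: two pixel rows per
    # line pair. Full frame assumed to be 36 x 24 mm.

    def megapixels(lp_per_mm, width_mm=36.0, height_mm=24.0):
        px_per_mm = 2 * lp_per_mm
        return width_mm * height_mm * px_per_mm ** 2 / 1e6

    for lp in (55, 60):
        print(f"{lp} lp/mm -> {megapixels(lp):.1f} Mp full frame")
    # 55 lp/mm -> 10.5 Mp; 60 lp/mm -> 12.4 Mp at strict Nyquist.
    # Real sensors oversample well beyond Nyquist (to tame aliasing and
    # the Bayer mosaic), which is one way to square these numbers with
    # the 30-34 MP estimate above.
    ```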
    jstuedle, Apr 29, 2009
    #7
  8. In article <>, Alfred Molon
    <> writes
    >Don't forget that full colour sensors are the future.


    With that level of oversampling it doesn't matter whether the sensors
    are "full colour" at each site or a Bayer type matrix of single colours.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, Apr 29, 2009
    #8
  9. Alfred Molon <> wrote:

    > Don't forget that full colour sensors are the future.


    It's a shame we won't be able to upgrade our eyes :)

    --
    Chris Malcolm
    Chris Malcolm, Apr 29, 2009
    #9
  10. jdear

    jdear Guest

    On Apr 28, 6:05 pm, Kennedy McEwen <> wrote:
    > [snip]
    >
    > Why should the sensor just accumulate photons in pixel-sized buckets
    > that require high dynamic range analogue and ADC components to get
    > good quality images? Why not make the pixels, and the size of their
    > individual buckets, smaller and then use digital processing based on
    > the known lens parameters to compute just the information that the
    > lens at that aperture and zoom position can actually resolve?


    I don't see how your idea would work. The single-bit ADCs you talk
    about are sigma-delta converters; that is, the next sample is either
    one bit value higher or one bit value lower than the PREVIOUS sample.
    Still image data is not temporal data, so there is no previous data to
    compare to. There is spatial data that can be used for resolution
    purposes and, to a very limited extent, for intensity purposes (blown
    highlights). A single-bit sensor wouldn't know the difference between
    a patch of grey and a patch of white in the scene. Adding dither to
    the sensor would help, but not enough.
    jdear, Apr 29, 2009
    #10
  11. In article
    <>,
    jdear <> writes
    >I don't see how your idea would work. The single-bit ADCs you talk
    >about are sigma-delta converters; that is, the next sample is either
    >one bit value higher or one bit value lower than the PREVIOUS sample.


    No, that isn't how sigma-delta ADCs work.
    See http://en.wikipedia.org/wiki/Digital_to_analog_converter

    These are oversampling DACs with noise shaping digital filters. The
    same effect can be achieved in ADCs with the digital filter on the
    output, as I suggest.

    >Still image data is not temporal data, so there is no previous data
    >to compare to.


    Ignoring your previous error, this is also irrelevant: whilst there is
    no temporal previous data, there is adjacent spatial data - and,
    unlike the time axis, there are two relevant orthogonal spatial axes.


    >A single-bit sensor wouldn't know the difference between a patch of
    >grey and a patch of white in the scene.

    That is precisely the point. At the individual sample there are only
    two levels; however, the sampling density is so high - much higher
    than the optical capabilities of any lens - that the useful image
    information is obtained by averaging over a dynamically variable area,
    matched to the capabilities of the lens. What you are measuring is a
    canonical ensemble: the electron density on the silicon. Current
    devices effectively have a single-bit sensor already - individual
    electrons - but they are spatially clustered, averaged over relatively
    large, fixed areas which are below the resolution of the best glass.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, Apr 30, 2009
    #11
  12. In article <>, Alfred Molon
    <> writes
    >In article <>, Kennedy McEwen
    >says...
    >
    >> With that level of oversampling it doesn't matter whether the sensors
    >> are "full colour" at each site or a Bayer type matrix of single colours.

    >
    >It does. Since you can't increase the pixel count forever, the future
    >is better pixels, and full colour pixels are better than single
    >colour ones.


    You have clearly missed the point, which is to eliminate pixels on the
    focal plane entirely by increasing the sampling density significantly
    beyond what can be resolved by any practical optic. The output pixels
    are downsampled from a super-resolution sampling scale. You don't need
    to increase the pixel count forever. Once you are past the point where
    an optical system can differentiate the spatial extent of any colour,
    spatially separated colour sensors (e.g. Bayer) cease to have any
    differentiation from full colour, spatially coherent sensors (e.g.
    Foveon). However, single photo-electron localisation requires several
    orders of magnitude higher sampling density than the optical
    diffraction limit, so it is way beyond the point where it becomes
    possible to optically separate colours spatially.

    An analogy of this is to couple a conventional Bayer sensor to a very
    poor resolution lens, something like a Lensbaby with a >20Mp sensor.
    Since the Lensbaby cannot resolve any more than, say 10lp/mm, it cannot
    differentiate the spatial difference of the 6um or so between coloured
    pixels on the Bayer sensor. Rather, at 10lp/mm the smallest pixel that
    the lens can resolve is about 50x50um, which is just a shade more than
    8x8 Bayer pixels. Consequently after spatially filtering the image back
    to the maximum resolution that the Lensbaby can reproduce, each output
    pixel IS a full colour pixel. There need be no Bayer demosaicing
    process, because the lens resolution comes nowhere near the sensor
    resolution. Simply integrating the electron density in each colour
    within an area determined by the lens resolution limit is enough.
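
    Putting rough numbers on the analogy (my arithmetic for the figures
    quoted above):

    ```python
    # The Lensbaby analogy in numbers: a lens resolving only 10 lp/mm
    # cannot separate detail finer than ~50 um, so each resolved spot
    # spans a whole block of 6 um Bayer pixels - every output pixel then
    # has all three colours sampled within it.

    LENS_LP_MM = 10
    BAYER_PITCH_UM = 6.0

    spot_um = 1000.0 / (2 * LENS_LP_MM)   # smallest resolvable half-cycle
    pixels_per_spot = spot_um / BAYER_PITCH_UM
    print(f"{spot_um:.0f} um spot ~ {pixels_per_spot:.1f} x "
          f"{pixels_per_spot:.1f} Bayer pixels")
    # -> 50 um spot ~ 8.3 x 8.3 Bayer pixels ("a shade more than 8x8")
    ```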

    Now you can argue that if each Bayer sensor was sensitive to three
    colours in a Foveon or similar arrangement then you would get more
    sensitivity, but that is just a limitation of the analogy, not of the
    concept. With the 1-electron capacity at the ideal sensor resolution
    limit, and hence single-bit depth, that argument just ceases to be
    relevant.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, May 1, 2009
    #12
  13. In article <>, Alfred Molon
    <> writes
    >In article <>, Kennedy McEwen
    >says...
    >
    >> Now you can argue that if each Bayer sensor was sensitive to three
    >> colours in a Foveon or similar arrangement then you would get more
    >> sensitivity,

    >
    >... and that's essentially the reason why I was suggesting full
    >colour pixels. If you capture only one colour component per spatial
    >area, you are throwing away 2/3 of the incoming light, i.e. you have
    >only 1/3 of the sensitivity.
    >

    Not if you are localising individual photo-electrons with 1-bit dynamic
    range! Take a look at some of the non-Foveon concepts for full colour
    pixels - the photoelectrons are stored on the silicon in different
    spatial locations depending on their colour. Spatial separation of
    colour signals does not always result in "throwing away 2/3 of the
    incoming light", even though that happens with Bayer arrays.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, May 1, 2009
    #13
  14. jdear

    jdear Guest

    On Apr 30, 12:00 am, Kennedy McEwen <> wrote:

    > >A single-bit sensor wouldn't know the difference between a patch of
    > >grey and a patch of white in the scene.

    >
    > That is precisely the point. At the individual sample there are only
    > two levels; however, the sampling density is so high - much higher
    > than the optical capabilities of any lens - that the useful image
    > information is obtained by averaging over a dynamically variable
    > area, matched to the capabilities of the lens. What you are
    > measuring is a canonical ensemble: the electron density on the
    > silicon. Current devices effectively have a single-bit sensor
    > already - individual electrons - but they are spatially clustered,
    > averaged over relatively large, fixed areas which are below the
    > resolution of the best glass.


    No, it won't work.

    Think about this: you have a scene like this:

    00000000000000000000
    00000000000000000000
    00000000000000000000
    00000000000000000000
    00000000000000000000
    00000111111111100000
    00000111111111100000
    00000111111111100000
    00000111111111100000
    00000111111111100000
    00000000000000000000
    00000000000000000000
    00000000000000000000
    00000000000000000000

    The zeros are not exposed, the 1s are. Since this is a single-bit
    system, there is no way of telling whether some of the 1s were
    brighter than the others. Using adjacent bits won't help, as there
    could be a dim spot in the center that was still bright enough to be
    recorded as a 1. Too much information has been lost and cannot be
    recovered. This same pattern would be recorded regardless of how
    bright the center spot is, as long as it was bright enough to cause a
    1 to be recorded.

    The only way this system could work is if you had a series of VERY
    short exposures and you stacked the exposures. That amounts to reading
    every photon as it comes in, in real time.
    jdear, May 1, 2009
    #14
  15. nospam

    nospam Guest

    In article <>, Alfred
    Molon <> wrote:

    > ... and that's essentially the reason why I was suggesting full
    > colour pixels. If you capture only one colour component per spatial
    > area, you are throwing away 2/3 of the incoming light, i.e. you have
    > only 1/3 of the sensitivity.


    the 2/3 that is 'thrown away' is reconstituted later and if the sensor
    outresolves the lens, there's essentially no loss at all. plus, having
    a full colour pixel would mean a raw file that's three times as big,
    requiring three times the bandwidth in the camera as well as three
    times as much flash and hard drive storage.

    > Besides I suspect that the additional circuitry to read and process a
    > pixel occupies sensor area, which can otherwise be used to capture
    > light. So, instead of having three single colour pixels it's better to
    > have just one full-colour pixel.


    except that there's a noise penalty to pay, even with some of the
    non-foveon designs, so you have a full colour pixel that doesn't work
    as well at higher isos.
    nospam, May 2, 2009
    #15
  16. In article
    <>,
    jdear <> writes
    >
    >No, it won't work.
    >
    >Think about this: you have a scene like this:
    >
    >00000000000000000000
    >00000000000000000000
    >00000000000000000000
    >00000000000000000000
    >00000000000000000000
    >00000111111111100000
    >00000111111111100000
    >00000111111111100000
    >00000111111111100000
    >00000111111111100000
    >00000000000000000000
    >00000000000000000000
    >00000000000000000000
    >00000000000000000000
    >
    >The zeros are not exposed, the 1s are.


    That is an impossible situation, since we are talking about sampling
    many times finer than the optical resolution of any lens. You simply
    can't achieve areas of the sensor exposed as above; there is just a
    photo-electron density transition at the optical resolution. Lens
    resolution is finite, even for an optically perfect lens.
    Photoelectrons can be localised at much finer precision than the
    optical lens can resolve.

    You don't need to know that some of the 1s are brighter than others,
    because they aren't - it's the total number of 1s in the entire area
    that matters. In your diagram above, with 280 possible positions in
    what would be only part of an optically resolved sample, there is just
    over 8 bits of equivalent precision with a simple single-order digital
    filter, without any noise shaping.
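
    The precision figure is just the binary count - 280 one-bit sites can
    register anywhere from 0 to 280 electrons, i.e. 281 distinct levels:

    ```python
    import math

    # 280 one-bit sites distinguish 281 possible counts (0..280),
    # i.e. "just over 8 bits" of equivalent precision.
    print(math.log2(281))   # ~8.13
    ```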

    Your issue with exposure time is also misguided. Photoelectrons are
    generated by photons absorbed by the silicon; they don't just
    disappear, they remain there until something allows them to move off
    the focal plane. The concept isn't so different from the original
    photocathodes in CRT sensors, just in solid state at a finer
    resolution.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, May 2, 2009
    #16
  17. nospam

    nospam Guest

    In article <>, Alfred
    Molon <> wrote:

    > To put it in other terms, a Bayer sensor uses only 1/3 of incoming
    > light, therefore the full-colour sensor is 3 times as sensitive.


    with foveon, there's also a loss of sensitivity from slicing a pixel
    into three layers, so each 'colour' gets 1/3 the light (simplified),
    along with the losses in the conversion to rgb. with nikon's dichroic
    mirror patent, the 3 colour receptors have to fit into the space of one
    pixel (plus the mirrors). a smaller receptor results in a lower s/n
    ratio.

    real cameras are consistent with that loss. the nikon d300 goes to iso
    3200 with 6400 in extended mode. the sigma sd14 goes to iso 800, with
    iso 1600 in extended mode. plus, iso 800 on the sigma is pretty bad,
    worse than 3200 on a nikon d300.

    > > plus, having
    > > a full colour pixel would mean a raw file that's three times as big,
    > > requiring three times the bandwidth in the camera as well as three
    > > times as much flash and hard drive storage.

    >
    > Not much of an issue, since memory is cheap and getting cheaper. Besides
    > I suspect that you could use some data compression to reduce the memory
    > usage.


    regardless of its price, full colour raw images will have three times
    as much data to move and store as bayer images. compression doesn't
    matter, since bayer raw images can be compressed too. in fact, bayer
    *is* a form of slightly lossy compression.
    nospam, May 2, 2009
    #17
  18. J. Clarke

    J. Clarke Guest

    Kennedy McEwen wrote:
    > [snip]
    >
    > You don't need to know that some of the 1s are brighter than others,
    > because they aren't - it's the total number of 1s in the entire area
    > that matters. In your diagram above, with 280 possible positions in
    > what would be only part of an optically resolved sample, there is
    > just over 8 bits of equivalent precision with a simple single-order
    > digital filter, without any noise shaping.
    >
    > [snip]


    Sounds like you're proposing that the existing systems be replaced with some
    kind of digital half-toning with very, very small active sites. That,
    unless there is a huge breakthrough in sensor technology, is going to flush
    your sensitivity down the toilet.
    J. Clarke, May 2, 2009
    #18
  19. nospam

    nospam Guest

    In article <>, Alfred
    Molon <> wrote:

    > > with foveon, there's also a loss of sensitivity from slicing a pixel
    > > into three layers, so each 'colour' gets 1/3 the light (simplified),

    >
    > No. With a full colour sensor each photon is captured, while with Bayer
    > 2/3 of the photons are thrown away.


    they may be thrown away but the data can be regenerated. furthermore,
    each individual foveon layer is itself noisier than one thick pixel.

    > > along with the losses in the conversion to rgb.

    >
    > What losses do you mean?


    unlike what sigma's ads show, foveon doesn't capture true rgb, but
    rather three overlapping spectra that must be converted to rgb. for
    instance, the top layer is mostly blue but has a lot of green and even
    some red. the middle layer is mostly green but has a lot of blue and
    red. simplifying, to get rgb the layers need to be subtracted, which
    reduces the s/n ratio. the conversion is actually quite complex, and
    is part of the reason why the camera and software are so slow.

    > > in fact, bayer
    > > *is* a form of slightly lossy compression.

    >
    > It's not. With bayer the missing colour data is guessed (incorrectly for
    > most pixels).


    it's calculated, not guessed. it's also very accurate except in a
    couple of edge cases that don't matter in the real world. if most of
    the bayer pixels were as inaccurate as you claim, then why do the
    photos look as good as they do?

    foveon also calculates the pixels (or should i say guesses). in fact,
    there's more guessing in foveon than with bayer. if you take multiple
    photos in succession with a sigma camera you might get different
    colours. there are even differences between multiple cameras, which
    shouldn't happen if it was more accurate.
    nospam, May 2, 2009
    #19
  20. In article <>, Alfred Molon
    <> writes
    >In article <>, Kennedy McEwen
    >says...
    >
    >> Not if you are localising individual photo-electrons with 1-bit dynamic
    >> range! Take a look at some of the non-Foveon concepts for full colour
    >> pixels - the photoelectrons are stored on the silicon in different
    >> spatial locations depending on their colour.

    >
    >That sounds like Foveon... or are you suggesting that you are not
    >capturing all colours at every location on the sensor? Then you are
    >throwing away photons.


    No, Foveon use penetration depth to differentiate photon energy and
    thus wavelength; however, that isn't the only way, and spatial storage
    does not mean that photons are "thrown away", as you call it, just
    because that is what Bayer filtering does - and even Bayer throws away
    much less than the 2/3 you claim.

    To explain how this is feasible, start by recognising that the image
    sensing and storage operations are completely separate functions. The
    sensor can image every part of the focal plane, whilst the sensing and
    storage use only part of it. That happens already, using microlenses.
    These enable image capture across a much larger area than that of the
    underlying photodiode. Indeed, recent gap-less microlens designs
    capture 100% of the incident photons. Since that is today's technology
    I don't expect you to disagree that it is practical - all of the
    photons incident on the focal plane are captured by the microlenses.
    Only the Bayer filter of current sensors causes some photons to be
    filtered out in parts of the focal plane and thus lost. Without Bayer
    filters, all of the photons would be stored on the underlying
    photodiode area - but without colour discrimination.

    Now, instead of uniform microlenses, let's replace them with
    microprisms, or highly chromatic microlenses, or even diffraction
    gratings - indeed, almost any systematic structure on the focal plane
    at this resolution, less than the wavelength of light, will produce
    spatial chromatic dispersion. Rather than separating the wavelengths
    by penetration depth in the silicon, as with Foveon, you now have a
    spatial chromatic separation. All of the incident photons are captured
    and separated spatially by photon energy, and hence by wavelength and
    colour. Remember, this is at a resolution well beyond anything the
    lens can achieve, so there is no spatial image content at this level.
    All photons are captured; all produce photoelectrons which are
    localised by wavelength at a resolution well in excess of optical
    resolutions.

    Thus the spatial location of the photoelectron determines what
    wavelength of photon generated it and, since we are grossly oversampling
    the optical resolution, the electron density can readily be downsampled
    to optical resolution levels in full colour by noise shaping filters as
    in bitstream technology.
    --
    Kennedy
    Yes, Socrates himself is particularly missed;
    A lovely little thinker, but a bugger when he's pissed.
    Python Philosophers (replace 'nospam' with 'kennedym' when replying)
    Kennedy McEwen, May 2, 2009
    #20
