B&H Photo gets it right, fixes error on website - Sigma SD10 is 3.4 Megapixel

Discussion in 'Digital Photography' started by Steven M. Scharf, Oct 8, 2004.

  1. Steven M. Scharf, Oct 8, 2004
    #1

  2. Steven M. Scharf

    Guest

    In message <Bcy9d.6790$>,
    "Steven M. Scharf" <> wrote:

    >http://www.bhphotovideo.com/bnh/controller/home?O=productlist&A=details&Q=&sku=305163&is=REG
    >
    >Sigma Sigma SD10, 3.4 Megapixel, SLR, Digital Camera Kit with Sigma 18-50mm
    >& 55-200mm Lenses
    >
    >Looks like my e-mail to them had results!


    It still says "3.4M Red + 3.4M Green + 3.4M Blue = 10.2 Megapixel Total
    Resolution", which is not true.
    --

    <>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
    John P Sheehy <>
    ><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
    , Oct 11, 2004
    #2

  3. Let's do the math.

    My Nikon 5700 has 2560 x 1920 sensors, each producing 12 bits of data. Forget
    colour for a moment here, and that is 4,915,200 distinct sensors. If we
    multiply the sensors by 12 and divide by 8 we get 7,372,800 bytes, which is in
    fact the size of a RAW file. Now, the camera uses a Bayer filter across the
    sensor array, so some sensors represent RED values, some GREEN and some
    BLUE. When you grab the data from the camera using Nikon Capture, it creates
    an image that is 2560 x 1920 pixels, each pixel having 36 bits of data. In
    this case, though, each pixel now represents all three colours, RED, GREEN
    and BLUE. Where did all this extra data come from, you ask? The Nikon Capture
    program processed the incoming RAW data and interpolated values for colours
    for each pixel.
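    The Nikon arithmetic above can be checked in a few lines of Python (a
    sketch of the bare sensor-data size only; a real NEF file also carries
    headers and metadata, so the file on disk is somewhat larger):

```python
# Nikon 5700: 2560 x 1920 photosites, 12 bits each.
width, height, bits_per_sensor = 2560, 1920, 12

sensors = width * height                     # distinct photosites
raw_bytes = sensors * bits_per_sensor // 8   # packed 12-bit samples

print(f"{sensors:,} sensors")   # 4,915,200 sensors
print(f"{raw_bytes:,} bytes")   # 7,372,800 bytes
```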

    The SIGMA SD10 has 2268 x 1512 sensors in a RED plane, 2268 x 1512 sensors in
    a GREEN plane and 2268 x 1512 sensors in a BLUE plane, and each sensor in
    each plane produces 12 bits of data. This is effectively 2268 x 1512 x 3,
    which is 10,242,288 sensors. If we multiply this by 12 and divide by 8 we
    get 15,363,432 bytes. Would anyone care to confirm that this is the size of a
    RAW data file from the SIGMA? When the image data is read from the camera,
    the program does not have to interpolate values for each pixel, as the
    camera already produces 36 bits of data per pixel. The result should be a
    truer colour image.

    The Nikon produces a great image, but the actual pixel values are derived
    from a mathematical algorithm, either by Nikon Capture in the case of RAW
    data, or by the smaller, more compact algorithm inside the camera, in the
    case of JPEG.

    Now, I have done 5 years of work in Digital Image Processing, specifically
    dealing with Earth Resource Satellite data and RADAR data. I can assure you
    that interpolating image data values is a very strong science and produces
    amazing if not uncanny results, but it is all based on probability. When
    creating data by interpolation, you form the new data from the nearest
    neighbours' data, assigning a weight of influence to the adjacent sensors
    that falls off rapidly as the distance between target and input data
    increases. The algorithms used work on specific colours and then combine the
    results to give you your final image in pixels.
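    The weighting scheme described above can be sketched in one dimension (a
    toy illustration of inverse-distance weighting, not any camera maker's
    actual algorithm; real demosaicing works in two dimensions, per colour
    channel):

```python
# Inverse-distance-weighted interpolation: each known sample influences
# the target in proportion to a weight that falls off rapidly with
# distance (here as 1/d^power).
def idw(target_x, samples, power=2.0):
    """samples: list of (x, value) pairs of known data points."""
    num = den = 0.0
    for x, value in samples:
        d = abs(target_x - x)
        if d == 0:
            return value          # target coincides with a sample
        w = 1.0 / d ** power      # influence falls off as distance grows
        num += w * value
        den += w
    return num / den

# Estimate a missing value midway between two neighbours (equal weights):
print(idw(1.5, [(1.0, 10.0), (2.0, 20.0)]))  # 15.0
```

    Moving the target nearer one neighbour shifts the estimate sharply
    toward that neighbour's value, which is the "falls off rapidly" effect
    the post describes.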

    A Pixel is a Picture Element, and the SIGMA SD10 by the nature of its
    design has in fact 2268 x 1512 pixels in its array, unlike all other
    non-Foveon cameras, which only have a distinct bank of sensors in their arrays.

    By its very design, the SIGMA has to produce truer colour representation
    than that of an interpolated-value camera, as almost all others are. I believe
    Polaroid has just signed on to use a Foveon sensor, and this is probably
    a good thing.

    If you want to know how good the math in this stuff is, consider that one can
    process a satellite image and, if lucky, with the right filtering applied to
    an image, see the disturbance pattern produced by a submarine under water.
    So, although you cannot see the submarine, the disturbance it leaves in the
    water can be processed for, and you can thus say that a submarine was under
    the water. The chance of capturing a satellite image with a submarine
    under the water's surface at the same time is very low, but with
    a good algorithm and some hard work, you can find the signs of the submarine
    in the image.

    rtt




    ----== Posted via Newsfeeds.Com - Unlimited-Uncensored-Secure Usenet News==----
    http://www.newsfeeds.com The #1 Newsgroup Service in the World! >100,000 Newsgroups
    ---= East/West-Coast Server Farms - Total Privacy via Encryption =---
    Richard Tomkins, Oct 11, 2004
    #3
  4. Steven M. Scharf

    Guest

    In message <416af83c$1_4@127.0.0.1>,
    "Richard Tomkins" <> wrote:

    >Let's do the math.
    >
    >My Nikon 5700 has 2560 x 1920 sensors each producing 12 bits of data. Forget
    >colour for a moment here, and that is 4,915,200 distinct sensors. If we
    >multiply the sensors by 12 and divide by 8 we get 7,372,800 MB, which is in
    >fact the size of a RAW file. Now, the camera uses a Bayer Filter across the
    >Sensor array, so some sensors represent RED values, some GREEN and some
    >BLUE. When you grab the data from the camera using Nikon Capture, it creates
    >an image that is 2560 x 1920 pixels, each pixel having 36 bits of data. In
    >this case though, each pixel now represents all three colours, RED, GREEN
    >and BLUE. Whrre did all this extra data come from you ask? The Nikon Capture
    >program processed the incoming RAW data and interpolated values for colours
    >for each pixel.


    Yes. Very accurate (but not perfect) at the pixel level for luminance,
    with full resolution. Less accurate for hue, and at 1/2 the resolution.

    >The SIGM SD10 has 2268 X 1512 sensors in a RED plane, 2268 x 1512 sensors in
    >a GREEN plane and 2268 x 1512 sensors in a BLUE plane, and each sensor in
    >each plane produce 12 bits of data. This is effectively, 2268 x 1512 x 3,
    >which is 10,242,288 sensors. If we multiply this by 12 and divide by 8 we
    >get, 15,363,432 MB. Would anyone care to confirm that this is the size of a
    >RAW data file from the SIGMA? When the image data is read form the camera,
    >the program does not have to interpolate values for each pixel, as the
    >camera already produces 36 bits of data per pixel. The result should be a
    >truer colour image.


    In theory, yes, but in reality the Foveon has a very flaky method of
    separating the color channels, and they really don't separate very well.
    The total of the three sensors at each pixel has an extremely accurate
    luminance, but the distribution amongst the 3 channels is full of noise
    and error. Rather than 3 overlapping bell curves, you have curves that
    are almost flat in the areas between green and blue, so their effective
    resolution is very poor. What you wind up with is hue posterization
    with blue/green combinations; seas and skies wind up with magenta and
    green blotchy casts all over.

    >The Nikon produces a great image, but, the actual pixel values are derived
    >from a mathmatical algorithym, either by Nikon Capture in the case of RAW
    >data, or by the smaller more compact algorythm inside the camera, in the
    >case of JPEG.


    >Now I have done 5 years of work in Digital Image Processing, specifically
    >dealing with Earth Resource Sattellite data and RADAR data. I can assure you
    >that interpolating image data values is a very strong science and produces
    >amazing if not uncanny results, but it is all based on probability. When
    >creating data, interpolation, you form the new data based on the nearest
    >neighbours data, assigning a weight of influence on the adjacent sensors
    >that falls off rapidly as the distance between target and input data
    >increases. The algorythms used work on specific colours and then combine the
    >results to give you you final image in pixels.


    In RGB pixels, rather than R, G, *or* B pixels.

    >A Pixel is a Picture Element, and the SIGMA SD10 by the nature of it's
    >design has in fact 2268 x 1512 pixels in it's array, unlike all other
    >non-foveon camera that only have a distinct bank of sensors in their arrays.


    The definition of a pixel does not require any color information. You
    can have UV pixels, IR pixels, panchromatic pixels, or an array where
    red sensitivity increases from left to right on the sensor, and blue
    increases from top to bottom, and they all have as many pixels as they
    have spatial witnesses.

    >By it's very design, the SIGMA has to produce truer colour representation
    >that thta of an interpolated value camera as almost all others are. I belive
    >Polaroid has just signed on the use of a Foveon Sensor and this is probably
    >a good thing.


    IMO, Polaroid has signed on because the sensor is cheap, and it has the
    potential to be marketed with a 3x-inflated pixel count. Because there
    is no color moire due to a mosaic (where color aliasing is much more
    pronounced), anti-aliasing filters are easier to leave out, but the
    resulting luminance moire is misinterpreted by the optically naive as
    image sharpness.
    --

    <>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
    John P Sheehy <>
    ><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
    , Oct 12, 2004
    #4
  5. > "Richard Tomkins" <> wrote:
    []
    >> Now I have done 5 years of work in Digital Image Processing,
    >> specifically dealing with Earth Resource Sattellite data and RADAR
    >> data. I can assure you that interpolating image data values is a
    >> very strong science and produces amazing if not uncanny results, but
    >> it is all based on probability. When creating data, interpolation,
    >> you form the new data based on the nearest neighbours data,
    >> assigning a weight of influence on the adjacent sensors that falls
    >> off rapidly as the distance between target and input data increases.
    >> The algorythms used work on specific colours and then combine the
    >> results to give you you final image in pixels.



    I would have thought that in 5 years you would have learnt the importance
    of correct anti-aliasing to avoid unpleasant and misleading artefacts.
    Anti-aliasing is not present on some (? all) of the Sigma sensors. This,
    together with the poor colour due to overlapping spectral response curves,
    rules them out for any serious work right now.

    I think everyone wishes they were better, as it's an excellent idea.

    BTW: the spelling is algorithm.

    Cheers,
    David
    David J Taylor, Oct 12, 2004
    #5
  6. Steven M. Scharf

    gsum Guest

    The maths is only half of the argument. You also need to consider
    how the eye/brain perceive colour and luminance. As the eye is
    much more sensitive to luminance than to colour, Bayer sensors
    are able to represent detail via luminance but interpolate colour with
    no visible loss of image quality.
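    The luminance point can be made concrete with a standard luma weighting
    (a sketch using the Rec. 601 coefficients; the post names no particular
    colour space, so these exact weights are an assumption):

```python
# Rec. 601 luma weights: perceived brightness is dominated by green,
# which is why a Bayer mosaic devotes half its photosites to green
# and can interpolate the chroma with little visible loss.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

# Pure green contributes far more perceived brightness than pure blue:
print(round(luma(0, 255, 0)))  # 150
print(round(luma(0, 0, 255)))  # 29
```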

    Also, Foveon relies on the use of silicon as a colour filter. Whilst a nice
    idea, this is demonstrably not very good in practice.

    Graham


    "Richard Tomkins" <> wrote in message
    news:416af83c$1_4@127.0.0.1...
    > Let's do the math.
    >
    > My Nikon 5700 has 2560 x 1920 sensors each producing 12 bits of data.

    Forget
    > colour for a moment here, and that is 4,915,200 distinct sensors. If we
    > multiply the sensors by 12 and divide by 8 we get 7,372,800 MB, which is

    in
    > fact the size of a RAW file. Now, the camera uses a Bayer Filter across

    the
    > Sensor array, so some sensors represent RED values, some GREEN and some
    > BLUE. When you grab the data from the camera using Nikon Capture, it

    creates
    > an image that is 2560 x 1920 pixels, each pixel having 36 bits of data. In
    > this case though, each pixel now represents all three colours, RED, GREEN
    > and BLUE. Whrre did all this extra data come from you ask? The Nikon

    Capture
    > program processed the incoming RAW data and interpolated values for

    colours
    > for each pixel.
    >
    > The SIGM SD10 has 2268 X 1512 sensors in a RED plane, 2268 x 1512 sensors

    in
    > a GREEN plane and 2268 x 1512 sensors in a BLUE plane, and each sensor in
    > each plane produce 12 bits of data. This is effectively, 2268 x 1512 x 3,
    > which is 10,242,288 sensors. If we multiply this by 12 and divide by 8 we
    > get, 15,363,432 MB. Would anyone care to confirm that this is the size of

    a
    > RAW data file from the SIGMA? When the image data is read form the camera,
    > the program does not have to interpolate values for each pixel, as the
    > camera already produces 36 bits of data per pixel. The result should be a
    > truer colour image.
    >
    > The Nikon produces a great image, but, the actual pixel values are derived
    > from a mathmatical algorithym, either by Nikon Capture in the case of RAW
    > data, or by the smaller more compact algorythm inside the camera, in the
    > case of JPEG.
    >
    > Now I have done 5 years of work in Digital Image Processing, specifically
    > dealing with Earth Resource Sattellite data and RADAR data. I can assure

    you
    > that interpolating image data values is a very strong science and produces
    > amazing if not uncanny results, but it is all based on probability. When
    > creating data, interpolation, you form the new data based on the nearest
    > neighbours data, assigning a weight of influence on the adjacent sensors
    > that falls off rapidly as the distance between target and input data
    > increases. The algorythms used work on specific colours and then combine

    the
    > results to give you you final image in pixels.
    >
    > A Pixel is a Picture Element, and the SIGMA SD10 by the nature of it's
    > design has in fact 2268 x 1512 pixels in it's array, unlike all other
    > non-foveon camera that only have a distinct bank of sensors in their

    arrays.
    >
    > By it's very design, the SIGMA has to produce truer colour representation
    > that thta of an interpolated value camera as almost all others are. I

    belive
    > Polaroid has just signed on the use of a Foveon Sensor and this is

    probably
    > a good thing.
    >
    > I you want to know how well the math in this stuff is, consider that one

    can
    > process a sattellite image and if lucky, with the right filtering applied

    to
    > an image, see the disturbance pattern produced by a submarine under water.
    > So, although you cannot see the submarine, the disturbance it leaves in

    the
    > water can be processed for and you can thus say that a submarine was under
    > the water. The chances of capturing a satlleite image and have a submarine
    > under the waters surface at the same time is very low probability, but

    wiht
    > a good algorythm and some hard work, you can find the signs of the

    submarine
    > in the image.
    >
    > rtt
    >
    >
    >
    >
    > ----== Posted via Newsfeeds.Com - Unlimited-Uncensored-Secure Usenet

    News==----
    > http://www.newsfeeds.com The #1 Newsgroup Service in the World! >100,000

    Newsgroups
    > ---= East/West-Coast Server Farms - Total Privacy via Encryption =---
    gsum, Oct 12, 2004
    #6
  7. Steven M. Scharf

    bob Guest

    "Richard Tomkins" <> wrote in
    news:416af83c$1_4@127.0.0.1:

    > If we multiply this by 12 and divide by 8 we
    > get, 15,363,432 MB. Would anyone care to confirm that this is the size
    > of a RAW data file from the SIGMA? When the image data is read form
    > the camera, the program does not have to interpolate values for each
    > pixel, as the camera already produces 36 bits of data per pixel. The
    > result should be a truer colour image.
    >


    (Unfortunately) there's a lot more to image quality than math.

    Bob

    --
    Delete the inverse SPAM to reply
    bob, Oct 12, 2004
    #7
  8. Steven M. Scharf

    Dan Pidcock Guest

    "Richard Tomkins" <> wrote in message news:<416af83c$1_4@127.0.0.1>...
    > Let's do the math.
    >
    > My Nikon 5700 has 2560 x 1920 sensors each producing 12 bits of data. Forget
    > colour for a moment here, and that is 4,915,200 distinct sensors. If we
    > multiply the sensors by 12 and divide by 8 we get 7,372,800 MB, which is in
    > fact the size of a RAW file. Now, the camera uses a Bayer Filter across the
    > Sensor array, so some sensors represent RED values, some GREEN and some
    > BLUE. When you grab the data from the camera using Nikon Capture, it creates
    > an image that is 2560 x 1920 pixels, each pixel having 36 bits of data. In
    > this case though, each pixel now represents all three colours, RED, GREEN
    > and BLUE. Whrre did all this extra data come from you ask? The Nikon Capture
    > program processed the incoming RAW data and interpolated values for colours
    > for each pixel.
    >
    > The SIGM SD10 has 2268 X 1512 sensors in a RED plane, 2268 x 1512 sensors in
    > a GREEN plane and 2268 x 1512 sensors in a BLUE plane, and each sensor in
    > each plane produce 12 bits of data. This is effectively, 2268 x 1512 x 3,
    > which is 10,242,288 sensors. If we multiply this by 12 and divide by 8 we
    > get, 15,363,432 MB. Would anyone care to confirm that this is the size of a
    > RAW data file from the SIGMA? When the image data is read form the camera,
    > the program does not have to interpolate values for each pixel, as the
    > camera already produces 36 bits of data per pixel.


    2268x1512x3 = 10,287,648

    Indeed. 36 bits per pixel * 3.4 megapixels gives 36 * 3,429,216 =
    123,451,776 bits; 123,451,776 bits / 8 bits/byte gives 15,431,472
    bytes - as you would have calculated if you had used 2268, not 2258.

    So you have just proved the SD10 is a 3.4 Megapixel camera.

    It is a 10.2 Megasensor camera.
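    Dan's corrected figures are straightforward to verify (a sketch of the
    bare sensor-data size only; an actual SD10 raw file also carries
    headers, so the on-disk size differs):

```python
# Sigma SD10: 2268 x 1512 photosites in each of 3 stacked colour layers.
width, height, layers, bits = 2268, 1512, 3, 12

pixels = width * height          # spatial picture elements
sensors = pixels * layers        # one photosite per colour layer
raw_bytes = sensors * bits // 8  # packed 12-bit samples

print(f"{pixels:,} pixels")      # 3,429,216 pixels -> "3.4 megapixel"
print(f"{sensors:,} sensors")    # 10,287,648 sensors -> "10.2 megasensor"
print(f"{raw_bytes:,} bytes")    # 15,431,472 bytes
```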
    Dan Pidcock, Oct 12, 2004
    #8
