# B&H Photo gets it right, fixes error on website - Sigma SD10 is 3.4 Megapixel

Discussion in 'Digital Photography' started by Steven M. Scharf, Oct 8, 2004.

1. ### Steven M. Scharf (Guest)

Steven M. Scharf, Oct 8, 2004

2. ### JPS (Guest)

It still says "3.4M Red + 3.4M Green + 3.4M Blue = 10.2 Megapixel Total
Resolution", which is not true.

JPS, Oct 11, 2004

3. ### Richard Tomkins (Guest)

Let's do the math.

My Nikon 5700 has 2560 x 1920 sensors, each producing 12 bits of data. Forget
colour for a moment, and that is 4,915,200 distinct sensors. If we
multiply the sensors by 12 and divide by 8 we get 7,372,800 bytes, which is in
fact the size of a RAW file. Now, the camera uses a Bayer filter across the
sensor array, so some sensors represent RED values, some GREEN and some
BLUE. When you grab the data from the camera using Nikon Capture, it creates
an image that is 2560 x 1920 pixels, each pixel having 36 bits of data. In
this case, though, each pixel now represents all three colours: RED, GREEN
and BLUE. Where did all this extra data come from, you ask? The Nikon Capture
program processed the incoming RAW data and interpolated values for colours
for each pixel.
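That arithmetic can be sanity-checked in a few lines of Python (a sketch; the dimensions and 12-bit depth are as quoted above, and a real RAW file may run slightly larger once headers are included):

```python
# Sanity check of the Nikon 5700 file-size arithmetic above.
# Dimensions and bit depth are taken from the post, not from a spec sheet.
width, height = 2560, 1920
bits_per_sensor = 12

sensors = width * height                      # total photosites
raw_bytes = sensors * bits_per_sensor // 8    # packed 12-bit data

print(sensors)     # 4915200
print(raw_bytes)   # 7372800 (~7.4 MB, matching the RAW size quoted above)
```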

The SIGMA SD10 has 2268 x 1512 sensors in a RED plane, 2268 x 1512 sensors in
a GREEN plane and 2268 x 1512 sensors in a BLUE plane, and each sensor in
each plane produces 12 bits of data. This is effectively 2268 x 1512 x 3,
which is 10,242,288 sensors. If we multiply this by 12 and divide by 8 we
get 15,363,432 bytes. Would anyone care to confirm that this is the size of a
RAW data file from the SIGMA? When the image data is read from the camera,
the program does not have to interpolate values for each pixel, as the
camera already produces 36 bits of data per pixel. The result should be a
truer colour image.

The Nikon produces a great image, but the actual pixel values are derived
from a mathematical algorythm, either by Nikon Capture in the case of RAW
data, or by the smaller, more compact algorythm inside the camera in the
case of JPEG.

Now, I have done 5 years of work in Digital Image Processing, specifically
dealing with Earth Resource Satellite data and RADAR data. I can assure you
that interpolating image data values is a very strong science and produces
amazing, if not uncanny, results, but it is all based on probability. When
creating data by interpolation, you form the new data based on the nearest
neighbours' data, assigning a weight of influence to the adjacent sensors
that falls off rapidly as the distance between target and input data
increases. The algorythms used work on specific colours and then combine the
results to give you your final image in pixels.
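A minimal sketch of the kind of nearest-neighbour interpolation described above (a toy example with made-up values, not Nikon's actual demosaicing algorithm):

```python
# Estimating the missing green value at a red photosite of a Bayer
# mosaic by averaging the four green neighbours -- the simplest case
# of weights that fall off with distance (only the nearest
# neighbours contribute here).

def interpolate_green(mosaic, x, y):
    """Estimate green at (x, y), a site that recorded red or blue."""
    neighbours = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    values = [mosaic[j][i] for i, j in neighbours
              if 0 <= j < len(mosaic) and 0 <= i < len(mosaic[0])]
    return sum(values) / len(values)

# Toy 3x3 mosaic: the centre site recorded red; its four edge
# neighbours recorded green.
mosaic = [
    [10, 200, 10],
    [180, 55, 220],
    [10, 240, 10],
]
print(interpolate_green(mosaic, 1, 1))  # 210.0
```

Real demosaicers weigh more neighbours and exploit cross-channel correlations, but the principle is the same: the green value reported at that pixel was never measured there.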

A pixel is a Picture Element, and the SIGMA SD10 by the nature of its
design has in fact 2268 x 1512 pixels in its array, unlike all other
non-Foveon cameras that have only a single bank of sensors in their arrays.

By its very design, the SIGMA has to produce a truer colour representation
than that of an interpolated-value camera, as almost all others are. I believe
Polaroid has just signed on to use a Foveon sensor, and this is probably
a good thing.

If you want to know how good the math in this stuff is, consider that one can
process a satellite image and, if lucky, with the right filtering applied to
an image, see the disturbance pattern produced by a submarine under water.
So, although you cannot see the submarine, the disturbance it leaves in the
water can be processed for, and you can thus say that a submarine was under
the water. The chance of capturing a satellite image with a submarine
under the water's surface at the same time is very low, but with
a good algorythm and some hard work, you can find the signs of the submarine
in the image.

rtt

Richard Tomkins, Oct 11, 2004
4. ### JPS (Guest)

Yes. Very accurate (but not perfect) at the pixel level for luminance,
with full resolution. Less accurate for hue, and at 1/2 the resolution.
In theory, yes, but in reality the Foveon has a very flaky method of
separating the color channels, and they really don't separate very well.
The total of the three sensors at each pixel has an extremely accurate
luminance, but the distribution amongst the 3 channels is full of noise
and error. Rather than 3 overlapping bell curves, you have curves that
are almost flat in the areas between green and blue, so their effective
resolution is very poor. What you wind up with is hue posterization
with blue/green combinations; seas and skies wind up with magenta and
green blotchy casts all over.
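The luminance-versus-hue point can be illustrated with a toy model (this is only an illustration of zero-sum noise in a channel split, not Foveon's actual separation math):

```python
# If the split of a pixel's total signal among R, G and B is noisy
# but the noise sums to zero, the luminance (modelled here as the
# plain sum R+G+B) stays exact while the hue shifts.
true_rgb = (40, 120, 80)
noise = (15, -10, -5)            # zero-sum error in the channel split

noisy_rgb = tuple(c + n for c, n in zip(true_rgb, noise))

print(sum(true_rgb))   # 240
print(sum(noisy_rgb))  # 240 -- luminance unchanged
print(noisy_rgb)       # (55, 110, 75) -- hue has shifted
```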
In RGB pixels, rather than R, G, *or* B pixels.
The definition of a pixel does not require any color information. You
can have UV pixels, IR pixels, panchromatic pixels, or an array where
red sensitivity increases from left to right on the sensor, and blue
increases from top to bottom, and they all have as many pixels as they
have spatial witnesses.
IMO, Polaroid has signed on because the sensor is cheap, and it has the
potential to be marketed with a 3x inflated pixel count. Because there
is no color moire due to a mosaic (color aliasing being the more
pronounced artifact), anti-aliasing filters are easier to leave out,
but the luminance moire that remains is misinterpreted by the
optically naive as image sharpness.

JPS, Oct 12, 2004
5. ### David J Taylor (Guest)

I would have thought that in 5 years you would have learnt the importance
of correct anti-aliasing to avoid unpleasant and misleading artefacts.
Anti-aliasing is not present on some (perhaps all) of the Sigma sensors. This,
together with the poor colour due to overlapping spectral response curves,
rules them out for any serious work right now.

I think everyone wishes they were better, as it's an excellent idea.

BTW: the spelling is algorithm.

Cheers,
David

David J Taylor, Oct 12, 2004
6. ### gsum (Guest)

The maths is only half of the argument. You also need to consider
how the eye/brain perceives colour and luminance. As the eye is
much more sensitive to luminance than to colour, Bayer sensors
are able to represent detail via luminance but interpolate colour with
no visible loss of image quality.
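The same luminance-versus-colour asymmetry is what chroma subsampling exploits; a rough sketch using the standard BT.601 luma weights (the pixel values are made up for illustration):

```python
# Convert two adjacent RGB pixels to luma + chroma (BT.601 weights),
# share one averaged chroma value between them, and convert back.
# Luma -- the detail the eye tracks -- is carried at full resolution;
# chroma is not, and the reconstruction error stays small.
def to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772
    cr = (r - y) / 1.402
    return y, cb, cr

def to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    b = y + 1.772 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

p1, p2 = (200, 60, 60), (190, 70, 65)        # two similar adjacent pixels
(y1, cb1, cr1), (y2, cb2, cr2) = to_ycbcr(*p1), to_ycbcr(*p2)
cb, cr = (cb1 + cb2) / 2, (cr1 + cr2) / 2    # one chroma pair for both

for y, orig in ((y1, p1), (y2, p2)):
    rec = to_rgb(y, cb, cr)
    # per-channel error after sharing chroma -- small for similar pixels
    print([round(abs(a - b), 1) for a, b in zip(orig, rec)])
```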

Also, Foveon relies on the use of silicon as a colour filter. Whilst a nice
idea, this is demonstrably not very good in practice.

Graham

gsum, Oct 12, 2004
7. ### bob (Guest)

(Unfortunately) there's a lot more to image quality than math.

Bob

bob, Oct 12, 2004
8. ### Dan Pidcock (Guest)

2268 x 1512 x 3 = 10,287,648

Indeed. 36 bits per pixel x 3.4 megapixels gives 36 x 3,429,216 =
123,451,776 bits.
123,451,776 bits / 8 bits/byte gives 15,431,472 bytes - as you would
have calculated if you had used 2268, not 2258.

So you have just proved the SD10 is a 3.4 Megapixel camera.

It is a 10.2 Megasensor camera.
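For completeness, the corrected arithmetic in a few lines of Python:

```python
# 2268 x 1512 full-colour pixels, three 12-bit readings each
# (36 bits of data per pixel).
pixels = 2268 * 1512
sensors = pixels * 3
raw_bytes = pixels * 36 // 8

print(pixels)      # 3429216  (~3.4 million pixels)
print(sensors)     # 10287648 (~10.3 million sensors)
print(raw_bytes)   # 15431472 bytes
```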

Dan Pidcock, Oct 12, 2004