Velocity Reviews > Re: The death of the Bayer filter? Maybe not.

# Re: The death of the Bayer filter? Maybe not.

TheRealSteve
Guest
Posts: n/a

 04-23-2012

On Mon, 23 Apr 2012 05:02:29 +0200, Wolfgang Weisselberg
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>> On Sat, 21 Apr 2012 07:25:36 -0700, nospam <(E-Mail Removed)>

>
>>>the data is precisely calculated and with a known error that is nearly
>>>always imperceptible.

>
>> Now you're playing the word games. Precisely calculated with a known
>> error. You're not making sense. If it's precisely calculated, it
>> wouldn't have an error, known or otherwise. And if there is a known
>> error, it can be eliminated.

>
>You have never worked with measurements with a known error band.
>You have not understood the difference between precision and
>exactness.

And you don't understand the difference between accuracy and
precision.

>Specially for you:
>If I measure a paper to be 0.01 mm thick, with a measurement error
>of 10%, then 1000 sheets of that paper would be 10±1mm thick.
>That's a precisely calculated result, with a known error (band).

Actually, it's a precisely calculated estimate, and only an estimate.
It can be very precise without being accurate.

As an example showing the difference, think of an archer hitting a
target with arrows that are spread all over the place but the average
position of them all is centered at the bullseye. That is high
accuracy but low precision. Now, if all the arrows hit within 1mm of
each other but were far off the center, that is high precision but low
accuracy.
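The archer example translates directly into numbers. A minimal sketch (my own illustration, coordinates made up): accuracy is the distance from the mean impact point to the bullseye, precision is how tightly the shots cluster around their own mean.

```python
import math
import statistics

def accuracy_and_precision(shots, bullseye=(0.0, 0.0)):
    """Accuracy = distance from the *mean* impact point to the bullseye.
    Precision = average spread of the shots around their own mean."""
    mx = statistics.mean(x for x, _ in shots)
    my = statistics.mean(y for _, y in shots)
    accuracy_err = math.dist((mx, my), bullseye)   # low value = accurate
    spread = statistics.mean(math.dist((x, y), (mx, my)) for x, y in shots)
    return accuracy_err, spread                    # low spread = precise

# Arrows scattered widely but centered on the bullseye: accurate, imprecise.
scattered = [(-50.0, 0.0), (50.0, 0.0), (0.0, -50.0), (0.0, 50.0)]
# Arrows within 1mm of each other but far off-center: precise, inaccurate.
clustered = [(100.0, 100.0), (100.5, 100.0), (100.0, 100.5), (100.5, 100.5)]

acc1, spread1 = accuracy_and_precision(scattered)
acc2, spread2 = accuracy_and_precision(clustered)
print(acc1, spread1)  # mean error 0, spread 50: accurate but not precise
print(acc2, spread2)  # mean error ~142, spread < 1: precise but not accurate
```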

>> The problem is that the error is unknown
>> within a statistical bounds. That's why it's only an estimate. You
>> either don't understand simple concepts or you're just playing games.

>
>You're talking about things you don't understand well enough.
>All you could say is something about where to set the error bounds.

It's obvious that you don't understand these simple mathematical
concepts. You don't understand how calculating an estimate with a
statistical error associated with it is different from calculating an
exact result.

Steve

TheRealSteve

 04-23-2012

On Mon, 23 Apr 2012 02:53:04 +0200, Wolfgang Weisselberg
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>> On Fri, 20 Apr 2012 13:08:31 +0200, Wolfgang Weisselberg
>>>> On Wed, 18 Apr 2012 00:18:22 +0200, Wolfgang Weisselberg
>>>>>TheRealSteve <(E-Mail Removed)> wrote:
>>>>>> On Mon, 16 Apr 2012 11:48:05 -0800, http://www.velocityreviews.com/forums/(E-Mail Removed) (Floyd L.
>>>>>>>TheRealSteve <(E-Mail Removed)> wrote:

>
>>>>>> The sampling frequency for pixels of the same
>>>>>> color is what determines the Nyquist Limit for the sensor at sensing
>>>>>> that particular color.

>
>>>>>If there was no response in other colour pixels, that would
>>>>>even be correct. Unfortunately, this premise is only true in
>>>>>3-sensor systems. *Especially* the green pixels also react
>>>>>to red and blue. Just look at the transmission curves.

>
>>>> Which of course doesn't matter if the red or blue channels are
>>>> aliased.

>
>>>OK, so how are the 'red or blue channels' calculated?

>
>>>Hint: They're not only calculated from the red or blue pixels.

>
>> Here's a hint for you: when it comes to whether or not the red or blue
>> channels are aliased, it matters not a bit how they are calculated.
>> All that matters is how they are sampled.

>
>Yep, they are sampled at every pixel, with a strong response
>in red pixels, a medium response in green pixels and a weak
>response in blue pixels. And then there is an AA filter ...

And therein lies the problem. The strong response of the red color
sampling in the red pixels is sampled at a lower rate than the weak
red response in the green pixels. Therefore, the red sampling in the
red pixels has a much better chance of being aliased than the red
sampling of the green pixels. And once you get aliasing in any of the
channels, you get alias artifacts. The only way to lower the greater
amount of alias artifacts coming from the lower sampled channels is to
ignore the lower sampled rate data. But that brings up other problems,
like not having enough data at all to come up with chroma information.
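The sampling-rate argument above can be shown numerically. A hypothetical sketch (not from the thread): a pattern whose frequency sits above the half-rate channel's Nyquist limit, but below the full grid's, produces samples on the sparse sites that are indistinguishable from a lower-frequency alias.

```python
import math

F = 0.3          # pattern frequency, cycles per pixel pitch
# Full grid samples every pixel (pitch 1): Nyquist 0.5 -> F is representable.
# 'Red' sites sample every 2nd pixel (pitch 2): Nyquist 0.25 -> F must alias.
ALIAS = 0.5 - F  # 0.2 cycles/pixel: the frequency the red channel reports

red_sites = range(0, 64, 2)
signal  = [math.cos(2 * math.pi * F * x) for x in red_sites]
aliased = [math.cos(2 * math.pi * ALIAS * x) for x in red_sites]

# On the red sites the true pattern and its alias are sample-for-sample
# identical -- no amount of downstream processing can tell them apart.
print(all(abs(a - b) < 1e-9 for a, b in zip(signal, aliased)))  # True
```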

And since we're trying to eliminate the harsh AA filter, that's not an
acceptable answer.

>>>>>> The significance is that it doesn't matter what type of demosaicing
>>>>>> you are doing to generate luma and chroma. It doesn't matter that the
>>>>>> luma resolution may or may not be the same as the pixel resolution of
>>>>>> the bayer sensor. All that discussion is specious. What matters is the
>>>>>> sample frequency of the individual color channels since if any of them
>>>>>> are aliased, the final image will have artifacts.

>
>>>>>If the red channel was aliased, then the green channel with
>>>>>it's higher resolution would show that it is aliased. Thus
>>>>>the problem is solved.

>
>>>> Absolutely not problem solved. If the red channel is aliased, it
>>>> doesn't matter what the green channel is showing. The red channel will
>>>> still have alias artifacts that will make it into the final demosaiced
>>>> image.

>
>>>Only with a naive implementation.

>
>> Show me an implementation of a bayer cfa demosaicing algorithm that
>> can get rid of alias artifacts in the final image if the individual
>> color channels are aliased. I really would like to see one.

>
>Show me a test chart that
>a) creates aliasing in a colour channel

Easy. You don't even need a test chart. Just look at the picture of
the suit jacket that has been circulated.

>b) defies the AA filter in the camera

Which we're trying to reduce or eliminate because it robs the camera
of resolution. If you have to blur the pictures to mush, what's the
point of having a high resolution sensor?

>c) would not create aliasing with a monochrome sensor (i.e. one
> with not a per-pixel filter) with the same pixel size and
> density.

This is the easiest one of all. Just use a spatial resolution in the
test chart that's greater than the red/blue or even green pixel
density but is not greater than the overall monochrome pixel density.
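That condition can be written down directly. A minimal sketch (my own illustration; it treats the red/blue sites as a rectangular grid at twice the pixel pitch, and leaves out the green quincunx lattice, which is more complicated):

```python
def aliases(chart_freq, sample_pitch):
    """A channel sampled every `sample_pitch` pixels has a Nyquist limit
    of 1/(2*sample_pitch) cycles/pixel; anything above that aliases."""
    return chart_freq > 1.0 / (2.0 * sample_pitch)

# A chart frequency above the red/blue Nyquist (0.25 cycles/pixel) but
# below the monochrome Nyquist (0.5 cycles/pixel):
F = 0.4
print(aliases(F, 2))  # True: red/blue sites (every 2nd pixel) alias
print(aliases(F, 1))  # False: a monochrome sensor resolves it cleanly
```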

>d) shows a situation that happens in the real world

Also easy. Just look at the picture of the coat. There is a very high
chroma mosaic pattern but not much luma mosaic pattern.

>Then we can talk ...

Somehow I still doubt you'd be qualified to talk about it.

>>>>>> In a bayer sensor, the green is 1/2 the linear resolution and the red
>>>>>> and blue are 1/4 the linear resolution of an equivalent 3 sensor
>>>>>> system. That 1/2 and 1/4 the resolution of a bayer sensor vs. a 3
>>>>>> sensor system means you have more of a chance to get aliasing with
>>>>>> bayer sensor. That is significant to this discussion.

>
>>>>>And exactly *how* did you calculate that an "equivalent" 3-sensor
>>>>>system would need between 840% and more than 1,800% sensels of the
>>>>>bayer sensor sensels? In which way would that be "equivalent"
>>>>>when neither the pixel count nor the sensel count nor the
>>>>>resolution nor the price is comparable?

>
>>>> An equivalent 3 sensor system will have 3 times the number of pixels
>>>> than the bayer cfa.

>
>>>So it's *not* equivalent. Neither in price, nor in weight nor
>>>in 'pixels'. It's like saying "an equivalent 15 inch gun has
>>>many times the shot weight than an .22 gun".

>
>> But it is equivalent in sensor size

>
>Nope, the total sensor size is 3 times as large.

Yup, because each sensor is the same size. The whole point of the 3
sensor system is to get more pixels and more overall sensor space
without having to use larger and higher density sensors. If you want
to eliminate the whole point of using a 3 sensor system, then you're
biasing the result. That's just as stupid as if I were to say, if the
3 sensor system has to use sensors that are 1/3 the size of the bayer
sensor then you have to use a monochrome filter on your bayer sensor,
just like one of the 3 sensor ones, or else it's not equivalent. Now
try to resurrect a color image from a sensor with a monochrome filter.
You *have* to do that to keep it equivalent with the 3 sensor system.
Idiotic.

>> and pixel density,

>
>The pixel density at the 8 MPix 20D and the 22 MPix 5D3 is also
>practically identical.
>
>So the 20D and 5D3 are equivalent. By your logic, that is.

By my logic, only if the sensor size is the same. Remember, I said
pixel density *and* individual sensor size are the same. You're
comparing different size sensors so as usual, you have an invalid
analogy trying to prove inane logic.

>> which are the
>> technological parameters that limit resolution.

>
>Let's just name the AA filter, the lenses, the aperture, the
>camera shake, etc as further technological parameters that limit
>resolution. Oh, and the difference between 1 and 3 sensors.

Exactly. And it's the difference between the 1 and 3 sensors that
gives the 3 sensor system the resolution advantage over a single bayer
cfa of the same size and density as the ones used in the 3 sensor
system. Now you're finally starting to get it, I think.

>> You just have 3 of the
>> same sensor instead of 1.

>
>Just 3 times the sensor size, 5 times the cost, 10 times the
>weight, ...

Exactly. The resolution advantage of the 3 sensor system over the
single sensor is exactly what that extra size, cost and weight buys
you.

>> The number of sensors is not a technological
>> parameter that limits resolution

>
>Beam splitters can and will impact the resolution at some point.

Maybe at some point, but we're not there yet. And since they are
better than the lenses you're likely to run into, they are not the
limiting factor.

>> unless for some reason, you cannot
>> produce more than a single sensor or cannot combine them into a 3
>> sensor system.

>
>So why not use a 100-sensor system? The non-perfect alignment of
>the sensors allows us to render subpixels! Why not a 10,000-sensor
>system? That would allow even smaller subpixel rendering!

Because you don't need to and because we were discussing a 3 sensor
system vs. a bayer cfa. No need to confuse yourself even further with
specious arguments.

>> I never said anything about being equivalent in terms of price or
>> weight, only in terms of sensor size and pixel density and one of them
>> vs. 3 of them. So your gun analogy is specious.

>
>Sensor size is 3 times as large.

No, it's not. Sensor size is the same. Overall sensor size is 3x as
large, which is the point of using a 3 sensor system. You get 3x the
sensor area with the same sensor size and pixel density by using 3 of
them.

>> However, if you want a gun analogy that makes sense, say you want to
>> get a certain amount of shot on a target downrange and you have a 12ga
>> shotgun. There is a limit to the amount of shot you can put in a 12ga
>> shot shell. You can keep putting in more and more until you reach that
>> technological limit, even if you go from a 2 3/4" shell to a 3" shell
>> to maybe a 3 1/2" shell, you'll eventually reach a max that your gun
>> will allow. If you need to get more shot on a target than a single gun
>> can do, use 3 of the same equivalent guns.

>
>> There's your valid gun analogy.

>
>And now the gun is much larger, has an equivalent diameter of
>over 2 times larger --- and you say they're equivalent?

It's not one gun that's any larger. It's 3 equivalent guns vs. one.
Another one of your invalid analogies trying to prove inane logic. You
will need more people to shoot them and coordinate the shot, which
adds complexity to the overall system. Just like there is more
complexity in the overall 3 sensor system vs. a single bayer cfa. But
you are getting an advantage for that overall complexity because you
are getting more shot downrange, or more image resolution by using
three guns/sensors vs. one equivalent one.

>>>> It will, of course, be more expensive. But it's
>>>> that expense that buys you the extra resolution that you can't get
>>>> with the bayer sensor at whatever pixel density you choose as the
>>>> current technological limit.

>
>>>The current technological limit --- as I already wrote --- is
>>>at *least* 176 MPix for FF.
>>> http://www.dpreview.com/news/2010/8/24/canon120mpsensor

>
>>>> I.e., if current technology limits the
>>>> resolution of a monochrome sensor to X, A bayer sensor will have an
>>>> overall resolution of X but the individual color channels will be
>>>> sampled at 1/2 X for green and 1/4 X for red and blue.

>

>
>> Ok, not to be misleading, I won't quote any resolution except to say
>> that the red and blue channels have *less* resolution than the green
>> channel.

>
>If the channel is built solely from red respective blue pixels,
>and then interpolated from that data only, yes.

Even if it's not. The fact is that you are sampling different
frequency bands of light at different spatial resolutions and whether
aliasing is present in the final image is determined by the lowest
sampled resolution, not the overall resolution.

>However, that's a naive implementation.

But it's reality. The reality is that whether you get alias artifacts
in the final image when you combine the channels is determined by the
resolution of the lowest sampled channel.

>> And all of the channels have *less* resolution than an
>> equivalent monochrome sensor would have. And by equivalent, I mean the
>> same size and pixel density.

>
>> How's that? Not misleading anymore I hope.

>
>Yep --- however, the monochrome sensor doesn't have *any*
>colour information. No chroma resolution at all. So I'll

Which is why you use 3 of them, to get the color information at the
same resolution as a monochrome sensor.

>trade some luma resolution for chroma resolution ... and gain
>in the end.

You can do all the trades you want. You'll still get alias artifacts
in the final image if any of the color channels are aliased.

>>>> An equivalent 3
>>>> sensor system using 3 of the same sensors as the bayer sensor that is
>>>> at the current technological resolution limit will have a resolution
>>>> of X for each of the color channels.

>
>>>Due to non-aligned sensors, it won't.

>
>> If the sensors are aligned, it will.

>
>If I was a millionaire ...

Both a non sequitur and specious.

>> If they are not aligned and
>> instead are corrected in software, it will still have better
>> resolution than an equivalent bayer sensor. And by equivalent, I mean
>> each of the 3 monochrome sensors is the same size and pixel density as
>> the bayer cfa.

>
>That's 3 times the sensor size.

Now you're getting it. You get 3 times the sensor size by using 3
sensors that are equivalent to the bayer sensor. Finally!!!!!

>>>> That's why 3 sensor systems are used,

>
>>>Where?
>>>At what resolution?

>
>> How about TV cameras that are old enough to require 3 sensor
>> systems because a single sensor that could provide the required
>> resolution was beyond the limit of the technology at the time?

>
>the colour filters for the pixels and to interpret them fast
>enough. Old TV cameras were all analogue.

I just saw a 60 Minutes piece in high def that used a 3 sensor system
for the camera. So it's not only about old analog TV.

>>>> to get greater resolution in the color channels

>
>>>Which your eye can't see anyways.

>
>> Which doesn't matter a single bit when it comes to aliasing.

>
>See above: give me a test case.

Already provided in the example of the suit jacket with high amounts
of color banding but much less luma banding.

>>>> without having
>>>> to go beyond the technological limit of sensor resolution.

>
>>>Name *one* 170 MPix DSLR. Oops --- none available. Not even
>>>close. So basically 'having to go beyond the technological limit
>>>of sensor resolution' is a smoke grenade. It's just no problem
>>>anywhere ...

>
>> The fact that you can't name a single 170MP DSLR proves my point that
>> there is always a limit to what a single sensor can do.

>
>LOL. You're the one claiming a limit. I showed you the
>pixel density is there, and not a problem.

Lol! Show me a 170MP DSLR camera.

>> If you need a
>> resolution greater than what you can get with the current crop of
>> bayer cfa sensors, a 3 sensor system could give you what you need. Or
>> it may not. But it will give you better resolution than a single
>> sensor albeit at greater cost, weight, complexity, etc.

>
>Name me one 3-sensor DSLR from within the last 5 years.

Right after you show me a 170MP DSLR. And don't worry, I will be able
to show you a camera system that has the color resolution of the
monochrome sensor of an equivalent bayer cfa. But you have to show me
that 170MP DSLR first.

>>>>>Me thinks you're mussing some basic facts here and from your
>>>>>wrong assumptions terribly confused results are appearing.

>
>>>> Just because you are confused doesn't mean I'm mussing anything.

>
>>>It's not me who is confused. It's you ..

>
>> By your posts, you've proven that's not the case.

>
>Says you.

Good comeback.

Steve

TheRealSteve

 04-23-2012

On Mon, 23 Apr 2012 03:52:57 +0200, Wolfgang Weisselberg
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>
>> On Fri, 20 Apr 2012 13:52:26 +0200, Wolfgang Weisselberg
>> <(E-Mail Removed)> wrote:

>
>>>TheRealSteve <(E-Mail Removed)> wrote:
>>>> On Wed, 18 Apr 2012 00:33:14 +0200, Wolfgang Weisselberg
>>>>>TheRealSteve <(E-Mail Removed)> wrote:

>
>>>>>> No I'm not comparing apples to oranges. I'm comparing apples to
>>>>>> apples, just one apple to 3 apples.

>
>>>>>For the price of your 3 apples I can buy a very large apple
>>>>>that easily outweights the resolution of your 3 apples.

>
>>>> And then I can buy 3 of those very large apples.

>
>>>Sure.

>
>>>For thrice the price.

>
>> Yes, for thrice the price.

>
>And that's equivalent, then?

In terms of sensor size and pixel density, yes it's equivalent.

>>>And then --- for the *same* price you pay, I buy an even
>>>larger apple that easily outresolves your 3 apples. And I

>
>> And I can then buy 3 of those larger apples for thrice the price that
>> easily outresolves your single larger apple.

>
>And I'll take thrice the money and buy a whole planet sized apple.

And I'll buy 3 of the exact same whole planet sized apples and have a
system with equivalent apples that outresolves your single planet
sized apple.

>>>don't need to stop until the resolution is so high that no
>>>lens can keep up and nothing's won by 3 times the money.

>
>> Once you get there, then there's no more reason to use a 3 sensor
>> system.

>
>And we're there.

Not yet we aren't. If we were, then an AA filter wouldn't be needed at
all and I'd have what I wanted all along.

>> As I said before, if the bayer sensor is not the limiting
>> factor to the resolution needed to prevent aliasing without an AA
>> filter, then there's no need to go to a more complex and costly
>> system.

>
>The Bayer sensor is not the limiting factor, AA filter or not.

The fact that it needs an AA filter at all to prevent aliasing under
any possible conditions proves that the bayer sensor *is* the limiting
factor.

When there finally is a camera with a 170MP bayer sensor then maybe
the bayer sensor won't be the limiting factor anymore. Actually, you
don't need anything nearly that high. But we aren't yet where we need
to be in order to have the bayer sensor not be the limiting factor.

>>>And did I mention you cannot align the 3 apples properly?

>
>> Of course I can. The technology exists to align the 3 apples perfectly
>> using both hardware and software.

>
>Or so you think. Feel free to build a tech demo.

>>>>>> There is a technological limit to
>>>>>> the resolution of any sensor. No matter where you draw that limit,
>>>>>> there is a limit. And whatever that limit is, if your bayer sensor is
>>>>>> at the current limit (which it frequently is when it comes to high-end
>>>>>> cameras) the only way to get the higher resolution of a 3 sensor
>>>>>> system over a bayer sensor is to use 3 of them.

>
>>>>> http://www.dpreview.com/news/2010/8/24/canon120mpsensor
>>>>>And that's not even a full frame sensor. As a FF, that'd be
>>>>>175 MPix. And that's just the current demonstration, not a
>>>>>hard limit. So there's no way any DSLR is even near the limit,

>
>>>> Despite your technological ignorance, I'm not saying anything is near
>>>> a limit. I'm saying that whatever the technological limit is (and it
>>>> has changed drastically over the years) you can do better job of
>>>> sampling the individual color channels with 3 sensors at that limit
>>>> than with 1.

>
>>>You're fresh out of arguments, thus the personal attacks.

>
>> If personal attacks signals being fresh out of arguments, you would
>> have lost long ago.

>
>Kindly differenciate between shooting down your arguments and

I am.

>>>> When that 120 MP sensor is eventually used in a camera (who knows when
>>>> that will be) then the question of whether an AA filter is required
>>>> will be answered with a resounding NO. And there will be no point in
>>>> comparing that sensor with a 3 sensor system of the same resolution
>>>> because a 120MP bayer sensor is way more than adequate. A 3 sensor
>>>> system of that resolution is entirely overkill with any of the lenses
>>>> any of us are likely to run into.

>
>>>naah, naah, naah" is, in the end, completely pointless?

>
>>>Thank you.

>
>> Absolutely not pointless given the criteria that the single bayer
>> sensor cannot resolve the individual color channels well enough to
>> prevent aliasing.

>
>Have you ever, *ever* heard of AA filters?
>So where exactly is your problem?

You mean the ones that turn photos to mush unnecessarily and still
don't prevent aliasing in all circumstances? Yes, I have heard of
them.

>And even if you haven't heard of an AA filter, the technology
>is there for very high MPix sensors ...

We're waiting.

>> then an equivalent 3 sensor
>> system (equivalent in terms of individual sensor size and pixel
>> density) may be the answer.

>
>Much as applying a hammer to your fingers may be the answer.

It may be if the question is how to cause a blood blister and/or break
a finger.

>> when you start bringing up 170MP
>> sensors that you can't buy anywhere yet.

>
>> Thank you.

>
>Oh, but you can. At worst, you'll have to buy up Canon.

Much easier to use a 3 sensor system.

>>>>>Unfortunately, even at a mere 8 MPix, 3-sensor systems have hit
>>>>>*their* practicable limit. They cannot be aligned properly.
>>>>>them on the fly, if that's even possible technologically, they'd
>>>>>need to guess an awful lot without a specific target. And so on.

>
>> And here you are admitting that using 8MP sensors (which were a
>> technological limit at one time) a 3 sensor system would outperform a
>> single bayer sensor.

>
>And here you admit you do not need any drugs to hallucinate things.

Non sequitur.

>>>> If you have the necessary color information at each sensor (which a 3
>>>> sensor system does) it's not a big deal to align them with processing,
>>>> similar to what something like registax does. It just takes processing
>>>> power, which today's computers have more than enough of. You only need
>>>> to interpolate values between any 2 pixels of the same color, which is
>>>> much easier and better than what the demosaicing algorithms of a bayer
>>>> sensor have to deal with.

>
>>>4 pixels, they're misaligned horizontally, vertically and
>>>rotationally. So you're interpolating between 4 values, need
>>>demosaicing techniques (like gradient observation) to make actual
>>>use of the pixels and are still nowhere as good as you claim
>>>you are.

>
>> You need to interpolate way more than 4 pixels to get a decent image
>> out of a bayer cfa. So you're still ahead with the 3 sensor system.

>
>'So'? While there is a "interpolation is worse than the
>data straight from the pixel" rule, there is no "more pixels
>interpolated means worse data" rule. In fact, for better quality
>you need to interpolate between more than just 4 pixels in 3-sensor
>systems to get better quality.

But the interpolation isn't the problem at all. The problem is
aliasing due to undersampling. And while interpolation and upsampling
is a good way to make it easier to remove aliasing with a
reconstruction filter, it doesn't help if the original data is
undersampled.
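That last point can be shown with a toy example (my own sketch, with linear interpolation standing in for any resampler): upsampling samples that were already aliased just reconstructs the alias, not the original pattern.

```python
import math

F, ALIAS = 0.3, 0.2  # true pattern frequency vs its alias on a half-rate grid
coarse = [math.cos(2 * math.pi * F * x) for x in range(0, 64, 2)]  # undersampled

# Upsample 2x by linear interpolation between adjacent coarse samples:
fine = []
for a, b in zip(coarse, coarse[1:]):
    fine += [a, (a + b) / 2]

# Compare the interpolated signal against the true pattern and the alias:
truth = [math.cos(2 * math.pi * F * x) for x in range(len(fine))]
alias = [math.cos(2 * math.pi * ALIAS * x) for x in range(len(fine))]
err_truth = sum(abs(f - t) for f, t in zip(fine, truth)) / len(fine)
err_alias = sum(abs(f - a) for f, a in zip(fine, alias)) / len(fine)
print(err_alias < err_truth)  # True: interpolation rebuilds the alias
```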

>>>>>And they are much heavier, troublesome and expensive, for no
>>>>>better results.

>
>>>> The only comparison point I was ever claiming that a 3 sensor
>>>> monochrome system is better than a bayer cfa is in reducing aliasing
>>>> in the color channels.

>
>>>Irrelevant due to AA filter.

>
>> Which limits resolution even further

>
>Which is only relevant if the needed resolution is met.

Which it isn't, by definition of the argument. Obviously if a bayer
cfa with an AA filter meets the required resolution, then that's all
you need. But the argument is that if it doesn't meet the required
resolution, then one way of increasing the resolution is by using a 3
sensor system with 3 sensors, each one equivalent in size and density
of the bayer. Where have you been? No wonder you're so confused. You
don't even understand what you're trying to argue against.

>> and doesn't solve aliasing.

>
>Aha. Do you have proof for that? Not only that *some* camera
>implementations have an too light AA filter, but that there *can*
>be *no* AA filter that solves aliasing?

It's always a tradeoff. If you have a strong AA filter then you limit
resolution to much lower than the sensor resolution. If that's all you
need, then fine, that's all you'll get. If you need more, then the 3
sensor system can give you more. You really need to understand what
you're arguing against or else you'll just look as confused as you
are.

>>>> And it is better in doing that, and any claims
>>>> otherwise show an ignorance of sampling theory.

>
>>>Not at the same cost.
>>>Not at the same weight.
>>>Not at the same sensel count.

>
>> Yes, yes, and yes. I never claimed the same cost, weight or sensel
>> count. I claimed more cost, more weight and 3x the sensel count. And
>> it's that 3x the sensel count that gives you the benefit I'm claiming,

>
>And 3 times the silicon real space.

Using 3 separate equivalent sensors.

>That's not equivalent in any way.

See the sentence above for why it is equivalent in a way.

>>>BTW, Bayer is also better than Bayer in reducing aliasing in
>>>the colour channels, if you triple the sensel count.

>
>> Which you cannot do without increasing the sensor size and/or pixel
>> density.

>
>So?

Lol... Talk about equivalence. You want to increase the sensor size
and/or pixel density and say they are equivalent. Once again, you're
looking foolish because you don't understand what you're arguing
against.

>> So that's a specious argument when the criteria for the
>> comparison is the same sensor size and pixel density since those are
>> the technologically limiting parameters.

>
>It's easy enough to join sensors border to border with at most
>one pixel row missing. So the sensor size isn't a real problem.
>And pixel density? Do I really need to remind you of the Canon
>sensor?

And once that Canon sensor is available, you might not need the 3
sensor system that still has better resolution than the bayer sensor
... simply because the bayer sensor will finally be good enough.

>>>> Yes, they are much
>>>> heavier, troublesome and expensive for not much better results except
>>>> for the one and only claim I was making.

>
>>>And that same better result is being archived simply by using
>>>a 'better' Bayer sensor.

>
>> I'm waiting. Once again, you're confused as to the comparison point. A
>> 'better' bayer sensor can always be bested by a 'better' 3 sensor
>> system given the same sensor size and pixel density.

>
>Sure.

>It's irrelevant, as you have admitted yourself.

Admitting you're wrong and finally agreeing with me is not irrelevant.

>>>>>> Now I can see why you're so confused. You're artificially limiting
>>>>>> each of the 3 sensors to 1/3 the resolution of the bayer sensor when
>>>>>> in reality, each of the 3 sensors has the same resolution as the bayer
>>>>>> sensor. You just have 3 of them giving 3x the number of total pixels.

>
>>>>>OK, if it is soooo easy, show us the high resolution cameras
>>>>>using 3 sensors. If that's a viable technology, it surely is
>>>>>being actively used.

>
>>>> I never said it was easy. I only said it solves the problem of
>>>> pre-mature aliasing in the color channels of a bayer cfa of the same
>>>> pixel density in lpm as a 3 sensor system. Nothing more, nothing less.

>
>>>In other words: in irrelevant circumstances.

>
>> You would consider aliasing an irrelevant circumstance? I wouldn't. And
>> I suspect many others wouldn't either.

>
>I consider an AA filter.

Which doesn't eliminate aliasing and also reduces resolution. Already
discussed. No need to bring it up again.

>>>>>Or are you insisting on a global conspiracy against 3 sensor
>>>>>cameras?

>
>>>> Yes, there's a global conspiracy. Ssshhhhhhhhhhhhhhush.

>
>>>Now you said it, now I've got to kill you.

>
>> Uh oh... (hiding)

>
>An ICBM with nuke-tipped MIRVs is on the way.

I'll have my Patriots ready. And not the old ones from the first Gulf
War.

Steve

TheRealSteve

 04-23-2012

On Mon, 23 Apr 2012 03:55:56 +0200, Wolfgang Weisselberg
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>> On Fri, 20 Apr 2012 13:56:13 +0200, Wolfgang Weisselberg
>>>TheRealSteve <(E-Mail Removed)> wrote:

>
>>>> Ok, I'll definitely admit that a bayer sensor is cheaper and lighter
>>>> than an equivalent 3 sensor system if you admit that the 3 sensor
>>>> system samples the color channels at a greater lpm for the same
>>>> overall sensor lpm.

>
>>>Sure it samples, but unaligned ... and (since the eye isn't
>>>so good at colour resolution) unneeded.

>
>> Samples can be aligned and still beat a bayer sensor.

>
>A difference that's not worth the added complications.

That depends on whether the additional resolution is needed or not.
For the sake of argument, it's needed. If additional resolution isn't
needed, a bayer sensor with a mushy AA filter may be good enough.

>> What the eye can
>> see is irrelevant when it comes to sampling. You can see color banding
>> in the final image from a sampled resolution far above what the eye
>> can see.

>
>That's what a proper AA filter is for.

Yes, turning your image to mush without eliminating the chance of
getting aliasing.

Steve

TheRealSteve

 04-23-2012

On Mon, 23 Apr 2012 04:48:14 +0200, Wolfgang Weisselberg
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>> On Sat, 21 Apr 2012 07:25:49 -0700, nospam <(E-Mail Removed)>
>>>In article <(E-Mail Removed)>, TheRealSteve
>>><(E-Mail Removed)> wrote:

>
>>>> >data on successive occasions. But in fact it is calculated,
>>>> >and the same data will always produce the same result because
>>>> >there is no guess work by the algorithm.

>
>>>> Just because the result is repeatable doesn't mean it's not a guess.

>
>>>if it's repeatable, it's *not* a guess.

>
>> Wrong. You can easily make the same guess of the outcome of something
>> given the same conditions. Even flipping a coin, I can write an
>> algorithm that always guesses heads and is wrong only half the time. And
>> it's still just a guess even though it's repeatable.

>
>Write an algorithm that's right in 98% of the time ...

You can write an algorithm that's right more than 98% of the time if
it has enough data input to it. Things like the vector of force
applied in flipping the coin, mass, CoG and CoP of the coin,
atmospheric conditions like density, air movement, etc., distance from
flip to the landing surface, material properties of the landing
surface, a high resolution 3d map of the gravitational field the coin
will traverse, etc. etc.

All that data input to the right algorithm can turn a dumbass 50/50
guess into a much more educated guess that can be correct much more
than 50% of the time. Maybe even as high as 98% of the time or higher.
But it's still an educated guess and could still be wrong. Calculating
2+2 can give the correct answer without much chance of it being wrong.
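The point above can be sketched in a few lines. This is a toy simulation (fair simulated coin, fixed seed only so the demo is repeatable), showing a completely deterministic "prediction algorithm" whose answer is nevertheless still a guess:

```python
import random

def guess(flip_conditions):
    # Same inputs always produce the same output -- repeatable, yet still a guess.
    return "heads"

random.seed(42)  # fixed seed so the run itself is repeatable
flips = [random.choice(["heads", "tails"]) for _ in range(10_000)]
hit_rate = sum(guess(None) == f for f in flips) / len(flips)
print(round(hit_rate, 2))  # close to 0.5: repeatable, but wrong about half the time
```

Feeding the algorithm real physical inputs could push the hit rate up, but the output stays a prediction, not a calculation like 2+2.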

Steve

TheRealSteve
Guest
Posts: n/a

 04-23-2012

On Mon, 23 Apr 2012 04:44:00 +0200, Wolfgang Weisselberg
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>> On Fri, 20 Apr 2012 18:29:29 +0200, Wolfgang Weisselberg
>>>TheRealSteve <(E-Mail Removed)> wrote:
>>>> On Tue, 17 Apr 2012 20:40:34 +0200, Wolfgang Weisselberg
>>>>>TheRealSteve <(E-Mail Removed)> wrote:
>>>>>> On Mon, 16 Apr 2012 19:47:55 +0200, Wolfgang Weisselberg
>>>>>>>TheRealSteve <(E-Mail Removed)> wrote:

>
>>>>>>>> And finally, you need to understand how a 3 sensor camera system works
>>>>>>>> as opposed to a bayer filter. How a 3 sensor system uses 3
>>>>>>>> monochromatic sensors, each of which samples it's assigned color at
>>>>>>>> the full rate of the substrate under the filter while a bayer sensor
>>>>>>>> samples each of it's assigned colors at only 1/2 or 1/4 the rate of
>>>>>>>> the substrate under the filter.

>
>>>>>>>You're forgetting that a 3-sensor system has a rather narrow
>>>>>>>bandwith for each of the 3 colours. Bayer pattern sensors have
>>>>>>>a much wider bandwith in each colour filter.

>
>>>>>>>Which in turn means that bayer pixels pick up more signal ---
>>>>>>>especially about luminance. In the real world high frequency
>>>>>>>luminance detail does not matter much to humans; humans are not
>>>>>>>equipped to detect it.

>
>>>>>> Nonsense. It has nothing to do with the bandwidth of the color
>>>>>> filters.

>
>>>>>Of course it has. The wider the filter, the more signal it
>>>>>can pick up. Or would you say a bandpass filter of just one
>>>>>wavelength would let through the same amount of light as a
>>>>>filter so wide that all of the light can pass?

>
>>>> I'm not saying anything about the amount of light let through.

>
>>>So you agree that the green pixels in a bayer sensor also
>>>pick up red and blue signals?

>
>> The green pixels may pick up a little red and blue signals because
>> like AA filters, real world color filters aren't perfect.

>
> http://infoscience.epfl.ch/record/17...poorEI12_1.pdf
>It's not just a little.

Which, of course, as you should know, doesn't matter. If one of the
channels is aliased due to undersampling, you will get alias artifacts
in the resultant demosaiced image.
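A minimal 1-D sketch of that claim (synthetic signal, not real sensor data): one channel samples a detail at full rate, the other at half rate, as the red/blue sites of a Bayer CFA do relative to the grid. The half-rate channel folds the detail down to a false low frequency, and no later processing of those samples can tell the alias from a real low-frequency detail:

```python
import numpy as np

N = 256
f_true = 0.375                 # detail frequency, cycles per full-rate sample
n = np.arange(N)
signal = np.sin(2 * np.pi * f_true * n)

green = signal                 # sampled at every site: f_true < 0.5, no aliasing
red = signal[::2]              # every other site: Nyquist drops to 0.25, aliased

def peak_freq(samples, step):
    """Dominant frequency, expressed in cycles per full-rate sample."""
    spec = np.abs(np.fft.rfft(samples))
    spec[0] = 0.0              # ignore the DC bin
    return int(np.argmax(spec)) / (len(samples) * step)

print(peak_freq(green, 1))     # 0.375 -> the real detail
print(peak_freq(red, 2))       # 0.125 -> a false low frequency, folded below Nyquist
```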

>> But just
>> because the green pixel may pick up a tiny bit of unaliased red and
>> blue light doesn't mean that the majority of the red and blue light
>> coming from the aliased red and blue pixels aren't going to cause
>> major problems in the final image.

>
>Test case, please. A real world test case is preferred.

Look at the jacket picture provided earlier in the thread. It's an
example of the red and blue color channels having more aliasing than
the green color channel. And, while the example doesn't show it, by
extension, the green color channel has more aliasing than an
equivalent monochrome sensor would have had.

>>>> All I'm
>>>> saying is that any one of the color channels can be aliased and data
>>>> from the other color channels cannot completely solve that problem
>>>> without tedius, time consuming and intesive work at guessing just how
>>>> to remove the aliased information.

>
>>>That's the same "teduous, time consuming and intensive work"
>>>needed to realign 3 sensors, to which you only had to say
>>>that computers are fast enough?

>
>> If you actially believe that the complexity of the algorithms to
>> realign 3 sensors (something similar to registax) is anything near the
>> complexity needed to remove artifacts caused by aliasing (which really
>> can not be perfectly removed no matter what you do algorithmically)
>> then you better cut back on the meds.

>
>Maybe you should *take* your meds. Yes, that's a personal attack,
>but that's tit for tat.

Right, since you can't effectively argue against the point, you made a
personal attack without even trying to argue against it. On the other
hand, I made a personal attack *and* successfully argued my point.

>AA filter.

You mean the ones that reduce resolution without eliminating aliasing?
Those AA filters? Very weak.

>>>>>> The transmittance of the color filters may have something to
>>>>>> do with it but that is likely the exact same or extrememly close for a
>>>>>> bayer cfa vs. monochrome filter.

>
>>>>>Ah --- no. A dichromatic filter is more efficient. But not
>>>>>wide enough for bayer.

>
>>>>>> If anything, a monochrome filter is
>>>>>> likely more transparent because it's easier to produce.

>
>>>>>Black paint is easier to produce than a filter, is it more
>>>>>transparent? No? Then you have found your logic fails.

>
>>>> Flaws in your analogies don't equate to flaws in my logic.

>
>>>Flaws in your logic brought to their ultimate conclusion do
>>>not equate to flaws in my analogies.

>
>> But since your analogy is flawed it proves nothing about my logic.

>
>My analogy isn't flawed, however.

Unfortunately for you, it is. If it were not flawed, I could just as
easily say that breaking the sensor under my shoe is easier than
producing black paint. That would be just as valid an analogy as your
black paint being easier to produce than a filter.

>>>>>> And also, specious when it comes to what humans are or are not able to
>>>>>> detect. We're talking about aliasing artifacts like moire, which can
>>>>>> be detected easily by the human eye even if the real world high
>>>>>> frequency detail that caused the aliasing is not.

>
>>>>>Visible aliasing artifacts are low frequency details.

>
>>>>>The correct solution, then, is to use a properly blurry enough
>>>>>signal so all those high frequency details that can produce
>>>>>aliasing with the sensor sampling frequency are washed away.

>
>>>> Or, have a sensor resolution great enough that an AA filter isn't
>>>> necessary because the other parts of the camera system (like the lens,
>>>> focusing, etc.) can't resolve the high frequency details enough to
>>>> cause aliasing. That is the best of both worlds... you have the
>>>> maximum resolution that the rest of the camera system is capable of
>>>> without the blurring of an AA filter that reduces the resolution to
>>>> lower than what the rest of the camera system is capable of.

>
>>>Which does not preclude a Bayer system from being that
>>>sensor.

>
>> You finally said something that's correct.

>
>See? 3-sensor systems aren't necessary.

All along I've said that a 3 sensor system would be necessary only if
you needed more resolution than an equivalent (in terms of sensor size
and pixel density) bayer cfa can provide. And I'm still saying that.
So if you don't need more resolution than a mushy output from an AA
filtered bayer cfa, then obviously a 3 sensor system isn't necessary.
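The per-channel sample counts behind that comparison are simple bookkeeping (illustrative figures, assuming a 36 MP sensor area in both designs):

```python
def bayer_samples(total_sites):
    # RGGB mosaic: half the sites sample green, a quarter each red and blue
    return {"R": total_sites // 4, "G": total_sites // 2, "B": total_sites // 4}

def three_sensor_samples(sites_per_sensor):
    # one full monochrome sensor per channel, so every channel is fully sampled
    return {c: sites_per_sensor for c in ("R", "G", "B")}

print(bayer_samples(36_000_000))         # {'R': 9000000, 'G': 18000000, 'B': 9000000}
print(three_sensor_samples(36_000_000))  # every channel gets all 36 M samples
```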

>>>> There are still
>>>> frequencies of light in the original image that can be aliased enough
>>>> in one color channel to cause visible artifacts in the final image,
>>>> but not in the other color channels.

>
>>>Q: does that happen in the real world or only in specially
>>>prepared experiments?

>
>> Yes, it happens in the real world if the camera system (lens,
>> focusing, etc.) is able to resolve well enough that the resolution of
>> the image projected on the sensor exceeds nyquist, even with the
>> imperfect AA filter.

>

See the jacket.

>> Getting "technical", real world images, like that suit jacket that had
>> the moire pattern or maybe a picture of the crowd at a baseball game,
>> have a very widely distributed range of spatial frequencies across the
>> wideband temporal frequencies of light. Because you're sampling
>> different, narrower temporal frequency bands at different spatial
>> rates, the amount of aliased artifacts can be different for the
>> different temporal frequency bands.

>
>Show pix. Prove that no AA filter would work there.

Look up the link; it's earlier in the thread. A stronger AA filter may
work there, at the expense of resolution. Which is the point all
along. You're making my argument for me when you argue that a stronger
AA filter is needed.

>> Assuming that the sampled temporal bands have similar spatial
>> frequency content (a very good assumption for the baseball crowd,
>> maybe not so good for a field of green grass) the temporal bands that
>> are sampled at higher spatial rate will have less aliasing present
>> than those temporal bands that are sampled at a lower spatial rate
>> because less of the spatial frequency content will be aliased at the
>> higher sample rate.

>
>Temporal bands? We're not doing video, are we?

Temporal in this case refers to the frequency (or wavelength, if you
want to work it that way) of the light, which determines the color we
see.
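For concreteness, the "temporal frequency" here is just the optical frequency of the light, c divided by wavelength (the band centres below are approximate, illustrative values):

```python
C = 299_792_458  # speed of light, m/s

def optical_freq_thz(wavelength_nm):
    # convert a wavelength in nanometres to an optical frequency in THz
    return C / (wavelength_nm * 1e-9) / 1e12

for name, wl in [("blue", 450), ("green", 550), ("red", 650)]:
    print(f"{name} (~{wl} nm): ~{optical_freq_thz(wl):.0f} THz")
```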

>> [...]
>>>>>> Interpolating between not only 2x2 but 3x3 or more
>>>>>> sensels of a bayer sensor also dampens high frequency detail.

>
>>>>>Since the interpolation is not simply averaging --- as you
>>>>>seem to think for some reason --- your argument is not as
>>>>>valid as you think it is.

>
>>>> I never said it was simply averaging. Although it's close to that but
>>>> with different weights associated with different adjacent pixels.

>
>>>With different *dynamic* weights.

>
>> Doesn't matter whether weights are dynamic or not when it comes to
>> aliasing. In fact, the particular algorithm doesn't really matter either
>> because none of them can remove the aliasing artifacts. The
>> differences are in how the artifacts are presented.

>
>AA filter.

Thanks for making my argument for me once again.

>And using all pixels.
>
>
>>>> What
>>>> I'm saying is that it doesn't matter what the interpolation algorithm
>>>> is.

>
>>>Actually, it does matter.

>
>> Nope, it doesn't when it comes to whether the image has aliasing
>> artifacts. Only how they are presented.

>
>The same is true for n-sensor systems: if there's aliasing, there
>is aliasing. And a 3-sensor system isn't any less susceptible
>to aliasing, assuming they're well designed.

Ah, but a 3 sensor system is less susceptible to aliasing for the same
sensor size and pixel density, but using 3 of them in a 3 sensor
system vs. one with a bayer cfa over it. Which is, once again, the
point.

>End of story.

Correct, end of story. It just didn't end the way you'd like.

>>>> If it's using sample data that is aliased to come up with
>>>> estimates of the missing data, the resultant estimates will also be
>>>> affected by the aliased data.

>
>>>And that's as true for Bayer as for Foveon as for 3-sensor as
>>>for any other sampling system. A proper low-pass filter
>>>guards against that problem.

>
>> Correct, except for one thing. A Foveon or 3-sensor system will have
>> less aliasing than a bayer sensor of the same spatial resolution.

>
>Wrong. A Bayer sensor may have a stronger AA filter or need
>more pixels --- at worst --- but that's all.

HA!!! Thank you. You have finally admitted I was right all along...
You've finally admitted that a bayer sensor needs more pixels or a
stronger AA filter (reducing the resolution of the image) to even
compete with a 3-sensor or Foveon system in terms of resolution of all
the color channels and/or eliminating alias artifacts. Sheesh,
finally.

No more needs to be said so I'll snip the rest.

Steve

TheRealSteve
Guest
Posts: n/a

 04-23-2012

On Mon, 23 Apr 2012 04:38:40 -0500, Andrew Haley
<(E-Mail Removed)> wrote:

>TheRealSteve <(E-Mail Removed)> wrote:
>>
>> On Sun, 22 Apr 2012 09:57:19 -0500, Andrew Haley
>> <(E-Mail Removed)> wrote:
>>
>>>TheRealSteve <(E-Mail Removed)> wrote:
>>>>
>>>> On Wed, 18 Apr 2012 11:01:38 -0500, Andrew Haley
>>>> <(E-Mail Removed)> wrote:
>>>>
>>>>>>>Only a couple of posts back you were saying that you didn't want
>>>>>>>the "blurring" of an anti-aliasing filter. But now you are saying
>>>>>>>that some high-frequency high-contrast details should be removed
>>>>>>>in software.
>>>>>>
>>>>>> Correct. Those are not opposites. Keep all the high-frequency
>>>>>> high-contrast details that would occur in almost any real-life
>>>>>> situation but get rid of the output from obviously bad sensors.
>>>
>>>That's done anyway, in the defect map.

>>
>> Problem is when pixels go bad over time. Right now, at least for
>> Nikon, you have to send the camera in for repair to update the defect
>> map when it could be done automatically.

>
>concentrate on a pixel that is illuminated by a bright white point, a
>situation that is not so unusual.

Actually, it is fairly unusual with something like a 36MP sensor.
Especially considering phase detect "hunt and miss" focusing systems
vs. one that can iteratively come up with a better focus but takes
more time, and also considering that if you're taking a picture in the
situation you mentioned, bright sunlit white seaspray causing spray
highlights, it's very likely that you're stopped down to something
like f/11 or beyond, and that will spread each point over an Airy disc
larger than a single pixel. So yes, it is fairly unusual to have a
saturated pixel totally surrounded by ones that have very low light.
And that's the only situation the bad pixel detect algorithm will
respond to. But we'll let all that slide for now.

>>>> A white highlight from sea spray is not from a single pixel and
>>>> would be kept.
>>>
>>>That is simply not so. Given a very sharp lens, in the absence of an
>>>anti-aliasing filter, a bright white point would illuminate only
>>>one pixel. That pixel would show up as as blue, red or green.

>>
>> Correct, that it would be blue, red or green. Actually, not perfectly
>> blue red or green. But noticeably shifted in that direction from
>> white, which is what it should be. But since you said it's showing up
>> as white,

>
>I didn't say that. I said it was white in the scene, not that it was
>showing up as white in the image. It shows up as coloured in the
>image.

Ok. Originally it was showing up as white in the image but we'll let
that slide for now also.

>> then it's obviously not hitting just a single pixel and so would not
>> be removed by a bad pixel detection algorithm.
>>
>>>> A single hot pixel would show as a blue, red or green spot on an
>>>> image even if you had the lens cap on. Obviously not real data and
>>>> nothing an AA filter can help with. It would be nice to remove that
>>>> noise from the image.
>>>
>>>That is the whole point, it is not obviously "not real data." In the
>>>absence of an anti-aliasing filter you have no way to tell whether
>>>it's bad data or not. It's not at all likely to be a hot pixel: these
>>>are due to sensor defects and don't show up for just one image.

>>
>> Um, yes you do.

>
>OK, how? If you have no AA filter and you see a red, green, or blue
>pixel, how do you know if it's bad data or not?

You don't. That's the whole point I was making. The sensor resolution
isn't good enough. But with a 3 sensor system, you might be able to
tell if it's real data or not because it has overall greater
resolution for the same pixel density and individual sensor size.

>> The whole point is to make the sensor not be the limiting factor wrt
>> resolution. If you make the sensor able to resolve better than the
>> rest of the camera/lens system then you don't need an AA filter
>> blurring things unnecessarily and you won't get single pixel
>> highlights that could be mistaken for real data.

>
>In which case, you have the lens performing anti-aliasing. True, but
>that is not the case I'm talking about, which is a 36 Mpix camera with
>a really sharp lens. (And, if there's not a really sharp lens,
>there's little point either in having 36 Mpix or disabling the AA
>filter.)

But there is a point to having a 36MP sensor and disabling the AA
filter, and that's especially true if you don't have a really sharp
lens. The point is not to let the sensor unnecessarily be the limiting
factor when it comes to resolution. If your lens can resolve down to
36MP then make it a 40MP sensor. If the lens can resolve down to 40MP,
make it a 50MP sensor. And if you get to the point where the lens can
resolve better than the current limit of a single sensor (which, for a
bayer cfa is necessarily less than the overall sensor resolution) make
it a 3 sensor system. If you never get to that point then don't bother
with a 3 sensor system.

Steve

nospam
Guest
Posts: n/a

 04-23-2012
In article <(E-Mail Removed)>, TheRealSteve
<(E-Mail Removed)> wrote:

> Look at the jacket picture provided earlier in the thread. It's an
> example of the red and blue color channels having more aliasing than
> the green color channel. And, while the example doesn't show it, by
> extension, the green color channel has more aliasing than an
> equivalent monochrome sensor would have had.

a monochrome, foveon or a 3 sensor system of the same resolution would
alias the same, but without false colour.

> All along I've said that a 3 sensor system would be necessary only if
> you needed more resolution than an equivalent (in terms of sensor size
> and pixel density) bayer cfa can provide. And I'm still saying that.
> So if you don't need more resolution than a mushy output from an AA
> filtered bayer cfa, then obviously a 3 sensor system isn't necessary.

bayer sensors with aa filters do not produce mushy output.

> >The same is true for n-sensor systems: if there's aliasing, there
> >is aliasing. And a 3-sensor system isn't any less susceptible
> >to aliasing, assuming they're well designed.

>
> Ah, but a 3 sensor system is less susceptible to aliasing for the same
> sensor size and pixel density, but using 3 of them in a 3 sensor
> system vs. one with a bayer cfa over it. Which is, once again, the
> point.

it is not less susceptible at all.

> No more needs to be said so I'll snip the rest.

best you do, as you keep digging yourself into a deeper and deeper hole.

nospam
Guest
Posts: n/a

 04-23-2012
In article <(E-Mail Removed)>, TheRealSteve
<(E-Mail Removed)> wrote:

> It's obvious that you don't understand these simple mathmatical
> concepts. You don't understand how calculating an estimate with a
> statistical error associated with it is different than calculating the

how do you know they are *exactly* 2? you don't. therefore, the answer
is 'an estimate,' as you call it.

TheRealSteve
Guest
Posts: n/a

 04-24-2012

On Mon, 23 Apr 2012 08:10:36 -0700, nospam <(E-Mail Removed)>
wrote:

>In article <(E-Mail Removed)>, TheRealSteve
><(E-Mail Removed)> wrote:
>
>> It's obvious that you don't understand these simple mathmatical
>> concepts. You don't understand how calculating an estimate with a
>> statistical error associated with it is different than calculating the

>
>how do you know they are *exactly* 2? you don't. therefore, the answer
>is 'an estimate,' as you call it.

More proof you don't really understand simple concepts. Keep those
humorous posts coming.