[A complimentary Cc of this posting was sent to
HvdV
<(E-Mail Removed)>], who wrote in article <46287046$0$2026$(E-Mail Removed)>:

> If it were indeed Gaussian, and the image noise-free, then indeed
> you could do an inversion resulting in unlimited
> resolution. However, the blur function is band limited, varies from
> point to point in your image AND there is noise, always.
Another point is quantization (which could be considered as noise too,
of course). E.g., at 5 sigma, one loses 18 bits of S/N; even if there
is no noise, one needs to quantize the result at about 28 bits to get
decent results.

[Here "at 5sigma" means the spacial frequency;

e.g., for a Gauss blur with radius r, this corresponds to wavenumber

5/r, or half-wavelength of r*pi/5. E.g., this is applicable to

maximal resolution details blurred with a gaussian of radius 1.6 pixels.]
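A quick back-of-the-envelope check of the 18-bit figure (a sketch only; the
convention that the Gaussian PSF exp(-x^2/(2 r^2)) has Fourier transform
proportional to exp(-(r k)^2/2) is an assumption about the normalization
meant above):

```python
import math

def gaussian_mtf(k, r):
    """Attenuation of a Gaussian blur of radius r at wavenumber k.

    The Fourier transform of exp(-x**2 / (2*r**2)) is proportional
    to exp(-(r*k)**2 / 2), so this is the factor by which a sine
    component at wavenumber k is damped.
    """
    return math.exp(-(r * k) ** 2 / 2)

r = 1.6                    # blur radius in pixels
k = 5 / r                  # "at 5 sigma": wavenumber 5/r
attenuation = gaussian_mtf(k, r)
bits_lost = -math.log2(attenuation)

print(f"attenuation   = {attenuation:.3e}")   # ~3.7e-06
print(f"bits S/N lost = {bits_lost:.1f}")     # ~18.0
```

So inverting the blur at this frequency multiplies whatever is there
(signal, noise, or quantization error) by about 2**18, which is why
roughly 18 + 10 = 28 bits of quantization are needed for a clean 10-bit
result.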

But indeed, the main "theoretical" reason is that diffraction is not
Gaussian; it COMPLETELY cuts off the high frequencies (see keywords
"Fourier optics" for the math behind this; this is what is called "band
limited" above, but I wanted to emphasize it more).

And adding lens aberrations on top of diffraction can only worsen
things by adding some additional zeros in the MTF...
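For reference, here is the standard Fourier-optics expression for the
diffraction-limited MTF of an incoherent system with a circular aperture
(the wavelength and f-number below are illustrative values, not from the
discussion above); note that it is exactly zero past the cutoff, which is
the "complete cut off" meant here:

```python
import math

def diffraction_mtf(nu, wavelength, f_number):
    """Diffraction-limited MTF of an incoherent circular aperture.

    Cutoff frequency is nu_c = 1 / (wavelength * f_number); the MTF
    is identically zero for nu >= nu_c -- a hard band limit, unlike
    the Gaussian, which never reaches zero.
    """
    nu_c = 1 / (wavelength * f_number)
    x = nu / nu_c
    if x >= 1:
        return 0.0
    return (2 / math.pi) * (math.acos(x) - x * math.sqrt(1 - x * x))

wavelength = 0.55e-3          # 550 nm, in mm (green light)
N = 8                         # f/8
nu_c = 1 / (wavelength * N)   # cutoff, ~227 line pairs/mm

print(diffraction_mtf(0, wavelength, N))      # 1.0 at zero frequency
print(diffraction_mtf(nu_c, wavelength, N))   # 0.0 at (and past) cutoff
```

No deconvolution can recover frequencies where the MTF is exactly zero,
and aberrations only add further zeros below the diffraction cutoff.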

Hope this helps,

Ilya