The problem here is that modern error diffusion dither algorithms do not
use a FIXED number of printer dots per pixel. Old 'mock halftone'
printing did, say an array of 4 x 4 printer dots per pixel. But modern
algorithms maintain only an average number of dots per pixel, and even
that average number is somewhat variable.

Sidenote: it seems funny to use the term 'modern' to exclude things
that we used only about fifteen years ago or so.
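The average-dots behaviour described above can be illustrated with a
minimal Floyd-Steinberg sketch (one common error diffusion scheme,
used here only as an example, not any particular printer driver's
algorithm): each pixel is snapped to 0 or 1, and the rounding error is
spread onto its neighbours, so only the *average* output level tracks
the source tone.

```python
def error_diffuse(img):
    """Quantize a grayscale image (values 0.0-1.0) to binary output
    using Floyd-Steinberg error diffusion."""
    h, w = len(img), len(img[0])
    buf = [row[:] for row in img]        # working copy we can smear error into
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = buf[y][x]
            new = 1 if old >= 0.5 else 0
            out[y][x] = new
            err = old - new
            # Floyd-Steinberg weights: 7/16 right, 3/16 down-left,
            # 5/16 down, 1/16 down-right
            if x + 1 < w:
                buf[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1][x - 1] += err * 3 / 16
                buf[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1][x + 1] += err * 1 / 16
    return out

# A flat 30%-gray patch: no fixed dots-per-pixel cell anywhere, but on
# average about 30% of output pixels quantize to 1, so the mean output
# tracks the mean input.
patch = [[0.3] * 32 for _ in range(32)]
dots = error_diffuse(patch)
density = sum(map(sum, dots)) / (32 * 32)
```

Note there is no 4 x 4 cell to point to here: which pixels end up as
dots depends on the errors diffused from their neighbours, which is
exactly why the dots-per-pixel count is only an average.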

Timur wrote:

>
> I read several articles (both in magazines and in usenet postings)
> stating the maximum size at which one can print a digital picture
> (given a fixed number of megapixels) without losing quality. None of
> them, though, explained the proper (scientific) way to calculate
> that...
>
> I assume there should be one; I was thinking something like:
>
> the number of DPI the printer is capable of, times the size we want
> to print, should be no less than the number of megapixels. Of course
> it does not work like that (just try with your favourite numbers).
>
> One ratio I saw quite often is that 3 megapixels allow a maximum
> 8x10 (inches) printout. (BTW, in my very personal experience, 10x12
> is also good at that resolution.)
>
> For those who believe the numbers I provided are correct: how do you
> make the calculation to prove that? Is it just practical experience?
> For those who think the numbers are wrong: how do you prove your
> point?
>
> Thanks a lot,
> Timur
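Timur's rule of thumb can be sanity-checked numerically. The useful
figure is the pixels-per-inch (PPI) actually delivered to the print,
not the printer's DPI, since each image pixel is rendered by a
variable number of printer dots. A quick sketch (the 4:3 sensor
aspect ratio is my assumption, not something stated above):

```python
import math

def delivered_ppi(megapixels, width_in, height_in, aspect=(4, 3)):
    """Pixels-per-inch on the long and short edges of the print.
    Assumes a 4:3 sensor unless told otherwise (an assumption
    for illustration)."""
    aw, ah = aspect
    pixels = megapixels * 1e6
    # Solve px_long * px_short = pixels with px_long/px_short = aw/ah
    px_short = math.sqrt(pixels * ah / aw)
    px_long = px_short * aw / ah
    long_in, short_in = max(width_in, height_in), min(width_in, height_in)
    return px_long / long_in, px_short / short_in

# 3 MP (about 2000 x 1500 pixels at 4:3) printed at 8x10 inches:
long_ppi, short_ppi = delivered_ppi(3, 10, 8)
```

This gives roughly 190-200 PPI for 3 MP at 8x10, which is in the
range often considered acceptable for photo prints viewed at normal
distances, consistent with the rule of thumb being practical
experience rather than a hard formula.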
--
Don Stauffer in Minnesota
webpage- http://www.usfamily.net/web/stauffer