
Why do pictures appear sharper than they should?

Robert D Feinman
Guest
Posts: n/a

 12-22-2003
There is a never ending discussion of resolution vs
print size and capture media.

The mathematics and empirical testing usually
show that the expected resolution of 35mm film is in the range of
40-60 lines per mm (lpm), while a good print needs 6 to 8 lpm.
Simple division puts the maximum enlargement at about 5 to 8x.
Thus the best print one could expect from 35mm would be on the
order of 8x12 inches, with correspondingly smaller sizes from
digital cameras.
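A quick back-of-envelope sketch of that arithmetic (the lpm figures and the 24x36 mm frame size are the ones quoted here; nothing else is assumed):

```python
# Film resolves roughly 40-60 lines per mm; a good print needs 6-8 lpm.
film_lpm_low, film_lpm_high = 40, 60
print_lpm = 8  # demanding end of the "good print" range

low_enlargement = film_lpm_low / print_lpm    # 5.0x
high_enlargement = film_lpm_high / print_lpm  # 7.5x, i.e. roughly 8x

# A 35mm frame is 24 x 36 mm; at 7.5x that is about 7.1 x 10.6 inches,
# which is where the "on the order of 8x12 inches" figure comes from.
frame_w_mm, frame_h_mm = 24, 36
print_w_in = frame_w_mm * high_enlargement / 25.4
print_h_in = frame_h_mm * high_enlargement / 25.4
print(low_enlargement, high_enlargement)           # 5.0 7.5
print(round(print_w_in, 1), round(print_h_in, 1))  # 7.1 10.6
```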
In spite of this people still get prints that are "sharp" with
much larger magnifications.
Personally, I've been scanning 35mm color negative film with the
new Minolta 5400 lately and can print out inkjets that look "sharp"
all the way up to the 18x maximum scanning resolution.
I'm not one of those who doesn't know what a "sharp" print looks like
either, since I use formats all the way up to 4x5.

So what's going on?
My conjecture (a theory in progress):

For people pictures shot at normal distances we are used to seeing
detail only in limited areas of the face such as the eyes (lashes
and reflections in the pupil) and perhaps loose strands of hair.
For landscapes and the like, we can't see all that much detail in
distant leaves and grass, but we do see bare branches, telephone wires
and the like as sharp.
For buildings and other man made structures the detail is seen in the
building edges and things like window frames.

In all cases the "sharp" things are not those with a lot of fine detail,
but rather those with good edge contrast. In other words, acutance.
Most digital processing involves a certain degree of sharpening. This
doesn't do much for real detail, but does increase acutance. This makes
those features that we search for in real life appear "sharper" so we
read the image as being sharp. We don't expect to see much fine detail
so we are not surprised when it is lacking as long as those sharpness
indicators have good edge definition.
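A toy illustration of that point (a minimal sketch, not any particular editor's algorithm): an unsharp mask steepens the transition at an edge, raising acutance, without adding any new detail.

```python
import numpy as np

# A soft edge: density ramps gradually from 0 to 1.
edge = np.array([0.0, 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0, 1.0])

def box_blur(signal):
    # 3-tap box blur with edge-replicated padding.
    padded = np.concatenate(([signal[0]], signal, [signal[-1]]))
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3

def unsharp_mask(signal, amount=1.0):
    # Classic unsharp mask: add back the difference between the
    # signal and a blurred copy of itself.
    return signal + amount * (signal - box_blur(signal))

sharpened = unsharp_mask(edge)

# The steepest pixel-to-pixel step grows, and small over/undershoots
# ("halos") appear at the ends of the ramp: higher acutance, no new detail.
print(np.diff(edge).max() < np.diff(sharpened).max())  # True
print(sharpened.max() > 1.0)                           # True (overshoot)
```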
There are categories of images where detail is important, such as
scanning electron microscope images, and we always comment on how
much detail we see when viewing them. This shows that we don't
normally expect to see fine structures in an image.
So I'm guessing that since the images conform to our expectations from
viewing such scenes in real life we accept them as sharp even though
the resolution figures would indicate that they are not really that
detailed.

As I said, a theory in progress; comments welcome.

--
Robert D Feinman
Landscapes, Cityscapes and Panoramic Photographs
http://robertdfeinman.com
mail: (E-Mail Removed)

Robert A. Barr

 12-22-2003
> In other words acutance.
>
> Most digital processing involves a certain degree of sharpening. This
> doesn't do much for real detail, but does increase acutance.

...any idea what the OP means by this word? Acuteness, maybe?

Joe

 12-22-2003
In article <(E-Mail Removed)>, Not.for.@harvest
says...
> > In other words acutance.
> >
> > Most digital processing involves a certain degree of sharpening. This
> > doesn't do much for real detail, but does increase acutance.
>
> ...any idea what OP means by this word? Acuteness, maybe?

Nope.

Acutance = In photography, the density gradient across an edge
separating light from darkness, a physically measurable quantity that
correlates well with subjectively observed sharpness of definition. By
extension, in machine vision, a measure of the sharpness of edges in an
image, as the average squared rate of change of the density across the
edge divided by the total density difference from one side of the edge
to the other.
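Read literally, that machine-vision definition is easy to compute for a 1-D density profile. A minimal sketch (the two edge profiles are made-up examples):

```python
import numpy as np

def acutance(profile):
    # Mean squared rate of change of density across the edge,
    # divided by the total density difference from one side to the other.
    gradient = np.diff(profile)
    total_diff = abs(profile[-1] - profile[0])
    return np.mean(gradient ** 2) / total_diff

soft_edge  = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])  # gradual ramp
crisp_edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # abrupt step

# The abrupt edge scores much higher, matching perceived sharpness.
print(round(acutance(soft_edge), 3))   # 0.04
print(round(acutance(crisp_edge), 3))  # 0.2
```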

Now that you have a computer, buy a good dictionary like the OED, or
learn to Google. I know which I would have bought first.

bmoag

 12-23-2003
Digital photography thrives for the same reason that MP3 compression
survives in audio: neither is as good as the real thing but convenience and
illusion make up for the actual technical limitations of the media.

gsum

 12-23-2003
I'm mystified by this. I don't find that a scanned
35mm negative can be printed to a greater size than
a wet-printed enlargement, as grain becomes dominant.
The limit for ISO 100 film is about 8x10 inches.
Obviously, there is much more information in a
6MP digital image than in a 35mm scan (unless very
slow film is used), and the limit is about 12x18 inches
for 'photo' quality.

Graham

"Robert D Feinman" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
> There is a never ending discussion of resolution vs
> print size and capture media.
> [snip]

Jeff

 12-23-2003
"Robert A. Barr" <Not.for.@harvest> wrote in news:3FE7598C.856BF5A6
@worldnet.att.net:

>> In other words acutance.
>>
>> Most digital processing involves a certain degree of sharpening. This
>> doesn't do much for real detail, but does increase acutance.
>
> ...any idea what OP means by this word? Acuteness, maybe?

Acutance
Definition:
In photography, the density gradient across an edge separating light
from darkness, a physically measurable quantity that correlates well
with subjectively observed sharpness of definition. By extension, in
machine vision, a measure of the sharpness of edges in an image, as the
average squared rate of change of the density across the edge divided by
the total density difference from one side of the edge to the other.

more detailed explanation at:
http://www.pvinc.com/tutorial/tutori...y-acutance.htm

Skee

 12-23-2003
On Mon, 22 Dec 2003 15:43:38 -0500, Robert D Feinman
<(E-Mail Removed)> wrote:

>There is a never ending discussion of resolution vs
>print size and capture media.
>[snip]
>So what's going on?
>My conjecture (a theory in progress):

For many examples of pictures that appear "sharper than they should,"
you need only look at some of the photos appearing in Newsweek and
Time these days--this oversharpening fad seems suddenly to have become
de rigueur. This too shall pass...hopefully.

Wdflannery

 12-23-2003
I think you are correct ... I noticed something similar when I had taken
some pics of a redhead ... red skin, pale eyes, etc. ... that looked
blurrier than I had anticipated. Sharpening had the effect of increasing
contrast in the face, and all of a sudden the pic seemed acceptably sharp.

Robert A. Barr

 12-23-2003
Joe wrote:

> In article <(E-Mail Removed)>, Not.for.@harvest
> says...
> > > In other words acutance.
> > >
> > > Most digital processing involves a certain degree of sharpening. This
> > > doesn't do much for real detail, but does increase acutance.
> >
> > ...any idea what OP means by this word? Acuteness, maybe?

> Nope.
>
> [snip acutance definition]
>
> Now you have a computer, buy a good dictionary like OED or learn to
> Google I know what I would have bought first.
>

I tried a few resources -- not Google, though -- with no luck. I wasn't
being a smartass, just curious. I even fed it to Microsoft's Bookshelf 2000,
which is pretty thorough. Not perfect, but thorough.