Why not do moire filtration in post-process?

Discussion in 'Digital Photography' started by Rich, Jun 13, 2007.

  1. Rich

    Rich Guest

    Is it possible? It would allow full exploitation of a sensor's
    resolution if this could be done in PS or some other program. In some
    instances moire never shows up; in others it does. Why not leave the
    images where it doesn't show up alone?
     
    Rich, Jun 13, 2007
    #1

  2. Toby

    Toby Guest

    I suppose that you don't want to force users to post-process images.

    Toby
     
    Toby, Jun 13, 2007
    #2

  3. =(8)

    =(8) Guest


    This was talked about back when sample shots from the K10D started to be
    posted. Some of the sample shots had moiré in them. There really isn't any
    way to get rid of it in post-process. If your camera has it, it is the fault
    of the camera. Some are more susceptible than others because the filter
    (hardware) that gets rid of it or reduces it is better in some cameras than
    in others. The Leica M8 has some major problems.

    Once the moiré pattern is in the image the only way to get rid of it is to
    blur the image and that isn't something people want.

    =(8)
     
    =(8), Jun 13, 2007
    #3

  4. Kodak DCS SLRs were extremely sharp, but had a lot of problems with
    moire. There was a program, Photo or Picture Mechanic, that Kodak
    absorbed into Photo Desk, the Kodak RAW program. Moire went away with
    heavier AA filters, cutting back on overall sharpness. I worked in a
    studio from 1998 to 2004; they used Kodak DCS 460s and 760s and always
    had to process for moire. But when one camera was down they rented a
    D1X, and he rejected every pic from the D1X for not being sharp enough,
    though he was using his own lenses.

    Tom
     
    thomas.c.monego, Jun 13, 2007
    #4
  5. acl

    acl Guest


    It's not possible. Look, for example, at
    http://cnx.org/content/m0050/latest/
    in particular fig. 2. The top part is the frequency spectrum of the
    actual signal; if you record it with an insufficient sampling rate (ie
    number of pixels per mm), you get the thing in the middle, while if
    you have a high enough sampling point density you get the bottom signal
    (all in frequency space). As you can see, from the bottom signal you
    can reconstruct the original with no problem, but from the middle one
    you can't: some high-frequency information has been lost due to
    overlapping copies of the real frequency spectrum.

    Now if you don't have enough sampling points to avoid aliasing, the
    only solution is to remove the high frequencies from the signal (ie to
    low pass filter it), which is what most cameras do. In other words,
    you blur the image. There is always a tradeoff between blurring too
    much and having artifacts; whole papers have been written on how to
    optimize this tradeoff for various definitions of better.

    Anyway, the short answer is you can't avoid either aliasing or
    blurring. What you can do (in cameras with bayer sensors) is
    automatically recognise aliasing artifacts and desaturate the colour
    there, making them much less visible. I have no idea if this is done
    by any camera (or raw converter) automatically.

    But low-pass filtering with most current cameras certainly looks much
    more natural to me. Opinions of course differ.
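
    The aliasing described above can be sketched in a few lines of Python
    (the frequencies here are toy values chosen only for illustration, not
    from any camera): a 70-cycle signal sampled 100 times per unit is above
    the Nyquist limit of 50, so its samples coincide exactly with those of
    a 30-cycle signal, and the high-frequency information is unrecoverable.

```python
import math

fs = 100  # samples per unit length; Nyquist limit is fs/2 = 50

# A 70-cycle signal is above Nyquist, so its samples coincide
# exactly with those of a 30-cycle signal (its alias).
hi_freq = [math.cos(2 * math.pi * 70 * k / fs) for k in range(32)]
aliased = [math.cos(2 * math.pi * 30 * k / fs) for k in range(32)]

# The two sample sequences are numerically identical.
assert all(abs(a - b) < 1e-9 for a, b in zip(hi_freq, aliased))
```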
     
    acl, Jun 13, 2007
    #5
  6. Aaron

    Aaron Guest

    It's definitely possible. MaxMax, the somewhat venerable service
    company known for IR conversions, offers what they call "HotRod"
    service, which involves removing the anti-moire filter from your
    digital camera.

    http://maxmax.com/

    They have some example before/after images to give you a feeling for
    what it's like not to have an anti-moire filter. There are certainly
    cases where the filter does more harm than good.
     
    Aaron, Jun 13, 2007
    #6
  7. =(8)

    =(8) Guest

    But you had better like the look of moiré, because without that filter,
    or if your camera has a weak one (like the Pentax K10D) so that you get
    more sharpness, there is no way to get rid of it in post-process without
    major blurring. Personally, I would rather have the "it shows only on
    certain patterns, otherwise it isn't there" balance of image detail and
    sharpness. So far I think most companies do a decent job of balancing
    the two.

    =(8)
     
    =(8), Jun 13, 2007
    #7
  8. =(8) wrote on Wed, 13 Jun 2007 07:56:11 -0700:
    ??>> And lo, Rich <> emerged from the ether
    ??>> and spake thus:
    ??>>> Is it possible? It would allow for full resolution
    ??>>> exploitation of a sensor if this could be done in PS or
    ??>>> some other program. In some instances, moire never shows
    ??>>> up, in others it does. Why not leave the images where it
    ??>>> doesn't show up alone?
    ??>>
    ??>> It's definitely possible. MaxMax, the somewhat venerable
    ??>> service company known for IR conversions, offers what they
    ??>> call "HotRod" service, which involves removing the
    ??>> anti-moire filter from your digital camera.
    ??>>
    ??>> http://maxmax.com/
    ??>>
    ??>> They have some example before/after images to give you a
    ??>> feeling for what it's like not to have an anti-moire
    ??>> filter. There are certainly cases where the filter does
    ??>> more harm than good.
    ??>>
    I don't have any examples to try it on but relatively
    inexpensive programs like PrintShop have Moire removal. My Canon
    scanner can also be set in an anti-Moire mode.

    James Silverton
    Potomac, Maryland

    E-mail, with obvious alterations:
    not.jim.silverton.at.verizon.not
     
    James Silverton, Jun 13, 2007
    #8
  9. =(8)

    =(8) Guest

    Yes, and it is a simple blur; you lose image detail and you lose a lot.
    Once the pixels are arranged in a pattern that is a moiré pattern, the
    only way to get rid of it is to either add enough noise to obliterate
    it or blur it out. There is no way to do anything other than that to
    it. You can't add detail that isn't there, and if there was detail
    there you wouldn't have a moiré pattern, because a moiré pattern isn't
    detail.

    =(8)
     
    =(8), Jun 14, 2007
    #9
  10. Exactly. Note, by the way, that digital images not only cannot resolve
    above the Nyquist frequency, they can't resolve above the Nyquist
    frequency times the Kell factor.

    Here's a game to play. First of all accept the following definition: the
    "reliable resolution" of a camera is the maximum resolution at which both
    the relative intensities and relative widths of features in an image are
    correctly rendered. If you don't care if your photographs bear no relation
    to the subject, you can stop here<g>.

    First, download the resolution chart images for the Canon 10D and Sigma SD10
    from this page.

    http://www.dpreview.com/reviews/sigmasd10/page18.asp

    Open both in your favorite editor, apply a bit of sharpening, and observe
    both at 400%.

    Now, keeping in mind that there are exactly nine lines in the test pattern,
    start at the wide end of the horizontal pattern and find the first point
    (highest resolution) where the camera fails to render the pattern as nine
    lines of equal darkness. The point below that is the "reliable
    resolution".

    To my eye, the Canon 10D produces intensity variations at "15" (1500
    lph) and nine even lines at "14" (1400 lph).

    The Sigma is having nasty jaggy problems from the start, showing
    different-width lines at "9" (900 lph) and intensity variations at "11".
    So it's "8" if you need correct feature widths, and "10" if you just
    require correct feature intensities.

    So I see the 10D as having a "reliable resolution" of around 1400 lph, and
    the SD10 as having a "reliable" resolution of around 1000 lph (since I'm in
    a generous mood today: in a more serious mood, since the SD10 messes up the
    line widths at 900 lph, it's really an 800 lph camera). A 40% difference is
    an enormous difference, but a lot of people don't care whether their images
    are correctly resolved, and think that the SD9/SD10 are roughly similar to
    the 6MP dSLRs.

    So here's a question: how many pixels do these cameras require to reliably
    resolve a line? The 10D requires 2000/1400 = 1.43 (to get feature widths
    and intensities OK), and the SD10 requires 1536/800 = 1.92 for correct
    feature widths and 1536/1000 = 1.54 for correct feature intensities.
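
    The division above can be tabulated in a few lines of Python (the pixel
    counts and "reliable resolution" figures are taken straight from this
    post; nothing else is assumed):

```python
# (pixels across the chart axis, "reliable resolution" in lph)
cameras = {
    "Canon 10D (width + intensity)": (2000, 1400),
    "Sigma SD10 (width)":            (1536, 800),
    "Sigma SD10 (intensity)":        (1536, 1000),
}

# Pixels required per reliably resolved line.
for name, (pixels, lph) in cameras.items():
    print(f"{name}: {pixels / lph:.2f} pixels per line")
```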

    So my take is that the low-pass filter _improves_ resolution for people who
    require that features that appear in their images actually correspond
    correctly to features that exist in the subject.

    David J. Littleboy
    Tokyo, Japan
     
    David J. Littleboy, Jun 14, 2007
    #10
  11. acl

    acl Guest

    But why would we have a kell factor different from unity if not
    because of low-pass filtering and/or aliasing? (it's a question, I
    have no clue: I just looked up "kell factor" in wikipedia but am too
    lazy to find out more at the moment).
     
    acl, Jun 14, 2007
    #11
  12. John Sheehy

    John Sheehy Guest

    Even if you blur it, the aliasing still can leave a low-frequency
    component.

    --
     
    John Sheehy, Jun 14, 2007
    #12
  13. John Sheehy

    John Sheehy Guest

    Me too. When people say that aliased images like the ones from Sigma DSLRs
    look more natural to them, I can't figure it out. They cry "HORIZONTAL AND
    VERTICAL ARRAYS OF PIXELS THAT SNAP EDGES TO THEMSELVES!"

    Natural, to me, is running out of resolution at some point and things
    getting soft; not distinct edges of obvious mosaics, with Tetris-piece
    objects.



    --
     
    John Sheehy, Jun 14, 2007
    #13
  14. John Sheehy

    John Sheehy Guest

    There's only one type of case: the MTF zone of optics where the MTF is
    just soft enough to do all the necessary anti-aliasing; there the AA
    filter just filters unnecessarily. For sharper optics the filter is
    needed, and for low-contrast subject resolutions the AA filter is
    irrelevant.

    --
     
    John Sheehy, Jun 14, 2007
    #14
  15. But why would we have a kell factor different from unity if not
    because of low-pass filtering and/or aliasing? (it's a question, I
    have no clue: I just looked up "kell factor" in wikipedia but am too
    lazy to find out more at the moment).
    <<<<<<<<<<<<<<<<<<<<<<<<

    The Kell factor is about _display_ of digital images. Basically, you can't
    use the information above the Kell factor, even if it's captured. (I suppose
    you could upsample, and avoid Kell factor issues, though.)

    David J. Littleboy
    Tokyo, Japan
     
    David J. Littleboy, Jun 14, 2007
    #15
  16. John Sheehy

    John Sheehy Guest

    Hopefully, someday, cameras will have so many pixels that filters are not
    necessary. Then, the user can choose an output resolution for JPEGs (or
    whatever replaces them) or linear-DNG-style RAWs at a user-chosen
    resolution and sharpening style. For now, we're stuck with these big ugly
    pixels that face too many responsibilities and compromises.

    --
     
    John Sheehy, Jun 14, 2007
    #16
  17. John Sheehy

    John Sheehy Guest

    Thank you. I'm glad to see that I was not hallucinating. In another
    forum, I pointed out that the 10D and the SD10 started to go awry at
    about the same pixel frequency, and I was treated as a retard by the
    forum gods there.

    The questions I posed and they all ignored, were:

    1) Is it possible that if, rather than the graduated line resolution, you
    made a chart with parallel lines at the maximum Sigma/Nyquist resolution,
    the lines would disappear completely, into grey, if the registration
    between the chart and the camera shifted by just 1/2 pixel?

    2) If you took a lot of shots of the test chart with the SD10, with the
    registration varying slightly on all of them, and then upsampled them
    all, registered them together and stacked them, would it be possible
    that all of that "extra resolution" would disappear?
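
    The first question is easy to check numerically. In this sketch (an
    idealized sensor with point-sampling pixels and no AA filter, not real
    camera data), a line pattern at exactly the Nyquist frequency records
    at full contrast when aligned with the pixel grid, and as uniform grey
    when shifted by half a pixel:

```python
import math

def record(phase, n_pixels=16):
    """Point-sample a Nyquist-frequency line pattern (one full
    cycle every two pixels) at a given sub-pixel phase offset."""
    return [math.cos(math.pi * (k + phase)) for k in range(n_pixels)]

aligned = record(0.0)   # samples alternate +1, -1: full contrast
shifted = record(0.5)   # every sample is 0: uniform grey

assert all(abs(abs(v) - 1.0) < 1e-9 for v in aligned)
assert all(abs(v) < 1e-9 for v in shifted)
```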

    I am just confounded by the number of people who think that aliasing is
    "resolution" or "detail".

    --
     
    John Sheehy, Jun 14, 2007
    #17
  18. acl

    acl Guest

    I see. I thought it was the general name of the factor accounting for
    any resolution loss due to effects other than those taken into account
    by the Shannon theorem.

    In that case (ie if it's only about display), it's obviously
    irrelevant to this discussion (for the reason you mentioned).
     
    acl, Jun 14, 2007
    #18
  19. acl

    acl Guest

    Well if you did upsample and average them and you got actual (correct)
    detail beyond nyquist, then I guess Sigma is next in line for a Fields
    medal (or at least a strong candidate).

    That people like it is not so strange, though. It's probably for the
    same reason that most people initially like highly saturated and
    contrasty shots, and slowly move to less and less saturation and
    contrast with time for general shots (well most do).
     
    acl, Jun 14, 2007
    #19
  20. Guest

    Guest Guest

    Let me guess - dpreview?
    Someone on dpreview took a black & white checkerboard and calculated
    the appropriate lens and distance from the camera so it would exactly
    match the SD14 pixel grid spacing. He had it pretty close, and where
    it matched up, it resolved fairly well, but a few inches over, where the
    pattern was misaligned, it was almost pure grey. It was an excellent
    example of why an anti-alias filter is so important.
     
    Guest, Jun 14, 2007
    #20
