interlaced vs progressive scan question

Discussion in 'DVD Video' started by wdoe999@yahoo.com, Nov 10, 2006.

  1. Guest

    I'm trying to decipher why progressive scan video is considered
    better than interlaced.

    Assuming that we are not dealing with old 1930's phosphors, would it be
    fair to say that the following 2 images would be of the exact same
    quality:

    480i camera --> 480i display
    480p camera --> 480p display

    From what I can gather, the real problem is when you try to mix the
    formats and you get jaggies?

    Can I conclude that:

    Progressive scan TVs are considered to be better, MOSTLY because
    DVDs are encoded from film, which is essentially a progressive scan
    source. That is, the source and display are both progressive scan.
    , Nov 10, 2006
    #1

  2. Joshua Zyber Guest

    Read this:

    http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html

    Everything you need to know about progressive scan.

    Joshua Zyber, Nov 11, 2006
    #2

  3. JoeBloe Guest

    On 9 Nov 2006 19:49:32 -0800, Gave us:

    >I'm trying to decipher why progressive scan video is considered
    >better than interlaced.


    Because each frame is fully rendered. Interlacing is a bunch of
    jigsaw puzzles stacked together for each frame.

    Progressive gives one a richer rendering of each frame for the eye.
    JoeBloe, Nov 11, 2006
    #3
  4. Guest

    Thanks. Yes, I had seen that article before and it does seem to make
    the most sense. They are essentially confirming that progressive scan
    is NOT better than interlaced per se, rather a progressive scan image
    is better when the source is progressive scan (such as film).

    What had me confused is that a good percentage of the information on
    the web seems to be quite wrong (what else is new about the internet).
    Most articles make crazy statements about interlaced images having
    "half the resolution", or gaps between the lines in interlaced images
    (as if progressive scan images have fatter lines or something).

    It really shouldn't matter how the image is displayed (from top to
    bottom, bottom to top, sideways, from the centre out) as long as the
    source is scanned in the same manner.

    Joshua Zyber wrote:
    > <> wrote in message
    > news:...
    > > I'm trying to decipher why progressive scan video is considered
    > > better than interlaced.

    >
    > Read this:
    >
    > http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html
    >
    > Everything you need to know about progressive scan.
    , Nov 15, 2006
    #4
  5. Jukka Aho Guest

    wrote:

    > I'm trying to decipher why progressive scan video is considered
    > better than interlaced.


    1) Non-interlaced video is easier for computers, image-processing
    algorithms, and compression algorithms to deal with. (And it's
    easier to get your head around, if you're not very bright. Some
    computer programmers who try to make video processing products,
    aren't.)

    2) Modern display technologies do not "scan". Only CRTs and the ancient
    electromechanical Nipkow disk televisions are based on "scanning",
    natively. (It can be argued, though, that it would be possible to
    _emulate_ the scanning pattern of a CRT with, for instance, a SED
    display.)

    3) When "progressive scan" (which is beginning to be a misnomer
    these days - see point #2) is applied, it is usually assumed that at
    least twice the bandwidth is used for delivering the images. Instead
    of drawing, say, 240-line progressive pictures 60 times a second, or
    480-line progressive pictures 30 times a second - both of which
    would have the same bandwidth as an interlaced 480-line 60 Hz
    system - you draw 480-line progressive pictures 60 times a second.
    ("60" in the above is really 60*1000/1001, and "30", respectively,
    30*1000/1001.)
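
    For concreteness, here is a small Python sketch of that arithmetic
    (nominal rates, ignoring the 1000/1001 factor):

        # Lines delivered per second for each system (nominal rates).
        systems = {
            "480i, 60 fields/s": (240, 60),  # each field carries 240 of the 480 lines
            "240p, 60 frames/s": (240, 60),
            "480p, 30 frames/s": (480, 30),
            "480p, 60 frames/s": (480, 60),
        }
        for name, (lines, rate) in systems.items():
            print(f"{name}: {lines * rate} lines per second")
        # The first three all deliver 14400 lines/s; 480p at 60 frames
        # a second needs 28800, i.e. twice the bandwidth.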

    > Assuming that we are not dealing with old 1930's phosphors,


    Do you assume those to be faster-decaying or slower-decaying than
    the modern phosphors? What is the problem you assume there to be
    with 1930s phosphors? (This is not a trick question - there just
    does not seem to be a consensus about this. Some say the early
    CRT-based televisions had faster-decaying phosphors, and insist that
    interlaced scanning was designed, in part, to combat this problem.
    Others maintain that they had a longer afterglow. Go figure.)

    > would it be fair to say that the following 2 images would be of
    > the exact same quality:
    >
    > 480i camera --> 480i display
    > 480p camera --> 480p display


    Depends. Do you mean a 30 fps 480p system or a 60 fps 480p system? Still
    scenes or motion?

    Coincidentally, the Wikipedia article about interlace is currently
    under scrutiny. You might want to read the discussion page, where a
    lot has been said about the relative merits of an interlaced system
    and the various "related" progressive systems.

    <http://en.wikipedia.org/wiki/Talk:Interlace>

    See, especially, my contribution (yes, this is a shameless plug!), where
    I compare an interlaced system to three related progressive systems:

    <http://en.wikipedia.org/wiki/Talk:Interlace#Comparing_interlace_to_progressive>

    > From what I can gather, the real problem is when you try to mix
    > the formats and you get jaggies?


    That's a real problem whenever some sort of automatic conversion
    from the interlaced domain to the non-interlaced domain is applied
    and the source material alternates between film-originated
    "progressive scan" video and regular interlaced video.
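
    A toy example of where the jaggies come from: if a deinterlacer
    guesses the cadence wrong and weaves together two fields that were
    sampled at different instants, any moving edge gets "combed". A
    minimal Python sketch (the scanline strings are just an
    illustration):

        # Each string is one scanline; 'X' marks a vertical bar.
        frame_t0 = ["..X..."] * 8   # the scene at time t0
        frame_t1 = ["....X."] * 8   # the scene at t1: the bar has moved

        even = frame_t0[0::2]       # field 1: even lines, sampled at t0
        odd = frame_t1[1::2]        # field 2: odd lines, sampled at t1

        # Weave the two fields back into one frame, line by line.
        woven = [line for pair in zip(even, odd) for line in pair]
        print("\n".join(woven))
        # Alternating lines show the bar in two positions - the "comb"
        # artifact you get when mismatched fields are merged.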

    > Progressive scan TVs are considered to be better, MOSTLY because
    > DVDs are encoded from film, which is essentially a progressive
    > scan source. That is, the source and display are both progressive
    > scan.


    That's about correct. But note that NTSC (525-line 59.94
    fields-per-second) countries have some additional complications due to
    the 3:2 pulldown pattern which is used when transferring 24 fps film
    frames to video. PAL (625-line 50 fields-per-second) countries
    circumvent those problems by speeding up the film by 4 % when it is
    transferred to video, so that each pair of adjacent fields comes from
    the same film frame.
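
    The 3:2 pattern itself is easy to sketch in Python (nominal rates;
    a real NTSC chain runs at the 1000/1001-adjusted speeds):

        # 3:2 pulldown: map 24 film frames/s onto 60 video fields/s.
        def pulldown_32(film_frames):
            fields = []
            for i, frame in enumerate(film_frames):
                # Alternate: 2 fields from one frame, 3 from the next.
                fields += [frame] * (2 if i % 2 == 0 else 3)
            return fields

        print(pulldown_32(list("ABCD")))
        # ['A','A','B','B','B','C','C','D','D','D'] - four film frames
        # become ten fields, so 24 frames/s * 10/4 = 60 fields/s. Note
        # that some adjacent field pairs mix two different film frames,
        # which is exactly the complication mentioned above.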

    Also note that in the age of digital television broadcasts it is
    possible to shoot a real "progressive" (i.e. non-interlaced) 60 fps
    video signal - such as 480p/60 or 720p/60 - and display it "as is"
    on a non-interlaced display, without any tricks.

    --
    znark
    Jukka Aho, Nov 15, 2006
    #5
  6. Jukka Aho Guest

    wrote:

    > Joshua Zyber wrote:
    >
    >> Read this:
    >>
    >> http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html


    > Thanks. Yes, I had seen that article before and it does seem to make
    > the most sense. They are essentially confirming that progressive scan
    > is NOT better than interlaced per se, rather a progressive scan image
    > is better when the source is progressive scan (such as film).


    That's a correct conclusion if we're talking about content that was
    produced _natively_ as interlaced fields or _natively_ as non-interlaced
    frames. Neither system benefits when it's being converted to the other
    system.

    But note that the refresh rate / frame rate matters, too. A 60 Hz
    (60 * 1000/1001 Hz) "progressive scan" display is not ideal for
    film-originated content. 24 fps film-originated video would be best
    displayed with a non-scanning display that updates the pictures 24
    times a second, or with a scanning display that flashes each frame
    two times (48 Hz) or three times (72 Hz) in a row, like movie
    projectors do.
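
    To see why those are the "clean" rates, a small Python sketch of
    the arithmetic:

        # How evenly does 24 fps film map onto a given refresh rate?
        for hz in (48, 60, 72):
            flashes = hz / 24
            cadence = "even cadence" if flashes.is_integer() else "uneven (judder)"
            print(f"{hz} Hz: {flashes} flashes per film frame -> {cadence}")
        # 48 and 72 Hz flash every frame equally often, like a film
        # projector; 60 Hz gives 2.5, realized as the uneven 3:2
        # alternation that causes film judder.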

    The article you were referred to above appears to be mostly correct,
    but the animated tomato illustration and the animated depiction of
    an interlaced scanning pattern appear to give false impressions
    about the topic. Both illustrations seem to make a somewhat
    ludicrous (or at least inaccurate) claim that your brain would
    somehow integrate _exactly two adjacent fields at a time_ into a
    single picture. You will get a better description of what is really
    happening from here: <http://lurkertech.com/lg/fields/fields.html>.

    The "Interlace Scan" illustration also appears to suggest that "Field 1"
    would be retained on the screen while "Field 2" is being drawn
    in-between its lines. That's not true. The phosphors on modern CRT
    screens fade away long before a single field refresh is complete. See,
    for example:

    <http://en.wikipedia.org/wiki/Image:Refresh_scan.jpg>

    > What had me confused is that a good percentage of the information on
    > the web seems to be quite wrong (what else is new about the internet).
    > Most articles make crazy statements about interlaced images having
    > "half the resolution", or gaps between the lines in interlaced images
    > (as if progressive scan images have fatter lines or something).


    Most of the time the confusion arises because it is not clearly
    stated what kind of progressive system the writer has in mind when
    he is making these comparisons.

    In my previous message to this thread, I gave a link to a Wikipedia
    discussion page where I compared three different (but technically
    related) "progressive scan" systems to a single interlaced system [1].
    For example, if you're comparing an interlaced system to a "Progressive
    variant A" system, as defined on that page, you _will_ get more visible
    gaps between the scanlines (or rather, more discernible scanline
    structure) - but note: the gaps are visible in the _progressive_ system,
    not in the interlaced system. And, if you're comparing an interlaced
    system to a corresponding "Progressive variant C" system (as defined on
    that page as well), each field in the interlaced system has only half of
    the vertical resolution when compared to the frames in the progressive
    system. It all depends on what you're comparing to what.
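
    As a rough sketch of those comparisons in Python (the labels below
    paraphrase the variants from that talk page, so treat them as an
    approximation rather than the exact definitions):

        # Pictures per second and lines per picture for an interlaced
        # system and the progressive systems it gets compared against.
        systems = [
            ("480i/60, one field",  60, 240),
            ("240p/60 ('A'-style)", 60, 240),  # coarser raster -> visible line structure
            ("480p/30",             30, 480),
            ("480p/60 ('C'-style)", 60, 480),
        ]
        for name, pictures, lines in systems:
            print(f"{name}: {pictures} pictures/s x {lines} lines each")
        # Against the 240p variant an interlaced field loses nothing
        # vertically; against 480p/60 it carries only half the lines.
        # The verdict depends on which progressive system you pick.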

    _____

    [1] Here's the link again:
    <http://en.wikipedia.org/wiki/Talk:Interlace#Comparing_interlace_to_progressive>

    --
    znark
    Jukka Aho, Nov 15, 2006
    #6
  7. Guest

    Thanks - I've seen so much hype about progressive scan that it is good
    to see (in the article you referenced) that people are questioning some
    of the crazy reasons that are given for the superiority of progressive
    scan.

    Jukka Aho wrote:
    > wrote:
    >
    > > I'm trying to decipher why progressive scan video is considered
    > > better than interlaced.

    >
    > 1) Non-interlaced video is easier for computers, image-processing
    > algorithms, and compression algorithms to deal with. (And it's
    > easier to get your head around, if you're not very bright. Some
    > computer programmers who try to make video processing products,
    > aren't.)
    , Nov 15, 2006
    #7
  8. Bill's News Guest

    Jukka Aho wrote:
    <snip>
    > Do you assume those to be faster-decaying or slower-decaying than
    > the modern phosphors? What is the problem you assume there to be
    > with 1930s phosphors? (This is not a trick question - there just
    > does not seem to be a consensus about this. Some say the early
    > CRT-based televisions had faster-decaying phosphors, and insist
    > that interlaced scanning was designed, in part, to combat this
    > problem. Others maintain that they had a longer afterglow. Go
    > figure.)
    >

    <snip>

    From recollection:
    USA power cycle 60 Hz; human visual perception 40 Hz.
    Interlacing was the tradeoff between bandwidth and cost vs.
    watchability. Apparently, were humans 25% quicker at
    perception, the continent might have gone progressive scan from
    the beginning!

    Regarding TV phosphors:
    By the late 60's and very early 70's, three methods of placing
    alpha-numerics onto video screens were: Stroke writer (IBM -
    white), sawtooth (DataPoint - green), raster scan (aka Standard TV,
    by Hazeltine - gold). Of those, the latter was the most economical
    because of the mass production of the TV subassembly. However,
    standard TV bottles - as produced by Ball (the Mason Jar folk) at
    that time - had major problems due to the fast decay time of their
    white phosphor. The home TV was usually viewed under incandescent
    lighting, while office alpha-numeric monitors were typically viewed
    under fluorescent lighting. At that time a longer decay time was
    the solution, and a goldish colored phosphor was mixed which
    satisfied all memory/refresh/lighting conditions except "smooth
    scrolling." Viewed under incandescent lights, the phosphor
    persistence was more noticeable.

    It was about 10 years before all the ingredients came together
    to render "smooth scrolling" on raster TVs with short
    persistence phosphors, highly repeatable deflection, and fast
    enough memory systems.

    But then, memory is the second thing to go - so my recollection
    may be dimmer than I imagine ;-0)
    Bill's News, Nov 18, 2006
    #8
  9. Jukka Aho Guest

    Bill's News wrote:

    > From recollection:
    > USA power cycle 60 Hz; human visual perception 40 Hz.
    > Interlacing was the tradeoff between bandwidth and cost vs.
    > watchability. Apparently, were humans 25% quicker at
    > perception, the continent might have gone progressive scan from the
    > beginning!


    Hmm. I don't see how this follows. If we suppose the 40 Hz figure holds
    true, 1.25 * 40 = 50 Hz. The current refresh rate, and temporal rate,
    for "real" video shot with an interlacing video camera is about 60 Hz in
    the US. What kind of progressive system are you suggesting for these
    hypothetical humans with a 50 Hz visual perception (however "visual
    perception" would be defined in this context)?

    > Regarding TV phosphors:
    > By the late 60's and very early 70's three methods of placing
    > alpha-numerics onto video screens were:


    "Placing alpha-numerics onto video screens" means computer monitors in
    this context, right?

    > Stroke writer (IBM - white), sawtooth (DataPoint - green), raster scan
    > (aka Standard TV, by Hazeltine - gold). Of those, the latter
    > was the most economical because of the mass production of the TV
    > subassembly. However, standard TV bottles - as produced by Ball (the
    > Mason Jar folk) at that time - had major problems due to the fast
    > decay time of their white phosphor. The home TV was usually
    > viewed under incandescent lighting, while office alpha-numeric
    > monitors were typically viewed under fluorescent lighting. At
    > that time a longer decay time was the solution and a goldish
    > colored phosphor was mixed which satisfied all
    > memory/refresh/lighting conditions except "smooth scrolling."
    > Viewed under incandescent lights, the phosphor persistence was
    > more noticeable.


    What you appear to be saying is that in the late 60s (or early 70s),
    when the first "glass terminals" started to appear, standard tv
    phosphors were already fast-decaying and appeared a bit too flickery
    for computer use - at least at typical tv refresh rates. Right?

    But in the historical context, it would be more interesting to know how
    fast or slow decaying the phosphors were in the 1930s when interlaced
    scanning was invented.

    > It was about 10 years before all the ingredients came together
    > to render "smooth scrolling" on raster TVs with short
    > persistence phosphors, highly repeatable deflection, and fast
    > enough memory systems.


    That's all true - for computer monitors. But if standard tv phosphors
    were already too flickery (too fast-decaying) for computer use at tv
    refresh rates by the late 60s, or early 70s, does that not mean that
    they were fast enough for scrolling or panning without smearing the
    picture (in tv use, that is)?

    --
    znark
    Jukka Aho, Nov 18, 2006
    #9
  10. Bill's News Guest

    Jukka Aho wrote:
    > Bill's News wrote:
    >
    >> From recollection:
    >> USA power cycle 60 Hz; human visual perception 40 Hz.
    >> Interlacing was the tradeoff between bandwidth and cost vs.
    >> watchability. Apparently, were humans 25% quicker at perception,
    >> the continent might have gone progressive scan from the
    >> beginning!

    >
    > Hmm. I don't see how this follows. If we suppose the 40 Hz figure
    > holds true, 1.25 * 40 = 50 Hz. The current refresh rate, and
    > temporal rate, for "real" video shot with an interlacing video
    > camera is about 60 Hz in the US. What kind of progressive system
    > are you suggesting for these hypothetical humans with a 50 Hz
    > visual perception (however "visual perception" would be defined in
    > this context)?
    >


    Oh, I was being facetious - as humans will always be slower than
    1/50 sec perception. But my guess is that, were it otherwise, since
    Europe was already on a 50 Hz electrical standard, they might have
    been able to implement progressive scanning TV instead of
    interlaced. I guess I needed a smiley thingy there. Sorry ;-0)

    >> Regarding TV phosphors:
    >> By the late 60's and very early 70's three methods of placing
    >> alpha-numerics onto video screens were:

    >
    > "Placing alpha-numerics onto video screens" means computer
    > monitors in
    > this context, right?
    >


    Yes! Though, as I recall, equipment to provide "real-time" text
    overlays to video camera images was emerging at the same time -
    using much larger fonts and aided by moving backgrounds. 5x7 in
    7x9 at 80x25 characters on a black background made for "tight"
    specs back then.

    By the way, the same lab at which I worked in the 60's had a
    square sheet of metal oxide coated Mylar spinning on a platter
    under a record/playback head. Their thought was sports
    broadcast "instant playback" and NOT what eventually became the
    vast market of CD/DVD.

    Video tape did not lend itself to "instant replay" or anything
    like what we have come to know in contemporary sports-replay
    selection and composition.

    >> Stroke writer (IBM - white), sawtooth (DataPoint - green), raster
    >> scan (aka Standard TV, by Hazeltine - gold). Of those, the latter
    >> was the most economical because of the mass production of the TV
    >> subassembly. However, standard TV bottles - as produced by Ball
    >> (the Mason Jar folk) at that time - had major problems due to the
    >> fast decay time of their white phosphor. The home TV was usually
    >> viewed under incandescent lighting, while office alpha-numeric
    >> monitors were typically viewed under fluorescent lighting. At
    >> that time a longer decay time was the solution and a goldish
    >> colored phosphor was mixed which satisfied all
    >> memory/refresh/lighting conditions except "smooth scrolling."
    >> Viewed under incandescent lights, the phosphor persistence was
    >> more noticeable.

    >
    > What you appear to be saying is that in the late 60s (or early
    > 70s), when the first "glass terminals" started to appear, standard
    > tv phosphors were already fast-decaying and appeared a bit too
    > flickery for computer use - at least at typical tv refresh rates.
    > Right?
    >


    Yes. But recall that this was not obvious until viewed in ambient
    fluorescent lighting. Not the TV watcher's typical environment.

    > But in the historical context, it would be more interesting to
    > know how fast or slow decaying the phosphors were in the 1930s
    > when interlaced scanning was invented.


    My memory is not that good. I think you touched on it in another
    post. Visibly decayed below perception in no more than the vertical
    retrace time (less than 1/60th second, no?). Otherwise, the image
    would have looked similar to today's display of interlaced images
    on digital screens.

    >> It was about 10 years before all the ingredients came together
    >> to render "smooth scrolling" on raster TVs with short persistence
    >> phosphors, highly repeatable deflection, and fast enough memory
    >> systems.

    >
    > That's all true - for computer monitors. But if standard tv
    > phosphors were already too flickery (too fast-decaying) for
    > computer use at tv refresh rates by the late 60s, or early 70s,
    > does that not mean that they were fast enough for scrolling or
    > panning without smearing the picture (in tv use, that is)?


    Yes and no! In that era, neither inexpensive memory systems nor
    inexpensive deflection yokes were up to the task of repeatable
    (hitting the same spot on the phosphor) dot drawing at high rates
    (hmmm, perhaps any rate?). Don't forget that the FAA was already
    using drum-memory-driven 1000x1000 monitors at higher refresh rates
    with better beam deflection to present graphics and alpha-numerics
    at in-flight control centers. Large screen projection was a reality
    too; schlieren - I think - was the art of the day. And this is
    really stretching my recall - it was an oil-based screen written to
    by a cathode ray. Later Fresnel got into the act.

    Anyway, back to phosphor decay. When faster memory came along and
    the "art" of controlling the beam economically advanced, fast decay
    was necessary to implement "smooth scroll." This was a huge leap
    forward in the human interface, as previously the "moving"
    alpha-numerics were unreadable until NOT scrolling.

    To some degree, Hazeltine was restrained by technology in which
    they held patents and expertise - TV and magnetic memory among
    them. When they finally accepted volatile memory as a
    substitute for iron cores and LSI for discretes, they began to
    move forward more quickly.

    DataPoint, with their non-TV approach, implemented "smooth scroll"
    earlier than Hazeltine - and perhaps spurred Hazeltine's further
    efforts in raster control. IBM, at that time, only used page
    presentation, with a limited number of characters, and never
    attempted to scroll.

    By the time we got to the 90's CRTs for PCs, 60 Hz was no longer
    a constraint and mass production of these newer designs allowed
    for similar economies of scale as realized in the 60's.

    We've come a long way, as once upon a time we could be sterilized
    by sitting too close to our GE color TV ;-0)
    Admittedly, I tried - it didn't work :-(
    Bill's News, Nov 18, 2006
    #10
