It's time for 29/30fps to die the death it has long deserved

Discussion in 'Digital Photography' started by RichA, May 28, 2014.

  1. RichA

    David Taylor Guest

    David Taylor, May 30, 2014

  2. RichA

    Peter Irwin Guest

    The unit, not the man, is fairly recent.

    The unit was proposed in 1930 by the IEC, but it did not become standard
    until 1960 when it was adopted as an SI unit by the CGPM.

    Practically all textbooks and technical journals published before
    1960 used c/s or cycles per second instead of Hz or hertz. Most
    changed during the 1960s, although I wouldn't be too surprised to see
    the old style in something printed as late as the 1970s.

    (Nowadays it marks someone as very old fashioned in the same way
    that using "mikey mike" for micro-microfarad or using mho instead
    of siemens do.)

    Sure they can. A cycle length of one millisecond is the same thing
    as a cycle rate of one kilohertz which in turn is the same thing
    as a rate of 360 000 degrees per second, or two thousand pi radians
    per second or 60 000 revolutions per minute. You can express it any
    way that suits your purpose. It might sound strange to say that
    your record player spins at .55 Hz or has a cycle length of 1.8
    seconds, but they can be perfectly sensible ways of putting it
    in some contexts.
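
    Peter Irwin's equivalences above are just unit conversions; a short
    sketch of the same arithmetic (the numbers are from his post, the
    script itself is only illustrative):

    ```python
    import math

    # One cycle rate expressed several equivalent ways, using 1 kHz:
    freq_hz = 1000.0                       # cycles per second
    period_ms = 1000.0 / freq_hz           # cycle length in milliseconds
    deg_per_s = freq_hz * 360.0            # degrees per second
    rad_per_s = freq_hz * 2.0 * math.pi    # radians per second (2000*pi)
    rpm = freq_hz * 60.0                   # revolutions per minute

    print(period_ms, deg_per_s, rad_per_s, rpm)

    # The record-player example, the other way around (33 1/3 rpm):
    turntable_hz = (100.0 / 3.0) / 60.0    # about 0.556 Hz
    turntable_period = 1.0 / turntable_hz  # 1.8 seconds per revolution
    print(round(turntable_hz, 3), turntable_period)
    ```

    All five figures describe the same rotation; only the unit changes.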

    Peter Irwin, May 30, 2014

  3. RichA

    David Taylor Guest

    On 30/05/2014 21:55, Alan Browne wrote:
    The other characteristic of CRT displays was the phosphor persistence,
    and a long-persistence phosphor would reduce flicker. The persistence
    could be chosen to match the application.
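
    One way to see why persistence matters: a toy model (my own
    assumption, not from the thread) where phosphor brightness decays
    exponentially with time constant tau after each beam pass. The
    modulation depth between refreshes is then a crude proxy for visible
    flicker, and a longer-persistence phosphor flickers less:

    ```python
    import math

    def modulation_depth(refresh_hz, tau_ms):
        """Fractional brightness drop between refreshes, assuming
        exponential phosphor decay with time constant tau_ms."""
        T = 1000.0 / refresh_hz        # refresh interval in ms
        return 1.0 - math.exp(-T / tau_ms)

    # Short, medium, and long persistence at a 50 Hz refresh:
    for tau in (1.0, 5.0, 50.0):
        print(f"tau={tau:>5} ms -> depth at 50 Hz: "
              f"{modulation_depth(50, tau):.3f}")
    ```

    At 50 Hz the refresh interval is 20 ms, so a 1 ms phosphor has
    decayed almost completely between passes, while a 50 ms phosphor
    retains most of its brightness - matching the point that a
    long-persistence phosphor reduces flicker.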

    Agreed on update rate. I would have thought that the problem with some
    modern vision systems was the lag caused by digital processing - that's
    certainly been one objection to electronic viewfinders. I've recently
    moved to a camera with an EVF, but not yet needed to take any action
    photos. Too early for me to make a judgement.
    David Taylor, May 31, 2014
  4. RichA

    Whisky-dave Guest

    No, I think 1850 is "relatively recent", and not "recent", when referring to the history of electricity and its units.

    which went through to the sixties.

    yep sorry but you should also have known
    Whisky-dave, Jun 2, 2014
  5. RichA

    Sandman Guest

    Uh, that chart actually confirms what I wrote. It shows where the test
    subjects could see flicker in different lighting situations. As you can see,
    no one could detect flicker over 50Hz in any lighting situation, and since all
    CRT flicker is at 50Hz or over, your eyes would never detect it, regardless
    of lighting.
    Sandman, Jun 3, 2014
  6. RichA

    David Taylor Guest

    On 03/06/2014 08:52, Sandman wrote:
    Your statement: "Your eye wouldn't be more or less susceptible to
    noticing CRT flicker depending on environmental lighting either." was
    incorrect, unless you add the 50 Hz caveat.

    Even then, that's just one example. Flicker has been reported on 60 Hz
    displays as well, and likely on higher frequencies too. Fortunately, or
    not, LCD displays have different characteristics.
    David Taylor, Jun 3, 2014
  7. RichA

    Sandman Guest

    That's implicit in the "CRT" part; all CRTs run at 50Hz or more.
    By whom and when?
    Sandman, Jun 3, 2014
  8. RichA

    android Guest

    Good q. Of course considering the whole chain of equipment and processes
    you can get flicker in the output from a 200Hz OLED...
    android, Jun 3, 2014
  9. RichA

    Whisky-dave Guest

    Whisky-dave, Jun 3, 2014
  10. RichA

    android Guest

    android, Jun 3, 2014
  11. RichA

    PeterN Guest

    Many moons ago, Lanier was one of the leading publishers of their
    proprietary word processing system. When they came out with a newer
    model, (which you had to buy to get continuing support,) I "upgraded."
    The CRT screen flickered under fluorescent lighting, to a point where it
    was unusable. I replaced it with an IBM running WordPerfect. Neither my
    IBM nor Apple II monitors flickered. Your explanation was the likely
    reason. That Lanier screw-up also saved me a lot of money. The Lanier
    system cost about ten grand; the IBM, with WordPerfect, considerably less.
    PeterN, Jun 3, 2014
  12. RichA

    Sandman Guest

    PAL is 50Hz, NTSC is 60Hz
    The frame rate has no relevance to the topic of flicker. NTSC is 60Hz, which is
    far beyond the human eye's capability to detect flicker.
    As I said - 50 fields per second is more than sufficient to avoid any detection
    of flicker by the human eye.
    No, only the refresh rate will be seen as flicker, and only if it is low enough.
    The flicker fusion threshold is 16Hz, and both PAL and NTSC far exceed that.

    It is true, as should be noted here though, that our peripheral vision is
    more likely to detect flicker. Back in the day, CRT displays with high
    refresh rates arose when people started using large computer monitors
    positioned close to their eyes, so their peripheral vision could detect a
    very slight flicker. That's when you started seeing 100Hz and 120Hz
    monitors - and later, TVs - even though you didn't really need it for a TV,
    since no part of it was ever in your peripheral vision.
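
    The argument above reduces to comparing a display's field rate against
    a fusion threshold. A sketch of that comparison - the 16 Hz figure is
    from the post, but the helper names and the 60 Hz peripheral figure
    are my own rough assumptions:

    ```python
    # Hypothetical helper comparing common field rates against flicker
    # fusion thresholds. Central-vision threshold is the 16 Hz figure
    # quoted above; the peripheral figure is an assumed, rougher bound
    # for a large monitor viewed up close.
    FUSION_CENTRAL_HZ = 16
    FUSION_PERIPHERAL_HZ = 60   # assumption, not from the thread

    def flicker_visible(field_rate_hz, peripheral=False):
        threshold = FUSION_PERIPHERAL_HZ if peripheral else FUSION_CENTRAL_HZ
        return field_rate_hz <= threshold

    for name, rate in [("PAL", 50), ("NTSC", 60), ("100 Hz monitor", 100)]:
        print(name, rate, "central:", flicker_visible(rate),
              "peripheral:", flicker_visible(rate, peripheral=True))
    ```

    With these numbers, both PAL and NTSC clear the central-vision
    threshold easily, while only the 100Hz-class monitors clear the
    assumed peripheral one - which is the shape of the argument above.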
    Sandman, Jun 3, 2014
