It's time for 29/30fps to die the death it has long deserved

Discussion in 'Digital Photography' started by RichA, May 28, 2014.

  1. RichA

    RichA Guest

    I can't take the blurring that happens at even 720p or higher whenever there's any motion; it just looks awful. Ever look at video closely? One second, a stationary face is clear, good resolution. As soon as the person on film/video moves, even slightly, the resolution is gone. Look at water with any kind of movement (waves, wavelets) and see the patches of blur that appear because the frame rate or data rate just can't keep up with the "information."
    This does not happen in real life; the human eye can perceive resolution (from what I've seen) of objects at 1/1000th of a second. It's possible it's even higher than that.
    The other effect is a juddering of movement that happens because of too few frames and/or the way they drop frames in broadcasting. Things don't flow smoothly like they did on analog TV broadcasts; moving objects seem to jerk forward in short jumps.
    They HAVE to shift to at least 60fps. Even 120fps would be advisable if possible. Otherwise, what is the POINT in having 4K displays at all, unless you intend to stare at still shots only? My only fear is that since the level of data transfer already taxes broadcasting and portable sources of video, things could look even worse!
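    (A minimal back-of-the-envelope sketch in Python, not from the thread, of why the data-transfer worry scales with frame rate; the 8-bit 4:2:0 figure is an assumption, and real broadcasts are compressed far below these raw numbers.)

    # Illustrative only: uncompressed bit rate scales linearly with frame rate.
    def uncompressed_mbps(width, height, fps, bits_per_pixel=12):
        # 12 bits/pixel assumes 8-bit 4:2:0 chroma subsampling
        return width * height * bits_per_pixel * fps / 1e6

    for fps in (30, 60, 120):
        print(f"4K (3840x2160) at {fps} fps: "
              f"{uncompressed_mbps(3840, 2160, fps):,.0f} Mbit/s")
    # 30 fps -> ~2,986 Mbit/s; 60 fps -> ~5,972; 120 fps -> ~11,944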
     
    RichA, May 28, 2014
    #1

  2. RichA

    Whisky-dave Guest

    This is one case where analogue is better than digital, and one of the reasons I kept my CRT TV until a couple of years ago.
    In some shops in the UK they won't let you see broadcast TV on 4K displays because it looks so bad; what they offer instead is test movies and demo modes.

    http://gizmodo.com/what-is-the-resolution-of-the-human-eye-1541242269
    It's a complicated question, one that must take into account the peculiar anatomy of the eye which is different than the less peculiar engineering of a digital camera. As such, it's worth watching all ten minutes of the video, explaining not only how we see but also how well. Spoiler: the human eye is 576 megapixels--but really only about 7 megapixels matter. [YouTube]
     
    Whisky-dave, May 28, 2014
    #2

  3. RichA

    Rikishi42 Guest

    Yes, because more is better.... don't go there, please.

    Are you a pigeon, perhaps?
    They can scan up to 250 fps, so I heard somewhere.

    But we humans, we're stuck at 20, at most. So 25 fps is more than we need. If a still image of a static scene can be good, why would the equally still frame of a moving subject be blurry?


    I do agree with you about the poor quality of motion in some videos, but that is not the result of a lack of resolution or a slow framerate. Poor rendering of the images at production/conversion is a more likely suspect.
     
    Rikishi42, May 28, 2014
    #3
  4. RichA

    Joe Kotroczo Guest

    What you're describing are compression artifacts, which depend on whatever compression algorithms were used in the signal chain.

    Good old-fashioned cinema is and always has been 24fps; did it bother you there?

    The Hobbit was shot at 48fps and looked shit.
     
    Joe Kotroczo, May 28, 2014
    #4
  5. RichA

    Sandman Guest

    Motion blur in video does not come from the framerate but from the shutter speed, you know - just like in a real camera. 24 frames per second has been the standard movie framerate since forever.

    The human eye can in the best of situations perceive some 10 or 12 separate frames per second. 24 fps is more than adequate to create smooth motion.

    You should watch an old animated movie in 1080p, like Snow White, which is
    drawn at 24 fps, and I challenge you to find any blur. Or go watch a stop
    motion movie, like Wallace and Gromit, also 24fps and totally devoid of
    blur.
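    (A minimal sketch in Python, my own illustration rather than anything from the thread: the blur trail depends on how far the subject moves while the shutter is open. Under the common 180-degree shutter convention the exposure is half the frame period, which is why shutter speed and frame rate often get conflated.)

    # Illustrative: blur length is set by exposure time, not directly by frame rate.
    def blur_pixels(speed_px_per_s, fps, shutter_angle_deg=180.0):
        exposure_s = (shutter_angle_deg / 360.0) / fps   # 180 degrees -> half the frame period
        return speed_px_per_s * exposure_s

    # A subject crossing a 1920-pixel frame in one second:
    for fps in (24, 60, 120):
        print(f"{fps} fps, 180-degree shutter: {blur_pixels(1920, fps):.0f} px of blur")
    # 24 fps -> 40 px, 60 fps -> 16 px, 120 fps -> 8 px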
     
    Sandman, May 29, 2014
    #5
  6. RichA

    David Taylor Guest

    What every post I've seen so far has missed is the effect of the viewing
    environment. In a cinema, the ambient light level is very low, and the
    eye's response time is proportionately longer, allowing the use of lower
    frame rates. Put the display in a daylight environment (fighter cockpit
    for example) and you need a much higher frame rate. I recall 120 Hz
    being talked about for CRT displays.
     
    David Taylor, May 29, 2014
    #6
  7. RichA

    RichA Guest

    Forget compression artifacts and dropped frame effects.
    30fps won't keep action "contained." It's not flicker, it's actual movement between frames onscreen. That's why the NFL shifted to a higher frame rate for slow-motion playback: it's impossible to show the action clearly otherwise. However, some people may not see this happening. In the theatre, I used to be able to perceive when a projector bulb was about to go because of small variations in its light output, but other people didn't notice. Some people's brains probably "fill in" the missing steps so the action looks normal. Kind of like how a DVD player will correct errors.
     
    RichA, May 29, 2014
    #7
  8. RichA

    RichA Guest

    If 30fps were good enough, we wouldn't have any need for 120Hz EVF displays in cameras.
     
    RichA, May 29, 2014
    #8
  9. RichA

    David Taylor Guest

    On 29/05/2014 07:26, RichA wrote:
    []
    It depends. For 30 fps you would need 90 fps if the display were frame-sequential colour. But as I said, bright ambient lighting may well
    require a higher frame rate.

    Of course, there may also be a compromise with battery life....
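    (A tiny illustration in Python of the frame-sequential-colour arithmetic above; my own sketch, assuming three colour fields per frame.)

    # Illustrative: a frame-sequential-colour display shows red, green and blue
    # fields one after another, so the field rate is three times the frame rate.
    frame_rate = 30
    colour_fields = 3                     # R, G, B shown in sequence
    print(frame_rate * colour_fields)     # 90 fields per second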
     
    David Taylor, May 29, 2014
    #9
  10. RichA

    Peter Irwin Guest

    Motion is a separate issue from flicker. Silent films were standardised
    at 16 frames per second with each frame projected three times. In
    practice many silent films of the 1920s were shot and intended to
    be projected at a slightly higher rate. When optical soundtracks
    were introduced the frame rate was increased to 24fps primarily to
    improve sound quality because of the limitations of the materials
    of the time. But sound films are projected only twice per frame,
    so the flicker rate is unchanged at 48Hz.

    Peter.
    --
     
    Peter Irwin, May 29, 2014
    #10
  11. RichA

    Sandman Guest

    No, that's not the case at all. 120Hz isn't the frame rate, it's the
    refresh rate of the CRT screen. Most film projectors have a frame rate of
    24 frames per second, but each frame is illuminated two or three times, so
    the refresh rate is either 48Hz or 72Hz.

    On a CRT TV, the frame rate of the content is usually based on either PAL
    or NTSC.

    PAL is 25 frames per second, and 50Hz, so each frame is "displayed" twice.
    NTSC is 30 frames per second, and 60Hz, also displayed twice.

    A 120Hz NTSC TV displays each frame four times.

    This eliminates any perception of flicker, but tells you nothing about how smooth the motion of the actual video content is.
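    (A minimal sketch in Python of the arithmetic in this post, my own illustration using the rounded 30/60 NTSC figures quoted above.)

    # Illustrative: how many times each source frame is shown per refresh cycle.
    def repeats_per_frame(refresh_hz, frame_rate_fps):
        return refresh_hz / frame_rate_fps

    print(repeats_per_frame(48, 24))    # film projector, 2-blade shutter -> 2.0
    print(repeats_per_frame(72, 24))    # film projector, 3-blade shutter -> 3.0
    print(repeats_per_frame(50, 25))    # PAL            -> 2.0
    print(repeats_per_frame(60, 30))    # NTSC (rounded) -> 2.0
    print(repeats_per_frame(120, 30))   # 120Hz NTSC TV  -> 4.0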
     
    Sandman, May 30, 2014
    #11
  12. RichA

    Sandman Guest

    Which is why we don't. EVFs in cameras aren't CRT displays. LCD refresh times are measured in milliseconds, not hertz.
     
    Sandman, May 30, 2014
    #12
  13. RichA

    Sandman Guest

    High-framerate cameras don't make the movie play back at a higher framerate; they just slow down the movements, so people move slower and it's thus no longer an action shot.
    Not "kind of" like that at all. The human eye, yours included, does "fill in" the gaps between frames, which of course isn't needed since a frame rate of 24fps is higher than the eye can discern, as has been said.
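    (A minimal sketch in Python of the slow-motion point, my own illustration: shooting at a high frame rate and playing back at a normal one stretches time rather than speeding anything up.)

    # Illustrative: the slow-motion factor is just capture rate over playback rate.
    def slowdown_factor(capture_fps, playback_fps):
        return capture_fps / playback_fps

    print(slowdown_factor(120, 30))   # 4.0  -> one real second plays over 4 s
    print(slowdown_factor(240, 24))   # 10.0 -> one real second plays over 10 s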
     
    Sandman, May 30, 2014
    #13
  14. RichA

    David Taylor Guest

    On 30/05/2014 08:19, Sandman wrote:
    []
    Well, actually in the application in question, there was no NTSC video
    involved at all. The CRT refresh rate is indeed what was being described, and this was because of the ambient lighting level and the desire to avoid visible flicker. My point is that flicker depends on the eye's response time, which is heavily dependent on the ambient light level, and that is grossly different between a darkened cinema and an aircraft cockpit. Different situations, different requirements.

    I do accept the point that interlaced systems can show motion blurring,
    but there may be times when 50i is preferable to 25p.
     
    David Taylor, May 30, 2014
    #14
  15. RichA

    Sandman Guest

    No, the human eye would not find 24fps "blurry" in either a dark or
    brightly lit environment.

    If the video is blurry, it's due to shutter speed and not the frame rate.

    Now, 3D video is another story: 3D video shown at 24fps with a 48Hz refresh rate shows each frame only once per eye, which means that some people do experience blurriness, partly due to how 3D glasses work.
     
    Sandman, May 30, 2014
    #15
  16. RichA

    Whisky-dave Guest

     
    Whisky-dave, May 30, 2014
    #16
  17. RichA

    David Taylor Guest

    On 30/05/2014 15:25, Sandman wrote:
    []
    You miss that I am talking about flicker as one driver for high refresh
    rates, not motion blur.

    I have no interest in 3D.
     
    David Taylor, May 30, 2014
    #17
  18. RichA

    Jeff Guest

    The response time is how long it takes for the liquid crystals to
    transition from one frame to the next. The refresh rate is how often the
    frame changes. In your 50Hz (20ms) example, the display changes for 2 ms
    and holds that frame for 18 ms until the next frame change begins.
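    (A minimal sketch in Python of the 50Hz example above, my own illustration of how the frame period splits into the pixel transition and the hold.)

    # Illustrative timing for a 50 Hz refresh with a 2 ms panel response time.
    def frame_timing(refresh_hz, response_ms):
        period_ms = 1000.0 / refresh_hz
        hold_ms = period_ms - response_ms
        return period_ms, hold_ms

    period, hold = frame_timing(50, 2)
    print(f"period {period:.0f} ms, transition 2 ms, hold {hold:.0f} ms")
    # -> period 20 ms, transition 2 ms, hold 18 ms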
     
    Jeff, May 30, 2014
    #18
  19. RichA

    Sandman Guest

    If you think 1850's is "recent"...
    Those early 19th century books and their terminology...
    Meters per second? "Millisecond" is "ms".
    "ms" is the amount of time it takes a LCD pixel to go from one value to
    another and back again. Hertz is how many times the entire screen is
    redrawn. They're entirely different, and one frequency measurement can't be
    used to describe the other.
     
    Sandman, May 30, 2014
    #19
  20. RichA

    Sandman Guest

    Yes, it was the OP that talked about motion blur, I was just keeping on topic.

    Your eye wouldn't be more or less susceptible to noticing CRT flicker depending on environmental lighting either.
     
    Sandman, May 30, 2014
    #20