Why does Digital Video look so terrible?

Why do movies look smooth at 24 fps, but video games look terrible at 24 fps? Is it because of motion blur?

  • If yes, how does that affect video games vis-à-vis movies?

  • Answer:

    Back in the early days of cinema, 16 fps was found to be the minimum speed at which you could record a motion picture and have it play back with the appearance of motion instead of staggered still images. With the invention of sound this had to be bumped to 24 fps to accommodate the limitations of early audio recording technology. So that's where the 24 fps "standard" comes from.

    There are a number of reasons we experience framerate differences between movies and video games. Motion blur (as you guessed) is indeed a big one, but so are audience interaction and the resolution of the recording medium. Let's look at Eadweard Muybridge's historic 1878 study of a running horse: http://en.wikipedia.org/wiki/Eadweard_Muybridge Even though this animation only uses 11 "frames", it was recorded from a subject that had an infinite number of potential frames. Because the camera and the subject are separate, there are all kinds of tiny cues that help "smooth out" the effect of motion. Because the camera records at a fixed shutter speed there is always some motion blur, which increases as the subject speeds up. This also creates a natural "frame interpolation" (or interlacing: http://neuron2.net/LVG/interlacing.html) effect.

    Video game graphics engines, on the other hand, aren't recording real-life moving images; they're essentially highly sophisticated automated animation. Computer game models are puppets that aren't moving in actual real time; they're mathematical approximations of movement, trying to come as close as possible to guessing what the "key information" to show a player is. This approach has some huge advantages: things like complicated lighting effects, shiny surfaces, and texture mapping are all much easier to apply to a series of "still frames". But it can lead to results that look frozen and "jittery" when they are stitched together to simulate movement.
    http://www.mobygames.com/game/windows/need-for-speed-most-wanted-black-edition

    Of course, video game artists are always working to improve the engines they work with, and the latest games do their best to include as many of those "subconscious cues" as possible, including motion blur, artificial grain, and simulating the way an actual camera would record a fully moving subject. http://www.reddit.com/r/explainlikeimfive/comments/1k546z/eli5_why_is_it_that_movies_look_unbelievably/

    However, our brains are really *really* good at interpreting motion, and they let us know when there are patterns or artifacts in our sensory input that don't match what we expect to see in real life. And we're *far* more critical of material we're directly interacting with (like video games) than material we're passively observing (like movies, animated programs, or flip books). Part of this is conditioning, and part of it is the http://en.wikipedia.org/wiki/Uncanny_valley effect that all CG artists battle against: the closer you get to "realistic", the harder it is to bridge that final gap.

    A great example of this convergence is the ongoing discussion about "The Hobbit" shooting in HFR (High Frame Rate, in this case 48 fps). Even though that approach technically provided twice the visual information to the audience, the large amount of CG in the film (and the unfamiliar experience for the audience) left many people complaining it felt "wrong" and "cheap" and "video-gamey".
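The fixed-shutter blending described in this answer can be sketched in a few lines. This is an illustrative toy (the functions, subframe count, and speed value are my assumptions, not from the answer): a film camera averages the subject's position over the whole shutter-open interval, while a game renderer samples one instant.

```python
# Toy sketch of film blur vs. an instantaneous game render.
# All names and numbers here are illustrative assumptions.

def film_frame_position(start, speed, shutter_open_s, subframes=8):
    """Blend the subject's position across the shutter interval, as film does."""
    dt = shutter_open_s / subframes
    samples = [start + speed * dt * i for i in range(subframes)]
    return sum(samples) / subframes  # the blurred, averaged position

def game_frame_position(start, speed):
    """Sample a single instant, as a renderer does: no interval, no blur."""
    return start

# 24 fps with a 180-degree shutter: the shutter is open 1/48 s per frame.
film = film_frame_position(start=0.0, speed=480.0, shutter_open_s=1/48)
game = game_frame_position(start=0.0, speed=480.0)
```

The film "frame" ends up smeared partway along the subject's path, which is exactly the in-between information a single instantaneous game frame lacks.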

Brad Fox at Quora


Other answers

There is a big difference in the feel of gameplay when input and response happen only 24 times per second vs. 60 times per second, especially in fast-paced games such as first-person shooters. Network buffers and input buffers are filled on separate threads, which means new state from the game server, or button presses from your gamepad, must wait until the next iteration of the game engine's "update loop". This wait can be as long as 42 ms at 24 updates per second, but only 17 ms at 60 updates per second. That's roughly a 25 ms difference, or about 25% of the "lag" difference we experience between a 150 ms server connection and a 50 ms one.

Cameras in the real world have a shutter, which is open for a continuous range of time determined by the "shutter angle" or "shutter speed". For instance, a motion picture captured at 24 frames per second might have the shutter open for 0.02083 seconds per frame (1/48 of a second, or a 180° shutter angle). This continuous interval captures and blends all motion happening within it, producing what we see as motion blur.

Games, on the other hand, render only an instantaneous moment of time. There is no equivalent interval where motion is recorded and blended; instead you get what is essentially a crystal-clear sample of the world at a particular instant -- something that is not possible in the real world. Because no motion is recorded in the rendered frame, movement on screen can look jerky unless the frame rate is increased to compensate (by capturing more of the in-between motion). By increasing the frame rate you essentially converge on real-life "frame rates", leaving only the biological motion blur we get from our eyes (which are like shutters that are always open). Though modern games do feature "motion blur", it only approximates blur under certain assumptions and does not (yet) fully recreate the motion blur we see in film or in high-quality CGI renderings.
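The two calculations in this answer, worst-case input wait per update rate and shutter-open time per shutter angle, can be written out directly. A minimal sketch (function names are mine, the numbers follow the answer):

```python
# Worst-case wait for an input event that arrives just after an update tick:
# it sits in the buffer for one full update interval.
def worst_case_wait_ms(updates_per_second):
    return 1000.0 / updates_per_second

wait_24 = worst_case_wait_ms(24)   # ~41.7 ms
wait_60 = worst_case_wait_ms(60)   # ~16.7 ms
extra_lag_ms = wait_24 - wait_60   # ~25 ms of added worst-case input lag

# Film shutter-open time: the shutter-angle fraction of one frame interval.
def shutter_open_seconds(fps, shutter_angle_deg=180.0):
    return (shutter_angle_deg / 360.0) / fps

open_24 = shutter_open_seconds(24)  # 1/48 s, about 0.0208 s per frame
```

So a 24 Hz update loop can add about 25 ms of worst-case input latency over a 60 Hz one, independent of network lag.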

Animesh Pandey

The brain is an amazing thing! Our eyes work at a frequency of about 60 Hz. Something that cycles at a lower rate appears to flicker or pulse; things that cycle faster appear constant. We cannot perceive cycles faster than 60 Hz, which is equivalent to 60 fps in projected media. Dogs perceive at 80 Hz, cats at 60 Hz. This is why cats will be drawn to a television screen while dogs will ignore it; to them it doesn't look real.

Just think of what happens when you photograph a cathode-ray-tube television: it looks like there are grey bars across the screen. You can get rid of this effect by taking the picture at 1/60 of a second. Most flat-screen televisions have a much higher refresh rate, usually a multiple of 60 Hz (60, 120, 240, 600 Hz). These higher rates do help reduce motion blur because the image is refreshed faster than we can perceive.

24 fps is less than half the speed at which our visual system functions. On an analog device this may make the image appear soft, slightly out of focus, or flickery; on a digital device it will definitely flicker. Our brains compensate by "filling in" the missing bits, appearing to smooth it out. If we pay attention to the image quality, rather than the content of the image, we will see the flicker clearly. Indeed, once you notice it, it can be hard to ignore!

In a video game, where motion and tracking are so important, the gaps in the projected image frustrate us. It is like someone putting a grey piece of paper in front of our eyes 24 times a second: we lose track of the object, but it hasn't moved during the instant we are blocked, so our brains try to fill in a gap that isn't there. There is a reason nobody over the age of 12 wants to play on a Nintendo Game Boy!

David Moore

It's because in film, the camera normally does not exceed a set pan speed, so as not to break the illusion of movement the viewer creates from the still frames. In first-person video games, players are in control of the camera and often turn rapidly. This shatters the illusion of movement at anything under 50–60 fps. In VR, this leads to feeling sick; on a TV, it merely leads to feeling uneasy.
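The pan-speed point is easy to quantify. A rough sketch (the 180 deg/s turn rate is my illustrative assumption, not from the answer): the faster the turn and the lower the frame rate, the bigger the angular jump between consecutive frames.

```python
# How far the view "jumps" between frames during a constant-speed camera turn.
# The turn rate used below is an illustrative assumption.

def degrees_per_frame(turn_speed_deg_per_s, fps):
    """Angular displacement between two consecutive frames."""
    return turn_speed_deg_per_s / fps

# A brisk 180 deg/s turn, of the kind common in first-person shooters:
jump_24 = degrees_per_frame(180, 24)  # 7.5 degrees between frames
jump_60 = degrees_per_frame(180, 60)  # 3.0 degrees between frames
```

A 7.5° jump per frame is a large fraction of a typical field of view, which is why fast player-driven turns read as stutter at 24 fps while a slow cinematic pan does not.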

Luke Wood

Movies in an old-fashioned cinema have intervals of black between the frames. This rapid strobe can be disturbing, but it also gives a good illusion of smooth motion at low frame rates. Our brains fill in the black gaps between frames, so we see smooth motion, whereas with modern LCD monitors and digital movie projectors the frames persist, and we can see objects jerking from frame to frame unless the frame rate of the video is very high.

Sam Watkins
