A story about the sense of sight, frame perception and refresh rates, motion blur and television screens. (See also the translation of another article by the same author, "The Illusion of Speed" - translator's note.)
Introduction
You may have heard the term frames per second (FPS), and that 60 FPS is a good target for any animation. Yet most console games aim for 30 FPS, and movies are usually shot at 24 FPS, so why should we aim for 60 FPS?
Frames ... per second?
The early days of filmmaking
Filming of the 1950 Hollywood film "Julius Caesar" starring Charlton Heston.
When the first filmmakers began making movies, many discoveries were made not by the scientific method but by trial and error. The earliest cameras and projectors were cranked by hand, and film stock was very expensive - so expensive that filmmakers shot at the lowest frame rate they could get away with, just to save film. That threshold usually lay between 16 and 24 FPS. When an audio track was added to the physical film strip and played back in sync with the picture, hand cranking became a problem: people tolerate a variable frame rate in video fairly well, but not in sound (where both tempo and pitch shift), so cinematographers had to settle on a constant speed for both. They chose 24 FPS, and now, almost a hundred years later, it remains the standard in cinema. (In television, the frame rate had to be changed slightly because of how CRT TVs synchronize with the frequency of the power supply.)
Frames and the human eye
But if 24 FPS is barely acceptable for a movie, what is the optimal frame rate? That is a tricky question, because there is no single optimal frame rate.
Motion perception is the process of inferring the speed and direction of elements in a scene based on visual, vestibular and proprioceptive input. Although the process appears straightforward to most observers, it has proven to be a difficult problem from a computational standpoint and extremely difficult to explain in terms of neural processing. - Wikipedia
The eye is not a camera. It does not perceive motion as a series of frames; it takes in a continuous stream of information rather than a collection of individual pictures. Why, then, do frames work at all?
Two important phenomena explain why we see motion when looking at rapidly changing images: persistence of vision and the phi phenomenon (a stroboscopic illusion of continuous motion - translator's note).
Most filmmakers believe persistence of vision is the sole cause, but it is not. Persistence of vision, a phenomenon confirmed by observation though not rigorously proven, holds that an afterimage persists on the retina for roughly 40 milliseconds. It explains why we do not see dark flicker in cinemas or (usually) on a CRT.
The phi phenomenon in action. Did you notice motion in the picture, even though nothing in it actually moves?
Others, in turn, consider the phi phenomenon to be the true reason we see motion in a sequence of still images. It is an optical illusion of continuous movement between separate objects shown in quick succession. But even the phi phenomenon has been called into question, and scientists have not reached a consensus.
Our brain is very good at faking motion - not perfectly, but well enough. A series of still frames imitating movement creates different perceptual artifacts in the brain depending on the frame rate. So no frame rate will ever be truly optimal, but we can get close to the ideal.
Standard frame rates, from bad to perfect
To better understand the quality scale of frame rates, take a look at the overview table below. But remember that the eye is a complex system that does not register individual frames, so this is not exact science, just observations made by many people over the years.
Frame rate - Human perception
10-12 FPS - The absolute minimum for conveying motion. Anything lower is recognized by the eye as a series of separate images.
< 16 FPS - Visible stutter appears; many people get headaches at such frame rates.
24 FPS - The minimum tolerable frame rate for motion perception; cost-effective.
30 FPS - Much better than 24 FPS, but still not lifelike. The standard for NTSC video, owing to the AC power-line frequency.
60 FPS - The sweet spot of perception; most people will not notice any further quality improvement above 60 FPS.
∞ FPS - To date, science has been unable to prove, or detect by observation, a theoretical limit of human perception.
Note: although 60 FPS is considered a good frame rate for smooth animation, it is still not the end of the line for picture quality. Perceived contrast and sharpness can still improve beyond this value. A number of studies have examined how sensitive our eyes are to changes in brightness; they showed that subjects could spot a single white frame among thousands of black ones. If you want to dig deeper, here are a few resources, and more.
Demo: what does 24 FPS look like compared to 60 FPS?
60vs24fps.mp4. Thanks to my friend Mark Tonsing for making this fantastic comparison.
HFR: rewiring the brain with The Hobbit
The Hobbit was a popular film shot at double the usual rate, 48 FPS, a format called HFR (high frame rate). Unfortunately, not everyone liked the new look, for several reasons, chief among them the so-called "soap opera effect".
Most people's brains are trained to perceive 24 full frames per second as quality cinema, while 50-60 half-frames (an interlaced television signal) remind us of TV and destroy the "film look". A similar effect appears if you enable motion interpolation on your TV for 24p (progressive scan) material. Many people dislike it, even though modern algorithms render smooth motion with very few artifacts, and this association is the main reason critics reject the feature.
Although HFR significantly improves the picture (motion becomes less choppy and moving objects less smeared), making it pleasant to watch is not straightforward: it requires retraining the brain. Some viewers stop noticing any problems ten minutes into The Hobbit, while others simply cannot stand HFR.
Cameras and CGI: a motion blur story
But if 24 FPS is considered barely tolerable, why have you never left a cinema complaining about choppy video? It turns out that film cameras have a built-in feature (or bug, if you like) that CGI (including CSS animations!) lacks: motion blur, that is, the smearing of a moving object.
Once you have learned to see motion blur, its absence in video games and software becomes painfully obvious.
Motion blur, as defined on Wikipedia, is
... the apparent streaking of rapidly moving objects in a still image or a sequence of images such as a movie or animation. It occurs when the image being recorded changes during the recording of a single frame, either due to rapid movement or long exposure.
Here, a picture is worth a thousand words.
No motion blur
With motion blur
Images courtesy of Evans & Sutherland Computer Corporation, Salt Lake City, UT. Used with permission. All rights reserved.
Motion blur is a trick: it packs a lot of motion into a single frame at the cost of detail. That is the reason a 24 FPS movie looks relatively acceptable while a 24 FPS video game does not.
But where does motion blur come from in the first place? Here is how E&S, the first to bring 60 FPS to its giant dome screens, describes it:
When you shoot a movie at 24 FPS, the camera sees and records only part of the motion in front of the lens, and the shutter closes after each exposure while the film advances to the next frame. This means the shutter stays closed for about as long as it stays open. With fast motion and action in front of the camera, the frame rate cannot keep up, and the image blurs within each frame (because of the exposure time).
Classic film cameras use a rotary shutter (a spinning disc with a cut-out sector - translator's note) to capture motion blur. As the disc spins, its open sector exposes the film for a controlled period of time, and by changing the angle of that sector you change the exposure time. With a narrow shutter angle, less motion is recorded on film and the blur is weaker; with a wide angle, more motion is recorded and the effect is stronger.
A rotary shutter in action. Via Wikipedia.
If motion blur is such a good thing, why do filmmakers sometimes try to get rid of it? Well, adding motion blur costs you detail, and removing it costs you smoothness of motion. So when directors want to shoot a scene full of fine detail, such as an explosion with lots of flying particles or a complex action sequence, they often choose a narrow shutter angle, which reduces blur and produces a crisp, almost stop-motion look.
Visualization of how motion blur is captured. Via Wikipedia.
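The shutter-angle arithmetic above is easy to express numerically. Here is a minimal sketch (the function name is mine, not from the article): the exposure per frame is simply the fraction of the frame interval during which the shutter's open sector passes in front of the gate.

```python
def exposure_time_ms(shutter_angle_deg: float, fps: float) -> float:
    """Exposure per frame for a rotary shutter: the fraction of each
    frame interval during which the open sector lets light through."""
    frame_interval_ms = 1000.0 / fps
    return frame_interval_ms * (shutter_angle_deg / 360.0)

# The classic "180-degree" setting: at 24 FPS, a 180° shutter exposes
# each frame for half the frame interval, i.e. 1/48 s ≈ 20.8 ms.
print(round(exposure_time_ms(180, 24), 1))  # -> 20.8

# A narrower angle records less motion per frame, so blur is weaker.
print(round(exposure_time_ms(45, 24), 1))   # -> 5.2
```

A director choosing the "crisp explosion" look described above is, in these terms, shrinking `shutter_angle_deg` and accepting the resulting stroboscopic motion.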
So why not just add it?
Motion blur noticeably improves animation in games and on websites even at low frame rates. Unfortunately, it is expensive to implement. To create near-perfect motion blur, you would have to render roughly four times as many frames of the moving object and then apply temporal filtering, or anti-aliasing (here is a great explanation by Hugo Elias). If you have to render at 96 FPS anyway just to produce acceptable 24 FPS output, you might as well simply raise the frame rate, so this is often not an option for content rendered in real time. The exceptions are video games, where an object's trajectory is known in advance and an approximate motion blur can be computed, declarative animation systems such as CSS Animations and, of course, CGI films like Pixar's.
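To make the "render four times as many frames, then filter" idea concrete, here is a toy sketch of temporal supersampling (all names are mine; a real renderer does this per pixel in 2D): several crisp sub-frames rendered within one output frame's time window are averaged into a single blurred frame.

```python
def render_dot(position: int, width: int) -> list[float]:
    """One crisp sub-frame: a single bright dot on a 1D 'screen'."""
    frame = [0.0] * width
    frame[position] = 1.0
    return frame

def motion_blurred_frame(positions: list[int], width: int) -> list[float]:
    """Average several sub-frames taken within one output frame's
    exposure window -- naive temporal supersampling."""
    n = len(positions)
    acc = [0.0] * width
    for p in positions:
        sub = render_dot(p, width)
        acc = [a + s / n for a, s in zip(acc, sub)]
    return acc

# A dot crossing pixels 2..5 during one 24 FPS frame, captured with
# four sub-frames (i.e. rendering internally at 96 FPS):
print(motion_blurred_frame([2, 3, 4, 5], 8))
# -> [0.0, 0.0, 0.25, 0.25, 0.25, 0.25, 0.0, 0.0]
```

The dot's energy is smeared across its path, exactly the trade described above: one frame now conveys the motion, at the cost of the dot's sharp detail.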
60 Hz != 60 FPS: refresh rate and why it matters
Note: hertz (Hz) is the usual unit when talking about refresh rates, while frames per second (FPS) is the established term for frame-by-frame animation. To keep them apart, we will use Hz for refresh rate and FPS for frame rate.
If you have ever wondered why Blu-ray discs look so ugly on your laptop, the reason is that their frame rate does not divide evenly into the screen's refresh rate (DVDs, by contrast, are converted before transfer). Yes, refresh rate and frame rate are not the same thing. According to Wikipedia, "[..] the refresh rate includes the repeated drawing of identical frames, while the frame rate measures how often a video source can feed an entire frame of new data to a display." So the frame rate is the number of distinct frames sent to the screen, and the refresh rate is the number of times per second the image on the screen is updated or redrawn.
In the ideal case the refresh rate and the frame rate are fully synchronized, but in certain situations there are reasons to use a refresh rate that is, for example, three times the frame rate, depending on the projection system used.
Every display brings its own problems
Film projectors
Many people think that a film projector simply rolls the film past the light source. But then we would see a continuous blurred smear. Instead, just as in film cameras, a shutter separates the frames: after a frame is shown, the shutter closes and blocks the light until it opens again for the next frame, and the cycle repeats.
But that is not the whole story. With this setup you would indeed see a movie, but the flicker of a screen that stays dark 50% of the time would drive you crazy; those blackouts between frames would destroy the illusion. To compensate, projectors actually close the shutter two or three times per frame.
This seems counterintuitive: why does adding extra blackouts make us perceive fewer of them? The trick is to shorten each period of darkness, which affects the visual system disproportionately. The flicker fusion threshold (closely related to persistence of vision) describes the effect of these blackouts: at around ~45 Hz, blackout periods must take less than ~60% of the frame time, which is why the double-shutter technique works in cinema. Above 60 Hz, blackout periods can exceed 90% of the frame time (which displays like CRTs require). The full picture is somewhat more complicated, but in practice there are two ways to avoid flicker:
Use a display type with no blackouts between frames at all, one that shows each frame continuously.
Use consistent, unchanging blackout phases shorter than 16 ms.
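The multi-blade shutter trick boils down to simple arithmetic. A small sketch (names are mine, and the ~45 Hz threshold is the approximate figure from the text): each blade blanks the light once per frame, so the audience sees fps × blades flashes per second.

```python
FLICKER_FUSION_HZ = 45  # approximate fusion threshold from the text

def flash_rate_hz(fps: float, shutter_blades: int) -> float:
    """Each blade of the projector shutter blanks the light once per
    frame, so viewers see fps * blades light flashes per second."""
    return fps * shutter_blades

for blades in (1, 2, 3):
    rate = flash_rate_hz(24, blades)
    verdict = "visible flicker" if rate < FLICKER_FUSION_HZ else "looks steady"
    print(f"{blades} blade(s): {rate:.0f} Hz -> {verdict}")
```

A single-blade shutter at 24 FPS flashes at only 24 Hz, well below the fusion threshold; two or three blades push the flash rate to 48 or 72 Hz without adding a single new frame of film.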
Flickering CRT
CRT monitors and televisions work by firing electrons at a fluorescent screen coated with a short-persistence phosphor. How short is the persistence? So short that you never actually see a complete image! During the electron-beam scan, the phosphor lights up and fades again in under 50 microseconds - that is 0.05 milliseconds! For comparison, a full frame on your smartphone stays on screen for 16.67 ms.
A screen refresh captured at 1/3000 s exposure. From Wikipedia.
So the only reason a CRT works at all is persistence of vision. Because of the long dark gaps between phosphor flashes, CRTs often appear to flicker, especially in the PAL system, which runs at 50 Hz, as opposed to NTSC at 60 Hz, where the flicker fusion threshold already kicks in.
To complicate matters further, the eye does not perceive flicker equally across the whole screen. Peripheral vision, while sending a blurrier image to the brain, is more sensitive to brightness and has a much faster response time. That was probably very useful in ancient times for spotting a wild animal leaping at you from the side, but it is inconvenient when watching a CRT up close or at an odd angle.
Blurry LCDs
Liquid crystal displays (LCDs) belong to the class of sample-and-hold displays and are actually quite remarkable, because they have no blackouts between frames at all: the current image is shown continuously until a new one arrives.
Let me repeat that: LCDs have no refresh-induced flicker, regardless of the refresh rate.
But now you are thinking: "Wait a minute, I was shopping for a TV recently, and every manufacturer was touting, damn it, a higher refresh rate!" And although that is mostly pure marketing, LCDs with higher refresh rates do solve a problem - just not the one you think.
Eye-tracking motion blur
LCD manufacturers keep raising refresh rates because of perceived, on-screen motion blur. And indeed: not only can a camera record motion blur - your eyes can produce it too! Before explaining how it happens, here are two mind-blowing demos that let you experience the effect yourself (click the image).
In the first experiment, fix your gaze on the stationary alien at the top and you will clearly see white lines. Then follow the moving alien with your eyes, and the white lines magically disappear. From the Blur Busters website:
"Because of your eye movement, the vertical lines blur into thicker lines with every frame refresh, filling in the black gaps. Low-persistence displays (such as CRTs or LightBoost) eliminate this kind of motion blur, so this test looks different on those displays."
In fact, the eye's tracking of different objects can never be completely prevented; in film production it is often such a serious problem that there are people whose sole job is to predict what exactly the viewer's eye will track in a shot and to make sure nothing else distracts it.
In the second experiment, the folks at Blur Busters recreate the effect of a low-persistence display on an LCD simply by inserting black frames between the displayed frames, and surprisingly, it works.
As shown earlier, motion blur can be either a blessing or a curse: it trades sharpness for smoothness, and the blur added by your own eyes is always unwanted. So why is motion blur such a problem on LCDs compared to CRTs, where the question never arises? Here is an explanation of what happens when a frame captured in an instant lingers on screen longer than it should:
When a pixel is addressed, it is set to a certain value and holds that light output until it is addressed again. From the standpoint of rendering an image, this is wrong. A given sample of the original scene is valid only at one specific instant; after that instant, the objects in the scene have moved elsewhere. Holding the images of objects in fixed positions until the next sample arrives is incorrect: the object then appears to jump abruptly to a completely different place.
And the conclusion:
Your eye will try to smoothly track the motion of the object of interest, while the display holds it stationary for the entire frame. The inevitable result is a blurred image of the moving object.
There you have it! What we actually need is to flash the image onto the retina and then let the eye and the brain interpolate the motion themselves.
Bonus: so how much does our brain actually interpolate?
No one knows for sure, but there are certainly many situations in which the brain helps construct the final picture of what it is shown. Take, for example, this blind spot test: there is a blind spot at the point where the optic nerve joins the retina. In theory that spot should appear black, but in practice the brain fills it in with an image interpolated from the surrounding area.
Frames and screen refreshes don't always mix and match!
As mentioned earlier, problems arise when the frame rate and the refresh rate are out of sync, that is, when the refresh rate is not evenly divisible by the frame rate.
Problem: screen tearing
What happens when your game or application starts drawing a new frame to the screen while the display is in the middle of its refresh cycle? The frame literally gets torn apart:
Here is what happens behind the scenes. Your CPU/GPU does some work to compose a frame, then hands it to a buffer, where it waits for the monitor to trigger an update through the driver stack. The monitor then reads that frame and starts displaying it (double buffering is needed here, so that one image is always being shown while the next is being composed). Tearing occurs when the buffer currently being scanned out from top to bottom is replaced mid-scan by the next frame from the video card. As a result, the top of your screen comes from one frame and the bottom from another.
Note: to be precise, tearing can occur even when the refresh rate and the frame rate are equal! They must match not only in frequency but also in phase.
This is clearly not what we want. Fortunately, there is a solution!
Solution: Vsync
Screen tearing can be eliminated with Vsync, short for "vertical synchronization". It is a hardware or software feature that guarantees tearing never happens: your software is allowed to present a new frame only after the previous screen refresh has completed. Vsync times the buffer swap in the pipeline described above so that the image never changes in the middle of a refresh.
So if a new frame is not ready by the next screen refresh, the display simply shows the previous frame again. Unfortunately, this leads to the next problem.
New problem: jitter
Although our frames no longer tear, playback is still far from smooth. This time the culprit is a problem so notorious that every industry has its own name for it: judder, jitter, stutter, jank, hitching. Let's settle on the term "jitter".
Jitter occurs when an animation is played back at a frame rate different from the one at which it was shot (or was meant to be played). Most often it appears when the playback rate is unstable or variable rather than fixed, since most content is recorded at a fixed rate. Unfortunately, that is exactly what happens when you display, say, 24 FPS content on a screen that refreshes 60 times per second: because 60 is not evenly divisible by 24, some frames must stay on screen longer than others (unless more sophisticated conversion is used), which ruins smooth effects such as camera pans.
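The uneven frame-hold pattern for 24 FPS content on a 60 Hz display can be computed directly. A small sketch (function name is mine; it assumes the chosen refresh count divides evenly): each display refresh shows whichever source frame is current at that moment, so we can count how many refreshes each frame occupies.

```python
def refreshes_per_frame(fps: int, refresh_hz: int, n_frames: int) -> list[int]:
    """How many consecutive display refreshes each source frame stays
    on screen when the content is naively mapped onto the display."""
    counts = [0] * n_frames
    for refresh in range(n_frames * refresh_hz // fps):
        source_frame = refresh * fps // refresh_hz  # frame shown at this refresh
        counts[source_frame] += 1
    return counts

# 24 FPS on a 60 Hz display: frames alternate between being held for
# 3 and for 2 refreshes (the classic 3:2 pattern), so a smooth pan
# advances in uneven time steps and appears to judder.
print(refreshes_per_frame(24, 60, 8))  # -> [3, 2, 3, 2, 3, 2, 3, 2]

# 30 FPS on 60 Hz divides evenly: every frame is held the same time.
print(refreshes_per_frame(30, 60, 4))  # -> [2, 2, 2, 2]
```

The 3:2 alternation means consecutive frames are on screen for 50 ms, then 33 ms, then 50 ms, and so on, and it is that irregular rhythm the eye picks up as jitter.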
In games and on animation-heavy websites this is even more noticeable. Many systems cannot hold an animation at a constant rate that divides evenly into the refresh rate; instead, the frame rate fluctuates for all sorts of reasons: separate graphics layers animating independently of one another, handling of user input, and so on. It may shock you, but an animation capped at a steady 30 FPS looks much, much better than the same animation running at a rate that varies between 40 and 50 FPS.
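That last claim can be illustrated with a small simulation (all names and the sample frame times are mine): on a 60 Hz vsynced display, each finished frame appears at the next refresh, so we can compare the on-screen intervals of a locked 30 FPS animation against one fluctuating between 40 and 50 FPS.

```python
import math

def presentation_intervals(render_times_ms, refresh_ms=1000 / 60):
    """Quantize each frame's completion time to the next vsync and
    return the on-screen time between consecutive frames."""
    presented, t = [], 0.0
    for rt in render_times_ms:
        t += rt
        presented.append(math.ceil(t / refresh_ms) * refresh_ms)
    return [round(b - a, 1) for a, b in zip(presented, presented[1:])]

# Locked 30 FPS: every frame lands exactly two refreshes apart,
# a perfectly even cadence of ~33.3 ms steps.
print(presentation_intervals([33.3] * 5))

# 40-50 FPS (20-25 ms per frame): frames land one or two refreshes
# apart in an irregular pattern -- this unevenness is the jitter.
print(presentation_intervals([20, 25, 22, 24, 21]))
```

The steady animation produces identical intervals, while the nominally "faster" one alternates unpredictably between ~16.7 ms and ~33.3 ms holds, which is exactly why it looks worse.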