Optimizing a project for virtual reality calls for its own approach. Beyond the usual concerns of 3D game development, VR imposes a number of strict constraints. A VR application must not only respond almost instantly to every movement of the player (whether a head turn or a wave of the hand), but also maintain a stable frame rate far above what "classic" games of any genre require.
Modern headsets such as the Oculus Rift and HTC Vive support special technologies designed to smooth out performance drops. They are meant to compensate for FPS dips by artificially raising the frame rate, improve the user experience, give developers of finished products some extra headroom, and lower the minimum system requirements. But is everything so rosy in practice? How do these technologies work, and how do they differ? That is what this article is about.
While 30 FPS is often enough in ordinary games and 60 is already considered excellent, modern VR devices require a minimum of 90 frames per second (Sony's PlayStation VR accepts 60 FPS on the PS4, but 90 is still the official recommendation). Just as important, the frame rate must stay stable for the entire session for the user to remain comfortable in the headset. A brief drop to, say, 30 FPS in a normal game on a monitor or TV would most likely go completely unnoticed, but for a person in VR it destroys the sense of immersion: the picture immediately starts to stutter and lag behind head turns, and discomfort up to motion sickness and nausea is all but guaranteed. So how can these negative effects be at least partially eliminated?
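To put these numbers in perspective, here is a tiny illustrative calculation (plain C++, not tied to any SDK) of the time budget a single frame has at each target rate:

```cpp
#include <cstdio>

int main() {
    // Per-frame time budget at typical target frame rates.
    for (double fps : {30.0, 60.0, 90.0}) {
        std::printf("%5.1f FPS -> %5.2f ms per frame\n", fps, 1000.0 / fps);
    }
    // 30 FPS leaves ~33.3 ms per frame, 60 FPS ~16.7 ms, and 90 FPS only ~11.1 ms,
    // which is why VR leaves so little room for an occasional expensive frame.
}
```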
Interleaved Reprojection and Synchronous Timewarp
First, a few words about vertical synchronization, which is closely related to this topic. In the simplest mode, the game engine produces one frame per iteration of its main loop and displays it on screen. Since the complexity of the scene and the game logic varies, the time per iteration varies too, and with it the frame rate. When that rate does not match the display's refresh rate, the picture often starts to "tear". To combat this, vertical synchronization (V-Sync) is used: the engine presents a new frame only on a sync pulse whose frequency matches the display's refresh rate (typically 60 Hz).
Thus the maximum frame rate is 60, but if preparing a frame takes longer than 1/60 of a second, the next frame is presented only on every second pulse and the FPS drops by exactly half, to 30 frames per second, and so on down the divisors of the refresh rate.
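As a rough illustration (a simplified model, not how any particular driver is written), the effective frame rate under V-Sync is the refresh rate divided by the number of sync intervals a frame needs:

```cpp
#include <cmath>
#include <cstdio>

// Effective FPS under V-Sync on a display with the given refresh rate:
// a frame that misses a sync pulse waits for the next one, so presentation
// happens every ceil(frameTime / refreshInterval) pulses.
double effectiveFps(double frameTimeMs, double refreshHz) {
    const double intervalMs = 1000.0 / refreshHz;
    const double pulsesPerFrame = std::ceil(frameTimeMs / intervalMs);
    return refreshHz / pulsesPerFrame;
}

int main() {
    for (double ft : {10.0, 16.0, 17.0, 25.0, 34.0}) {
        std::printf("frame time %5.1f ms -> %5.1f FPS\n", ft, effectiveFps(ft, 60.0));
    }
    // 16 ms still fits into one 16.67 ms interval -> 60 FPS;
    // 17 ms just misses it -> 30 FPS; 34 ms needs three intervals -> 20 FPS.
}
```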
In VR projects, as mentioned above, this approach to synchronization is not enough on its own, because a sustained high frame rate is required. But the engine may simply fail to produce 90 images per second, so what then? This is where a trick called interleaved reprojection comes in. As soon as the system detects that the FPS has dropped below 90, it forcibly halves the game's frame rate to 45 and starts synthesizing the intermediate frames itself, taking the previous image as a basis and rotating it in 2D by the angle the player's head has turned in the meantime.
Thus, while the game generates 45 frames per second, 90 are still shown on the screen. As soon as the engine can again produce 90 frames per second on its own, reprojection is automatically switched off.
It is important to note that only head rotation is compensated this way, not movement through space. Translation cannot be reproduced by such 2D transforms because of the parallax it creates. But head rotation usually happens far more often, and faster, than translation, so the method is still quite effective.
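Below is a deliberately simplified, self-contained sketch of this logic, not actual SteamVR or Oculus runtime code: the head pose is reduced to a single yaw angle, and "reprojection" merely records the rotation delta that would be applied to the previous image.

```cpp
#include <cstdio>

struct Pose { float yaw; };           // head orientation, simplified to yaw only
struct Frame { Pose renderedPose; };  // color/depth buffers omitted

// Stand-in for rotation-only reprojection: re-display the last frame, warped
// by the yaw change since it was rendered. Only rotation is compensated, which
// is why head *movement* still produces visible error.
Frame reprojectRotationOnly(const Frame& last, const Pose& now) {
    std::printf("  warp by %.1f deg of yaw\n", now.yaw - last.renderedPose.yaw);
    return Frame{now};
}

int main() {
    bool engineKeepsUp = false;                    // pretend the engine dropped below 90 FPS
    bool interleaved = false;
    Frame lastFrame{};
    for (int vsync = 0; vsync < 6; ++vsync) {
        Pose head{vsync * 2.0f};                   // the head keeps turning every interval
        if (!engineKeepsUp) interleaved = true;    // runtime throttles the engine to 45 FPS
        if (interleaved && vsync % 2 == 1) {
            std::printf("vsync %d: synthetic frame\n", vsync);
            lastFrame = reprojectRotationOnly(lastFrame, head);
        } else {
            std::printf("vsync %d: engine frame\n", vsync);
            lastFrame = Frame{head};               // engine had two full intervals to render
        }
    }
}
```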
This was the first approach implemented for both the Oculus Rift and the HTC Vive. In SteamVR it is called Interleaved Reprojection; in the Oculus SDK, Synchronous Timewarp.
Its biggest drawback is that at the moment reprojection switches on or off, the user notices a distinct hitch and a change in the smoothness of the picture. Moreover, even a slight dip in game performance immediately halves the real frame rate, and no matter how well the 2D transform is implemented, it cannot match the accuracy and smoothness of frames rendered by the engine itself. That is why both SDKs later improved on this method.
Asynchronous Reprojection and Timewarp
Oculus was the first to implement this approach, under the name Asynchronous Timewarp (ATW); it has been the default for the Oculus Rift since its official release. The idea is the same, padding the sequence of frames to maintain a stable 90 FPS, but the padding itself is moved into a separate thread, independent of the main render. Immediately before each sync pulse, ATW sends to the screen either a frame the engine has finished, or a reprojection of the previous frame if the new one is not yet ready. Thus, two scenarios are possible (a rough simulation follows the list):
- the engine rendered the frame in time: that frame is shown on screen as is and goes into the ATW buffer;
- the engine did not finish in time: the unfinished frame is held over until the next V-Sync pulse, and a 2D reprojection of the previous frame from the ATW buffer is shown instead.
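Here is a simplified, single-threaded simulation of that decision (the real ATW runs on a separate high-priority GPU context, and the frame times below are made up): at every sync pulse the compositor checks whether the engine's in-flight frame has finished and, if not, reprojects the buffered one.

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng{42};
    std::uniform_real_distribution<double> frameTime(8.0, 15.0);  // hypothetical engine frame times, ms
    const double vsyncMs = 1000.0 / 90.0;                         // ~11.11 ms between sync pulses

    double pendingWork = 0.0;   // rendering time left on the engine's in-flight frame
    int engineFrames = 0, warpedFrames = 0;

    for (int pulse = 0; pulse < 20; ++pulse) {
        if (pendingWork <= 0.0)
            pendingWork = frameTime(rng);   // engine starts rendering a new frame
        pendingWork -= vsyncMs;             // time that has passed by this sync pulse

        if (pendingWork <= 0.0) {
            ++engineFrames;                 // finished in time: present it and keep it buffered
            std::printf("pulse %2d: engine frame\n", pulse);
        } else {
            ++warpedFrames;                 // not ready: present a reprojection of the buffered
            std::printf("pulse %2d: ATW frame\n", pulse);  // frame; the unfinished one waits
        }
    }
    std::printf("engine frames: %d, reprojected frames: %d\n", engineFrames, warpedFrames);
}
```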
This makes it possible not to throttle the engine artificially: it generates frames at whatever rate it can manage, while ATW acts as a safety net, compensating for performance drops with reprojection exactly when it is needed. As a result, there is no noticeable jerk when reprojection kicks in or out, far fewer "useful" engine frames are discarded, and overall the experience feels much more natural.
For game developers, supporting this technology costs nothing: it simply works with any VR application. On the side of the SDK, the OS, and the GPU vendors, however, implementing it took considerable joint effort. Modern GPU hardware and software are optimized for very high throughput, not for the precise frame timing that ATW requires. Oculus had to work closely with Microsoft, NVIDIA, and AMD to adapt GPU microcode and kernel-mode drivers to better support ATW. These extensions are AMD's LiquidVR and NVIDIA's VRWorks.
At the end of 2016, Valve presented its own version of this method, called Asynchronous Reprojection. According to the company itself, however, the results were not so clear-cut: its research showed that, even though the distinct jerks and hitches at reprojection switches disappear, the occasional randomly occurring "false" frames produced by 2D transforms are also perceived by some players as an annoyance. It is all quite individual, so the feature can be turned off in the SteamVR settings, falling back to classic synchronous reprojection.
Asynchronous Spacewarp
As already mentioned, all the methods described so far can pad the sequence of frames while compensating only for rotations of the player's head. But what if the player also moves to the side? The viewpoint changes, and because of parallax, plain 2D transforms are simply not enough to generate convincing intermediate frames. As a result, if the player actively moves their head at the moment the FPS dips, the whole "world" around them twitches and shakes, destroying the sense of immersion. To solve this problem, Oculus developed another technology that works on top of asynchronous timewarp and called it Asynchronous Spacewarp (ASW).
Unlike timewarp, the generation of "intermediate" frames here takes into account the movement of the character, the camera, the VR controllers, and the player. ASW handles the parallax created when the viewpoint shifts and near objects move relative to distant ones. For this it uses the depth buffer of the last rendered frame, from which ASW can tell which objects lie in the near plane and which in the far plane. As a result, an "intermediate" frame can not only shift the viewpoint according to the player's movement, but also simulate the motion of some objects relative to others.
Of course, the system cannot know what lies behind areas that were occluded by nearby objects, so those zones are, in effect, generated from the surroundings by "stretching" them, as the illustration below shows.
ASW is inseparable from ATW, and each does its own job: timewarp is ideal for the static environment at some distance from the viewer, while spacewarp takes care of moving objects near the player.
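Below is a toy, one-dimensional model of what depth-aware reprojection does, purely for intuition (real ASW works on full 2D frames on the GPU and also uses motion data): pixels shift in proportion to the head movement and in inverse proportion to their depth, and the holes that open up behind near objects are filled by stretching neighbouring pixels.

```cpp
#include <cstdio>
#include <vector>

// 'A' and 'B' are near objects (depth 1), '.' is distant background (depth 10).
// Each pixel shifts by headShift / depth, so near objects slide further than
// the background (parallax). A depth test keeps the nearest pixel when several
// land in the same place; holes are filled by stretching the previous pixel,
// which is roughly where ASW's disocclusion artifacts come from.
int main() {
    std::vector<char>  color = {'A', 'A', '.', '.', '.', 'B', 'B', '.'};
    std::vector<float> depth = {1.0f, 1.0f, 10.0f, 10.0f, 10.0f, 1.0f, 1.0f, 10.0f};
    const float headShift = 1.0f;                        // lateral head movement, arbitrary units

    std::vector<char>  out(color.size(), '?');
    std::vector<float> outDepth(color.size(), 1e9f);
    for (size_t x = 0; x < color.size(); ++x) {
        size_t nx = x + static_cast<size_t>(headShift / depth[x] + 0.5f);  // near pixels move more
        if (nx < out.size() && depth[x] < outDepth[nx]) {
            out[nx] = color[x];
            outDepth[nx] = depth[x];
        }
    }
    for (size_t x = 0; x < out.size(); ++x)
        if (out[x] == '?') out[x] = (x > 0 ? out[x - 1] : '.');            // crude hole filling

    std::printf("source:      %.*s\n", (int)color.size(), color.data());
    std::printf("reprojected: %.*s\n", (int)out.size(), out.data());
}
```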
It is also worth noting that ASW is supported only on Windows 8 and later, and the best implementation is available only on Windows 10, where Microsoft did additional work in this direction. An equivalent technology for the HTC Vive has not yet been announced.
Technology Summary
Reprojection, ATW, and ASW certainly make life easier for both developers and end users, bringing virtual reality closer to real life and smoothing over the shortcomings of 3D rendering, but they should not be treated as a panacea. In theory it may seem that these methods lower the performance bar to 45 frames per second or even less, but in practice that is not always the case. Depending on what kind of virtual world the player is in, what happens there, and how they move, any method of artificially "padding" the video stream can produce artifacts, jerks, and picture jitter that the user senses, often subconsciously.
From the user's perspective: if you are just starting out with a VR headset, whether an Oculus Rift or an HTC Vive, leave all of these assistance systems enabled by default. This is especially important if your system is not the most powerful. Most likely this is the configuration in which your immersion will feel best, and the range of content you can run will be widest. However, everyone is different, and with VR this is doubly true: what is comfortable for one person can be permanently off-putting for another. So if you notice that everything seems smooth yet something occasionally feels off, something does not move quite the way it should, or motion sickness creeps in, try disabling ASW first, then ATW, falling back to classic reprojection. That may be exactly what you need.
The main advice for developers remains the same: do not count on modern reprojection methods to let you pay less attention to optimization or to solve every frame-rate stability problem for you. 90 FPS is still the "gold standard" to aim for, with headroom left for weaker systems and the most demanding moments of the game. Still, it is nice to know that when something unforeseen happens, these technologies will smooth out the sharp corners for your users, hide a few rough edges, and let them fully enjoy their immersion in virtual worlds.