
Android - Live Wallpaper on OpenGL ES 2.0 with a Simplified DOF Effect

The idea for the application came from a video that was itself inspired by the visual design of Deus Ex: Human Revolution, specifically its elements with glass fragments. The final vision of the scene matured over a couple of days, and as a result the first version of the finished application was created over a weekend.


Building a scene


The scene is extremely simple. It consists of smoothly rotating glass fragments suspended in the air. The fragments are divided into two groups: those located near and those far away. Distant objects are blurred to add extra depth to the scene. There is also a blurred background of arbitrary color. The camera moves left and right following the finger on the screen, which makes the fragments shift accordingly.

Technically, this is implemented as follows: the glass fragments are placed along a cylinder around the camera. Shards located beyond a certain distance are drawn into a separate texture and blurred; the background is drawn into the same texture. The blurred distant objects are drawn first, and the fragments closer to the camera are then drawn on top of them.
In this simple way we implemented a simplified depth-of-field (DOF) effect, which is quite sufficient for this scene. A full DOF implementation is somewhat more complex and resource-intensive: it would require rendering into separate textures a depth map of the scene, the finished scene, and a blurred copy of it, and then drawing the sharp and blurred scenes on screen simultaneously, mixing them according to the depth map and the camera focus parameters.
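The compositing step of a full DOF pipeline could be sketched as a fragment shader like the one below. This is not from the app's source; the uniform names and the linear focus falloff are assumptions, shown here only to illustrate the mixing by depth that the simplified approach avoids.

```java
// Hypothetical full-DOF composite shader, stored as an Android-style Java
// string constant. It mixes the sharp and blurred scene textures per pixel,
// weighting by distance from the focus depth.
public final class DofShader {
    public static final String FRAGMENT =
        "precision mediump float;\n" +
        "uniform sampler2D uSharp;   // sharp scene\n" +
        "uniform sampler2D uBlurred; // blurred scene\n" +
        "uniform sampler2D uDepth;   // scene depth map\n" +
        "uniform float uFocusDepth;  // depth that is in focus\n" +
        "uniform float uFocusRange;  // range over which blur ramps up\n" +
        "varying vec2 vTexCoord;\n" +
        "void main() {\n" +
        "  float depth = texture2D(uDepth, vTexCoord).r;\n" +
        "  float blur = clamp(abs(depth - uFocusDepth) / uFocusRange, 0.0, 1.0);\n" +
        "  gl_FragColor = mix(texture2D(uSharp, vTexCoord),\n" +
        "                     texture2D(uBlurred, vTexCoord), blur);\n" +
        "}\n";
}
```

The simplified approach skips all of this: by splitting the shards into two fixed groups, the depth map and the per-pixel mix are no longer needed.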

Implementation


Since all objects in the scene are transparent, the entire rendering is done with various blending modes. At the same time, writes to the depth buffer are disabled so that the transparent shards do not clip the objects behind them. Since there are few shards, this does not cause too much pixel overdraw. To create highlights and reflections, the fragment objects are drawn with a 128x128 cubemap.
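The render state for the transparent shards might look like the following GLES20 sketch. The specific blend function is an assumption (the article only says different blending modes are used), and this is device-bound GL state setup rather than host-runnable code.

```java
import android.opengl.GLES20;

// Sketch of the render state for the transparent glass shards:
// blending on, depth writes off so shards do not clip objects behind them.
public final class GlassRenderState {
    public static void apply() {
        GLES20.glDepthMask(false);              // no depth writes for transparency
        GLES20.glEnable(GLES20.GL_BLEND);
        // Assumed blend mode; the app may use others per shard group.
        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    }
}
```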

Background rendering order:

1. The FBO is cleared via glClearColor with the desired color.
2. A mask is drawn over the entire framebuffer. This produces decorative colored blurred spots for the background instead of a solid color.
3. The distant glass shards are drawn. At a resolution of 256x256 the image is quite pixelated.
4. The entire background is blurred, which makes the low resolution almost imperceptible.
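The four steps above could be sketched as a single background pass. The helper names (drawMask(), drawFarShards()) are hypothetical stand-ins for the app's actual draw calls, and this requires a live GL context on a device.

```java
import android.opengl.GLES20;

// Hypothetical background pass: render the far plane into a low-res FBO.
public final class BackgroundPass {
    public static void render(int backgroundFbo, float r, float g, float b) {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, backgroundFbo);
        GLES20.glViewport(0, 0, 256, 256);       // low-resolution target
        // 1. clear the FBO to the desired background color
        GLES20.glClearColor(r, g, b, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        // 2. draw the full-frame mask for the decorative colored spots
        // drawMask();
        // 3. draw the distant glass shards
        // drawFarShards();
        // 4. blur the whole target with separable ping-pong blur passes
        // blur.apply(backgroundFbo);
    }
}
```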


Drawing the main scene and compositing the two planes:

1. Clear the screen.
2. Draw the background.
3. Draw the foreground glass shards.
This rendering is done without writing to the depth buffer, since all objects are transparent.
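The per-frame composition could be sketched as follows. Again, drawBackgroundQuad() and drawNearShards() are hypothetical names for the app's draw calls, and the code assumes a GL context on a device.

```java
import android.opengl.GLES20;

// Hypothetical per-frame composition: blurred far plane first,
// then the sharp near shards on top.
public final class MainScenePass {
    public static void render(int screenWidth, int screenHeight) {
        // 1. clear the screen (default framebuffer)
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glViewport(0, 0, screenWidth, screenHeight);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        // 2. draw the blurred background texture as a full-screen quad
        // drawBackgroundQuad();
        // 3. draw the near shards with blending on and depth writes off
        // drawNearShards();
    }
}
```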


Blurring objects in the background


Blur is implemented by repeatedly drawing the image between two framebuffers with a special shader that performs either horizontal or vertical blur, selected by a parameter. This technique is called ping-pong rendering: the texture from framebuffer A is first drawn into framebuffer B with horizontal blur, and then back from B to A with vertical blur. This procedure can be repeated for as many iterations as needed to achieve the required blur quality. The implementation of this post-processing effect was taken long ago from some bloom example; unfortunately, I can no longer find the link.
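The alternation described above can be modeled in plain Java, independent of GL. This sketch only tracks which buffer is read and written on each pass; the actual drawing is done by the blur shader on the GPU.

```java
import java.util.ArrayList;
import java.util.List;

// Ping-pong pass schedule: each full iteration blurs A into B horizontally,
// then B back into A vertically, so the result always ends up in A.
public final class PingPong {
    /** Returns one {source, destination, direction} triple per blur pass. */
    public static List<String[]> schedule(int iterations) {
        List<String[]> passes = new ArrayList<>();
        for (int i = 0; i < iterations; i++) {
            passes.add(new String[] {"A", "B", "horizontal"});
            passes.add(new String[] {"B", "A", "vertical"});
        }
        return passes;
    }
}
```

For example, schedule(3) yields six passes, matching the three-iteration setting the app settled on.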

It is noteworthy that modern phones and tablets (and even quite old devices) can perform not just one but several blur iterations quickly. In practice, the Nexus 10 produces a stable 40-50 fps even with 6-8 blur passes over a 256x256 texture, where one full pass means both a horizontal and a vertical blur.

Choosing a reasonable compromise between texture resolution, number of passes, and blur quality, we settled on three iterations at a resolution of 256x256.

Mali


In the previous article I took a jab at nVidia. This is not because I dislike nVidia in particular - I equally dislike any hardware manufacturer that ships buggy drivers for its GPUs. For example, while developing these live wallpapers we encountered a problem on the Nexus 10: incorrect rendering into a texture, which manifested only when the device orientation changed. How the tablet's orientation can affect render-to-texture is a mystery to us, but it is a fact.

First, to make sure I had not simply missed some nuance when initializing the context, I asked a question on Stack Overflow: stackoverflow.com/questions/17403197/nexus-10-render-to-external-rendertarget-works-only-in-landscape . Here it is worth praising the work of ARM's technical support. After a couple of days I received a letter from an ARM engineer suggesting that I report this bug on the Mali Developer Center forum. I prepared a simple test application and described the steps to reproduce the error: forums.arm.com/index.php?/topic/16894-nexus-10-render-to-external-rendertarget-works-only-in-landscape/page__gopid__41612 . After only 4 (!) days I received an answer confirming that there really is a bug in the current version of the Nexus 10 video driver. Most interestingly, ARM offered a workaround that miraculously solved my problem: simply call glViewport() after glBindFramebuffer(). ARM's support deserves a monument in their lifetime for such work - the engineer took the trouble to find my e-mail (which is not listed on Stack Overflow), and ARM's support engineers found and solved the problem faster than I had even expected.
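The workaround ARM suggested amounts to re-issuing the viewport every time the render target is bound, something like this (the helper name is ours; it assumes a GL context on the device):

```java
import android.opengl.GLES20;

// Workaround for the Mali driver bug on the Nexus 10:
// call glViewport() immediately after glBindFramebuffer().
public final class RenderTarget {
    public static void bind(int fbo, int width, int height) {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
        GLES20.glViewport(0, 0, width, height); // re-issue viewport every bind
    }
}
```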

Anyone interested in the quality of Android on the Nexus 10 is welcome to vote for the corresponding bug in the Google tracker: code.google.com/p/android/issues/detail?id=57391

Result



You can download the program from Google Play at the link: play.google.com/store/apps/details?id=org.androidworks.livewallpaperglass

The described simplified DOF technique can be applied not only to scenes with objects like those in our application, but in any case where the main scene can be separated from the background.

Source: https://habr.com/ru/post/191442/

