
Simulating dark adaptation of the eye in 3D, or HDR for dummies

Everybody knows the feeling of temporary blindness when you walk into a dark room from a bright one. A common misconception is that the sensitivity of vision is governed by the size of the pupil. In fact, the change in pupil area regulates the amount of incoming light by a factor of only about 25; the main role in adaptation is played by the retinal cells themselves.






To imitate this effect in games, a mechanism called tonemapping is used.


Tonemapping is the process of mapping the entire unbounded range of brightness (HDR, high dynamic range, from 0 to infinity) into the finite range that the eye / camera / monitor can reproduce (LDR, low dynamic range, bounded on both sides).
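As a minimal illustration of the idea (and not the exact operator used later in the article), the simplest Reinhard curve already performs this kind of mapping. The sketch below is a standalone C# helper with a made-up name:

public static class ToneMapSketch
{
    // The simplest Reinhard curve: maps any non-negative HDR luminance
    // into [0..1).  0 -> 0,  1 -> 0.5,  10 -> ~0.91,  very large -> ~1.
    public static float Simple(float hdrLuminance)
    {
        return hdrLuminance / (1.0f + hdrLuminance);
    }
}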



To work with HDR we need an appropriate screen buffer, one that can store values greater than one. Our task is then to correctly map those values into the range [0..1].
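A rough sketch of the Unity-side setup this assumes (the class name is made up for illustration, and the property names such as allowHDR are from recent Unity versions and may differ in older ones):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class HdrTonemapSetup : MonoBehaviour
{
    void Start()
    {
        // Render the scene into a floating-point buffer so that
        // values above 1 survive until the tonemapping pass.
        GetComponent<Camera>().allowHDR = true;
    }

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // 'source' holds HDR values here; the passes described below
        // are responsible for bringing them back into [0..1].
        Graphics.Blit(source, destination);
    }
}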







First of all, we need to estimate the overall brightness of the scene. To do this, we compute the geometric mean of the brightness of all pixels.



However, for our night scene this is not quite appropriate: most of the image is dark even when there is a bright light source in it, so the average brightness barely changes. Instead, we take the maximum brightness and divide it by two.
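In formula form (this matches the lAvg override in the main shader pass further below), the brightness we adapt to is

$$L_{\text{adapt}} = \max\!\left(\frac{L_{\max}}{2},\; L_{\text{limit}}\right),$$

where $L_{\text{limit}}$ (the _Limit parameter) acts as a floor that keeps the exposure from blowing up on an almost completely black frame.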



Let's shrink the picture to the nearest square with a power-of-two side and convert it to grayscale. Then we repeatedly halve it until a single pixel remains:



Downsampling



To shrink the image, at each step we take four neighboring pixels and keep their average (and, for our case, also the minimum and maximum). To compute the geometric mean quickly, we use the formula



$$\bar{L} = \exp\!\left(\frac{1}{N}\sum_{x,y}\ln\bigl(\delta + L(x,y)\bigr)\right)$$

where N is the number of pixels, L(x, y) is the luminance of a pixel and δ is a small constant (the 0.001 in the shader below).



Why the geometric mean? Because it "gravitates" toward the higher values, which means brighter pixels get more weight — and that is what we want, since we are interested in the light sources in the picture.
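For reference, here is a CPU-side sketch of the same log-average computation. It is not part of the effect itself, only something you could use to sanity-check the GPU result on a small test texture; the class and method names are made up, and the luminance weights are the standard Rec. 709 ones (the shader's Luminance() helper may use slightly different constants):

using System;
using UnityEngine;

public static class LuminanceReference
{
    // Geometric mean ("log-average") luminance of an array of pixels,
    // mirroring the shader: log(L + 0.001) per pixel, then exp of the mean.
    public static float LogAverage(Color[] pixels)
    {
        double sum = 0.0;
        foreach (Color c in pixels)
        {
            float lum = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
            sum += Math.Log(lum + 0.001);   // small offset avoids log(0)
        }
        return (float)Math.Exp(sum / pixels.Length);
    }
}

The Color[] array can come, for example, from Texture2D.GetPixels() on a readable copy of the frame.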




RenderTextureFormat rtFormat = RenderTextureFormat.ARGBFloat;

// Persistent buffer that holds the final (adapted) luminance grid
if (lumBuffer == null)
{
    lumBuffer = new RenderTexture(LuminanceGridSize, LuminanceGridSize, 0, rtFormat, RenderTextureReadWrite.Default);
}

// Grayscale + log of the source image, at the initial (power-of-two) resolution
RenderTexture currentTex = RenderTexture.GetTemporary(InitialSampling, InitialSampling, 0, rtFormat, RenderTextureReadWrite.Default);
Graphics.Blit(source, currentTex, material, PASS_PREPARE);

// Halve the texture until we reach the luminance grid size
int currentSize = InitialSampling;
while (currentSize > LuminanceGridSize)
{
    RenderTexture next = RenderTexture.GetTemporary(currentSize / 2, currentSize / 2, 0, rtFormat, RenderTextureReadWrite.Default);
    Graphics.Blit(currentTex, next, material, PASS_DOWNSAMPLE);
    RenderTexture.ReleaseTemporary(currentTex);
    currentTex = next;
    currentSize /= 2;
}




// Downsample pass
Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment fragDownsample

    float4 fragDownsample(v2f i) : COLOR
    {
        // Sample the four source texels that map onto this destination texel
        float4 v1 = tex2D(_MainTex, i.uv + _MainTex_TexelSize.xy * float2(-1, -1));
        float4 v2 = tex2D(_MainTex, i.uv + _MainTex_TexelSize.xy * float2( 1,  1));
        float4 v3 = tex2D(_MainTex, i.uv + _MainTex_TexelSize.xy * float2(-1,  1));
        float4 v4 = tex2D(_MainTex, i.uv + _MainTex_TexelSize.xy * float2( 1, -1));

        float mn  = min(min(v1.x, v2.x), min(v3.x, v4.x));   // running minimum (R)
        float mx  = max(max(v1.y, v2.y), max(v3.y, v4.y));   // running maximum (G)
        float avg = (v1.z + v2.z + v3.z + v4.z) / 4;          // running log-average (B)

        return float4(mn, mx, avg, 1);
    }
    ENDCG
}

// Prepare pass
Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment fragPrepare

    float4 fragPrepare(v2f i) : COLOR
    {
        // Collapse the color to a single luminance value, then take its log
        float v = Luminance(tex2D(_MainTex, i.uv).rgb);
        float l = log(v + 0.001);   // small offset avoids log(0)
        return float4(l, l, l, 1);
    }
    ENDCG
}




Note that when taking the logarithm of the source image we add a small constant, to avoid the collapse of the universe (log(0)) on a completely black pixel.



At each reduction step, the minimum (R), maximum (G) and log-average (B) brightness values are stored in our texture.



Next comes a small trick that avoids reading the texture back to the CPU and keeps the eye "adaptation" entirely on the GPU: we create a persistent 1×1-pixel texture, and every frame we blend the newly measured brightness (also 1×1) onto it with a small alpha (transparency). The stored brightness value thus gradually converges to the current one, which is exactly what we need.
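This is just an exponential moving average: if the blend alpha per frame is $a$, the stored value evolves as

$$L_{\text{stored}}^{(t+1)} = (1 - a)\,L_{\text{stored}}^{(t)} + a\,L_{\text{new}},$$

so the gap between the stored and the current brightness shrinks by a factor of $(1 - a)$ every frame.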



if (!lumBuffer.IsCreated())
{
    Debug.Log("Luminance map recreated");
    lumBuffer.Create();
    // First frame (or the buffer was lost): copy the measured brightness
    // in directly, without blending
    Graphics.Blit(currentTex, lumBuffer);
}
else
{
    material.SetFloat("_Adaptation", AdaptationCoefficient);
    Graphics.Blit(currentTex, lumBuffer, material, PASS_UPDATE);
}




AdaptationCoefficient is a coefficient on the order of 0.005 that determines how quickly the eye adapts to the brightness.
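Assuming the update pass is the plain alpha blend described above, that number translates into a time scale like this: the remaining gap halves after $n$ frames, where $(1 - 0.005)^n = 0.5$, i.e. $n = \ln 0.5 / \ln 0.995 \approx 138$ frames, or roughly 2.3 seconds at 60 fps.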



All that remains is to take our two textures (the original image and the brightness) and adjust the exposure of the first using the value from the second.



material.SetTexture("_LumTex", lumBuffer);
material.SetFloat("_Key", Key);
material.SetFloat("_White", White);
material.SetFloat("_Limit", Limit);
Graphics.Blit(source, destination, material, PASS_MAIN);




// Main pass
Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag

    float4 frag(v2f i) : COLOR
    {
        half4 cColor = tex2D(_MainTex, i.uv);
        float4 cLum  = tex2D(_LumTex, i.uv);

        // Undo the logarithm stored in the adaptation texture
        float lMin = exp(cLum.x);
        float lMax = exp(cLum.y);
        float lAvg = exp(cLum.z);
        lAvg = max(lMax / 2, _Limit);   // force override for dark scene

        float lum    = max(0.000001, Luminance(cColor.rgb));
        float scaled = _Key / lAvg * lum;

        // Reinhard-style curve with a white-point correction
        scaled *= (1 + scaled / _White / _White) / (1 + scaled);

        return scaled * cColor;
    }
    ENDCG
}




Here we restore the brightness value from the logarithm, calculate the scaling factor (scaled), and make a correction for the white level (_White).
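Written out (with $key$ = _Key, $L_{\text{white}}$ = _White, $\bar{L}$ the adaptation brightness discussed above and $L$ the pixel's own luminance), the main pass computes per pixel

$$L_{\text{scaled}} = \frac{key}{\bar{L}}\,L, \qquad L_d = L_{\text{scaled}}\,\frac{1 + L_{\text{scaled}}/L_{\text{white}}^2}{1 + L_{\text{scaled}}},$$

and the output color is the input color multiplied by $L_d$.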



Parameters used:



Result:







You can get an interesting result by not reducing the brightness texture all the way down to one pixel, but stopping a few steps earlier (i.e. increasing LuminanceGridSize). Then different areas of the screen will adapt independently. In addition, a "dark spot" effect appears when one cell of the grid is blown out, for example when you look directly at a lamp. In real life, however, the brain automatically hides this kind of glare, so on a monitor it looks unnatural and unusual.



You can read more about daytime tonemapping in Reinhard's work.

Code

Shader

Source: https://habr.com/ru/post/165669/


