Hello! It just so happens that the modern industry rarely settles on a single standard, and even when it does, companies tend to supplement simple, well-understood technologies with proprietary crutches of their own. The last major “format war” was HD DVD vs Blu-ray, which the latter won. Now a new round of technology has rolled out new “guns”: one of the main trends in TV manufacturing in 2016 is support for HDR (high dynamic range).

What it is, how it works, how it differs from the HDR used in photography and games, and which formats are fighting this time around: all in today's post.
HDR: what it is and why it is needed
HDR (high dynamic range) is a technology for expanding the dynamic range of an image. The human eye adapts to the ambient light level. When shooting photo or video, the parameters of the optical system (the sensitivity of the film or sensor, the aperture and the shutter speed) determine how the frame is formed, and each recorded frame contains a limited range of brightness: from what is conventionally considered “black” (which may not be black in reality; the object simply did not reflect enough light to register a different shade) to what is considered “white” (likewise, an object can reflect or emit more light than the sensor can capture at the current settings without actually being white).

The difference between the brightest and darkest recordable shades is the dynamic range. If the scene has a greater dynamic range than the equipment can capture, the image will contain blown-out highlights and/or crushed shadows, that is, a loss of brightness and color information.
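Dynamic range is usually counted in “stops”, i.e. doublings of luminance. A quick sketch (the luminance values are illustrative, not from any particular camera or display):

```python
import math

def dynamic_range_stops(l_max: float, l_min: float) -> float:
    """Dynamic range as the base-2 log of the brightest/darkest luminance ratio."""
    return math.log2(l_max / l_min)

# A sunlit scene: ~10,000 cd/m2 highlights, ~0.1 cd/m2 shadows -> ~16.6 stops.
print(dynamic_range_stops(10_000, 0.1))
# A typical SDR display: ~100 cd/m2 peak, ~0.1 cd/m2 black -> ~10 stops.
print(dynamic_range_stops(100, 0.1))
```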
Expanding the dynamic range increases the range of brightness the camera can perceive, but at the same time it reduces image contrast. To restore contrast, various algorithms for local contrast enhancement and tone mapping are used.
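As a toy example, the simplest global operator of this kind is the Reinhard curve; real pipelines use far more sophisticated local operators, but the idea is the same:

```python
def reinhard_tonemap(luminance: float) -> float:
    """Global Reinhard operator: compresses [0, inf) scene luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

# Bright scene values are compressed heavily; shadows pass through almost unchanged:
for l in (0.05, 0.5, 2.0, 16.0):
    print(f"{l:6.2f} -> {reinhard_tonemap(l):.3f}")
```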
But if the problem of converting a high dynamic range to a normal one is solved at the recording stage, why do TVs need HDR, and how does it work there?
HDR in TV
Television HDR does essentially the same thing: it increases the number of representable brightness values between the darkest and the brightest shades, so a properly encoded video stream can display more distinct shades in a frame. The picture becomes more detailed, vivid and clear thanks to increased microcontrast in “borderline” zones that, in a regular video stream on a regular TV, would look flat or low-contrast and would be lost entirely in motion.
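The arithmetic here is simple: each extra bit doubles the number of code values available between black and peak white:

```python
# Number of distinct code values per channel at each bit depth,
# and the size of one quantization step as a share of the full range.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels, step = {100 / (levels - 1):.4f}% of full range")
```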
HDR support has to be provided in several places at once. One: the TV must be able to decode and display such a signal. Two: the content must be encoded appropriately and carry the necessary information. Three: the signal “transport” (the interface cable) must handle the increased bit rate. And finally, the content has to be shot, distributed and so on. In theory, compatibility across all of this diversity is ensured by the Rec. 2020 recommendations. Let's sort it all out in order.
Television
No, nobody is going to put a 30-bit panel into a TV; panels will remain 18- or 24-bit, and HDR content will be converted down to “normal” before being displayed. Instead, the new color space recording format uses special transforms that encode the luminance component with a larger number of bits. A dedicated algorithm then converts the original signal into one “correct” for the TV, so the picture gains detail without losing contrast.
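For the curious, here is a minimal sketch of that transform as HDR10 uses it: the PQ curve (SMPTE ST 2084) maps absolute luminance up to 10,000 cd/m2 onto a 10-bit signal, spending code values where the eye is most sensitive:

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m2 -> [0, 1] signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10_000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def to_10bit(signal: float) -> int:
    """Quantize the encoded signal to a full-range 10-bit code value."""
    return round(signal * 1023)

# Note how roughly half the code values go to the 0-100 cd/m2 range of classic SDR:
for nits in (0.1, 1, 100, 1000, 10_000):
    print(f"{nits:>7} cd/m2 -> code {to_10bit(pq_encode(nits))}")
```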
And here is the first problem. There are two HDR TV standards: HDR10 and Dolby Vision. Samsung and Sony bet on HDR10; most Hollywood studios and some other companies on Dolby Vision. And the Dolby brand itself is much better promoted. Modern content providers have solved the issue in their own way: Netflix and Amazon simply support both formats, letting you choose in which form to get an HDR movie. LG decided not to take part in the format war and provided support for both technologies.
Content and Media
Supporting HDR video requires an appropriate recording format. Among the “old” formats, H.264 supports Rec. 2020 (with some restrictions); among the “new” there is HEVC, a.k.a. H.265. Naturally, the “extra” information takes up more space, so 4k2k HDR content is so far delivered only online. Blu-ray 4K discs, and the only (at the time of writing) widely and relatively easily available player for them, are made by Samsung, and we already know which format that industrial giant bet on. Technically, nothing prevents implementing Dolby Vision support on Blu-ray 4K, but for the moment we have what we have.
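As an illustration of what “encoded appropriately” means in practice, here is a hypothetical encoding invocation (the file names are made up, and it assumes an ffmpeg build with libx265): HDR10 boils down to 10-bit HEVC with the PQ transfer function, BT.2020 color and static mastering metadata.

```python
import subprocess

cmd = [
    "ffmpeg", "-i", "master_4k.mov",        # hypothetical source file
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",              # 10-bit 4:2:0
    "-x265-params",
    "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:"
    # Mastering display primaries / white point / luminance (P3-D65 example values):
    "master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):"
    "max-cll=1000,400",                     # MaxCLL / MaxFALL in cd/m2
    "hdr10_output.mp4",
]
subprocess.run(cmd, check=True)
```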
Data transfer
The content exists, the TVs exist; it remains to get the content to the TV. Here HDMI reigns supreme (for now): TVs with DisplayPort support are few and far between. In theory, HDMI 2.0 supports 4k2k video with HDR, but with restrictions: 24/25/30 frames per second for full-color 12-bit images, and 50/60 frames per second only with 4:2:2 or 4:2:0 chroma subsampling.
At the moment HDMI 2.0 is the bottleneck of TV technology, and there is no point talking about a real breakthrough in picture quality until a standard appears that supports “honest” 4k2k at 120 Hz and 8k4k at 60 Hz. HDMI's problems stem from backward compatibility: HDMI can carry a signal compatible even with an analog black-and-white TV, which imposes certain restrictions on the standard's evolution.
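A rough back-of-envelope calculation shows where those restrictions come from. This sketch counts active pixels only and ignores blanking intervals; the 8b/10b coding overhead is why HDMI 2.0's 18 Gbit/s raw rate yields only about 14.4 Gbit/s of usable throughput.

```python
def active_bandwidth_gbps(width: int, height: int, fps: int,
                          bits_per_channel: int, chroma_factor: float) -> float:
    """Payload bandwidth of the active picture, in Gbit/s.
    chroma_factor: 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0."""
    return width * height * fps * bits_per_channel * chroma_factor / 1e9

print(active_bandwidth_gbps(3840, 2160, 30, 12, 3.0))  # ~9.0  -> fits in ~14.4
print(active_bandwidth_gbps(3840, 2160, 60, 12, 3.0))  # ~17.9 -> does not fit
print(active_bandwidth_gbps(3840, 2160, 60, 12, 1.5))  # ~9.0  -> fits with 4:2:0
```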
What to choose
If you are buying a TV in 2016 and want to get the most out of it, it makes sense to look at models with HDR support. For some reason manufacturers are shy about writing “HDR Ready” on the TVs themselves, so in the store look for the UltraHD Premium mark. Samsung has such TVs in the 9000 series (2016 models); at LG there are, for example, the flagship OLEDs such as the 65EG6.

Among disc players there are two solutions on the market: the Samsung UBD-K8500 (with HDR10 support) and the Panasonic DMP-UB900. Foreign reviewers strongly recommend the Panasonic: it has better picture quality and should (in theory) receive Dolby Vision support, but for now it only works with the discs that actually exist, and those are mastered in HDR10.

In practice, all these devices cost... a lot, and until HDR video formats become popular (and “trickle down” to 1080p resolution), prices are unlikely to fall and these features are unlikely to appear in “budget” TV sets.
In theory, the relatively budget-friendly Samsung KJ-7500 should also handle HDR (though only in FullHD, without 4k playback), but this has not yet been verified.
Meanwhile, Nvidia has just introduced its new Pascal desktop graphics cards, the GTX 1080 and 1070. They support the new HDR formats, so game developers can use “honest” HDR (rather than an imitation of it) in their products.

It is not yet known, however, when these capabilities will appear in games. But as soon as they do, you will be among the first to be able to appreciate them.
Any questions? Or perhaps you are one of the lucky ones already enjoying the benefits of HDR television? We await you in the comments!
