First it was HDR 10 from the UHD Alliance, then Dolby Vision from Dolby Laboratories, then Advanced HDR by Technicolor and HLG (Hybrid Log Gamma) from NHK and the BBC; now there is an exciting new HDR format called HDR 10+. Essentially this format is based on HDR 10, but it brings some improvements to deliver a better HDR experience than its predecessor. HDR 10+ was first introduced by Samsung and Amazon in April 2017, and it has since received the support of several big names such as Panasonic, 20th Century Fox, and others. So what improvements does HDR 10+ usher in compared to HDR 10, and how does it rank against the other technologies?
HDR 10+ vs HDR 10
As we mentioned, HDR 10+ is essentially based on the HDR 10 format. HDR 10 is the most popular HDR format and has been adopted by virtually all manufacturers and content providers that support High Dynamic Range (HDR), but the technology still has some limitations. For remapping color and brightness, HDR 10 uses static metadata sent once at the beginning of the video, which limits the information the HDR device receives. The metadata is the information that tells a receiver how the HDR content should be shown. Because that information is limited, the tone mapping applies the same contrast, gradation, brightness, and color enhancement across the entire piece of content, which can result in less precise remapping of color and brightness and some unwanted deviations in the HDR images shown. This is where HDR 10+ comes in: it remaps the color and brightness of HDR images more precisely, resulting in a better HDR experience than standard HDR 10.
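To make the limitation concrete, here is a minimal sketch in Python (purely illustrative, not the actual HDR 10 pipeline or any real TV firmware): with static metadata, a single tone-mapping curve derived from one content-wide peak value is applied to every scene, dark or bright alike.

```python
# Purely illustrative sketch of static-metadata tone mapping (not the real HDR 10 spec):
# one content-wide peak value produces one curve, applied to every scene the same way.

def static_tone_map(pixel_nits, content_max_nits=4000, display_max_nits=1000):
    """Simple scale-to-display roll-off based on the whole film's peak (hypothetical curve)."""
    return min(pixel_nits, content_max_nits) / content_max_nits * display_max_nits

# The same curve handles a dim interior and a sunlit exterior:
print(static_tone_map(200))    # a 200-nit highlight in a dark scene lands at just 50 nits
print(static_tone_map(3500))   # a 3500-nit highlight lands at 875 nits, near display peak
```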
HDR 10+ uses the same baseline specifications as HDR 10: 10-bit color depth, the DCI-P3 color space, and 4000 nits of peak brightness with a current target of 1000 nits. Even so, HDR 10+ improves how the metadata is sent. For remapping colors, black level, and brightness, HDR 10+ no longer uses static metadata sent at the beginning of the video; instead, it uses dynamic metadata sent scene by scene. This allows the television to adjust its tone-mapping curve from scene to scene more precisely, which makes the HDR picture on screen look more true-to-life and closer to the original intent of the content creator. The technique HDR 10+ uses to send metadata is reminiscent of the one used by Dolby Vision.
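By contrast, a per-scene version of the same hypothetical sketch shows what dynamic metadata buys you: each scene carries its own peak-brightness value, so the tone-mapping curve is rebuilt scene by scene instead of once for the whole film (again, an assumption-laden illustration, not Samsung's actual algorithm).

```python
# The same hypothetical sketch, but with per-scene (dynamic) metadata: each scene
# declares its own peak, so the curve adapts scene by scene.

def dynamic_tone_map(pixel_nits, scene_max_nits, display_max_nits=1000):
    """Roll-off scaled to this scene's peak instead of the whole film's."""
    return min(pixel_nits, scene_max_nits) / scene_max_nits * display_max_nits

scenes = [
    {"name": "dim interior",    "scene_max_nits": 300,  "highlight": 200},
    {"name": "sunlit exterior", "scene_max_nits": 3500, "highlight": 3500},
]
for scene in scenes:
    mapped = dynamic_tone_map(scene["highlight"], scene["scene_max_nits"])
    print(f'{scene["name"]}: {scene["highlight"]} nits -> {mapped:.0f} nits')

# The dim interior's 200-nit highlight now maps to roughly 667 nits instead of 50,
# so detail in dark scenes is preserved far better than under the single static curve above.
```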
HDR 10+ vs Dolby Vision
Both formats use dynamic metadata to remap colors and brightness levels, so what is the difference between Dolby Vision and HDR 10+? As mentioned above, HDR 10+ is a royalty-free HDR format developed by Samsung and other big-name companies as an enhancement of the HDR 10 standard. On the other hand, Dolby Vision is a proprietary HDR format developed by Dolby Laboratories. Although both formats use dynamic metadata to make the HDR picture look true-to-life, how that metadata is created differs. According to Dolby’s SVP of Consumer Entertainment, all Dolby Vision metadata is created by hand by colorists and editors at the movie studio, while HDR 10+ metadata is generated by an algorithm.
Meanwhile, building on the HDR 10 standard, HDR 10+ still uses 10-bit color depth while Dolby Vision uses 12-bit color depth. Although few TVs support 12-bit color, Dolby claims the signal can be down-sampled in a way that renders 10-bit color more accurately. As for color gamut coverage, HDR 10+ targets the DCI-P3 color space while Dolby Vision targets the Rec. 2020 color space. The other difference between these formats is peak brightness. Dolby Vision already supports 10,000 nits of peak brightness with a current target of 4000 nits, while HDR 10+, like its predecessor HDR 10, supports 4000 nits of peak brightness with a current target of 1000 nits. In theory, with these higher specifications, Dolby Vision should be able to deliver a better HDR picture than HDR 10+. The catch is that, for now, no TV fully meets either specification.
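If it helps to see what the bit-depth gap actually means, the arithmetic is simple binary math and not tied to either format's spec; the snippet below just counts gradation steps per channel and the resulting color combinations.

```python
# Plain arithmetic behind the bit-depth comparison (not tied to either format's spec):
for bits in (10, 12):
    shades = 2 ** bits            # gradation steps per color channel
    colors = shades ** 3          # combinations across R, G and B
    print(f"{bits}-bit: {shades:,} shades per channel, about {colors / 1e9:.1f} billion colors")

# 10-bit: 1,024 shades per channel, about 1.1 billion colors
# 12-bit: 4,096 shades per channel, about 68.7 billion colors
```

In practice, the main benefit of those extra shades is smoother gradients with less visible banding, especially in very bright or very dark parts of the picture.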
Conclusion
We are actually seeing a “format war” between some of the leading tech companies, in which there will eventually be a winner and a loser, though don’t rule out the possibility that they simply coexist. Essentially, each wants to deliver a better HDR experience to consumers. Who wins is not as important to us as it is to consumers; the point here is simply to understand the characteristics and specs of each HDR format on the market. Keep in mind that every TV we know of that supports Dolby Vision also supports HDR 10, but not every HDR TV that supports HDR 10 also supports Dolby Vision. At this time, Samsung aside, only mid- to upper-range TVs support Dolby Vision. Why are there no Samsung TVs that support Dolby Vision? It seems Samsung is reluctant to pay Dolby’s licensing fee for its TVs.
So Samsung backs HDR 10+, while other manufacturers like LG and Sony still focus on Dolby Vision and have not joined the HDR10+ Alliance (a group of manufacturers that share data points and technology advancements of HDR 10+ with one another). As for which will win between HDR 10+ and Dolby Vision, we’ll just have to wait and see how it plays out. But do not rule out the possibility that HDR 10+ and Dolby Vision coexist and that TVs support both. If that happens, it is certainly an advantage for all of us as consumers, since we can enjoy movies with more HDR format options.