What is HDR10+? Everything to Know About the New HDR Format

HDR10+ Example

High Dynamic Range, more commonly referred to as HDR, is one of the most important new video technologies since the upgrade from standard definition to HD. But HDR comes in many flavors. You’ve probably heard terms like Dolby Vision, HDR10, HLG, or more recently, HDR10+. But what exactly is HDR10+? How can you get it? And perhaps most importantly, is it the best HDR format? We’re glad you asked! Below we’ll shed some much-needed light on all of these questions and more.

What is HDR?

Before we can dive into HDR10+, we need to quickly cover what HDR is. We’ve got a few fantastic deep dives on this technology that you can peruse at your leisure, but for the sake of a quick introduction: High Dynamic Range, as it pertains to TVs, allows for video and still images with much greater brightness and contrast, and better color accuracy, than was possible in the past. HDR works for movies, TV shows, and video games. Unlike increases in resolution (like 720p to 1080p), which aren’t always immediately noticeable — especially when viewed from a distance — great HDR material is eye-catching from the moment you see it.

HDR requires two things at a minimum: a TV that is HDR-capable, and a source of HDR video, like a 4K HDR Blu-ray disc and a compatible Blu-ray player, or an HDR movie on Netflix. Sometimes people think that 4K and HDR are the same thing, but that’s not the case. Not all 4K TVs can handle HDR, and some do it much better than others. That said, most new TVs support both 4K UHD and HDR.

But saying “HDR” is like saying “digital music”: There are several different types of HDR, and each has its own strengths and weaknesses.

What is HDR10?

Every TV that is HDR-capable is compatible with HDR10. It’s the minimum specification. The HDR10 format allows for a maximum brightness of 1,000 nits and a color depth of 10 bits. On their own, those numbers don’t mean much, but in context they do: Compared to regular SDR (Standard Dynamic Range), HDR10 allows for an image that is over twice as bright, with a corresponding increase in contrast (the difference between the blackest blacks and the whitest whites), and a color palette of over one billion shades, as opposed to the approximately 16 million of SDR.
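If you’re wondering where those “billion versus 16 million” figures come from, the arithmetic is simple: bit depth applies to each of a pixel’s three color channels, so the total palette is the per-channel shade count cubed. Here’s a quick back-of-the-envelope sketch in Python (the palette_size helper is ours, purely for illustration):

```python
# Back-of-the-envelope math behind the color figures above.
# Bit depth applies per color channel (red, green, blue),
# so the total palette is the per-channel shade count cubed.

def palette_size(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel   # shades available per channel
    return levels ** 3               # all red/green/blue combinations

print(f"SDR (8-bit):    {palette_size(8):,} colors")   # 16,777,216 (~16 million)
print(f"HDR10 (10-bit): {palette_size(10):,} colors")  # 1,073,741,824 (~1 billion)
```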

As with all HDR formats, how well HDR10 is implemented depends upon the quality of the TV on which you view it. Still, when utilized properly, HDR10 makes video content look really good. But HDR10 is no longer the top of the HDR food chain.


What is HDR10+?

As the name suggests, HDR10+ takes all of the good parts of HDR10 and improves on them. It increases the maximum brightness to 4,000 nits, which in turn increases contrast too. But the biggest difference is in how HDR10+ handles information. With HDR10, the “metadata” that is fed by the content source is static, which means there’s one set of values established for a whole piece of content, like an entire movie. HDR10+ makes this metadata dynamic, allowing it to change for each frame of video. This means every frame is treated to its own set of color, brightness, and contrast parameters, making for a much more realistic-looking image. Areas of the screen that might have been oversaturated under HDR10 display their full details with HDR10+.
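To picture the difference, think of static metadata as one set of instructions for the whole movie, and dynamic metadata as a fresh set for every scene or frame. The sketch below is purely conceptual; the field names are invented for illustration and don’t reflect the actual SMPTE metadata structures used by HDR10 (ST 2086) and HDR10+ (ST 2094-40):

```python
# Purely conceptual sketch of static vs. dynamic HDR metadata.
# Field names are invented for illustration only.

# HDR10: one static block describes the entire movie.
hdr10_static_metadata = {
    "max_content_luminance_nits": 1000,
    "max_frame_average_nits": 400,
}

# HDR10+: tone-mapping hints can change scene by scene (or frame by frame),
# so a dark cave and a sunlit beach each get their own targets.
hdr10_plus_dynamic_metadata = [
    {"scene": "night interior",  "max_luminance_nits": 120,  "hint": "lift shadow detail"},
    {"scene": "sunlit exterior", "max_luminance_nits": 3200, "hint": "preserve highlights"},
]
```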

There’s a catch, though. HDR10+ is a royalty-free format, but it was developed by a consortium of three companies: 20th Century Fox, Panasonic, and Samsung. As such, HDR10+ compatibility has so far been limited to TV models from Samsung and Panasonic. On the content side of the equation, there isn’t a lot of support for HDR10+ yet either, though that’s beginning to change. Netflix does not support the new format, but Amazon Prime Video does. In April 2019, Universal made a commitment to release both new and back-catalog titles in HDR10+, and 20th Century Fox is set to do the same. However, 20th Century Fox is now owned by Disney, which could affect its HDR10+ plans, because Disney has thrown its support behind Dolby Vision, a more established HDR format.

So what about Dolby Vision?

HDR10+ isn’t the only HDR format with ambitions of becoming the next king of the HDR castle. Dolby Vision is an advanced HDR format created by Dolby Labs, the same organization behind famous Dolby audio technologies like Dolby Digital and Dolby Atmos. Dolby Vision is very similar to HDR10+ in that it uses dynamic, rather than static, metadata, giving each frame its own unique HDR treatment. But Dolby Vision provides for even greater brightness (up to 10,000 nits) and more colors, too (12-bit depth, for a staggering 68 billion colors).
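Running the same back-of-the-envelope arithmetic with 12 bits per channel shows where that 68 billion figure comes from:

```python
# Same per-channel arithmetic as before, now with 12 bits per channel.
levels_12bit = 2 ** 12                   # 4,096 shades per channel
print(f"{levels_12bit ** 3:,} colors")   # 68,719,476,736 (~68.7 billion)
```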

For now, these specs are a bit moot: There are no 12-bit-capable TVs yet, and brightness of that caliber remains the stuff of prototypes. But both are certainly coming in the years ahead, and when they arrive, Dolby Vision will be ready. Unlike HDR10+, which only had its official launch in 2018 and has so far seen limited uptake by both content and hardware companies, Dolby Vision has been around for several years and enjoys wide industry support, which could help it end up as the advanced HDR format of choice in the long term. Part of the reason Dolby Vision is less abundant than HDR10 is that it’s a proprietary technology: Companies that wish to implement it in content or hardware must pay Dolby a licensing fee to do so. HDR10+, like its predecessor HDR10, is an open, royalty-free standard, which could see its adoption rate explode over the coming years, especially among budget TVs.

Oh no, not another format war!

Does the presence of competing HDR formats like HDR10+ and Dolby Vision mean we’re in for another format war? Not exactly. Unlike previous format battles, like Blu-ray vs. HD DVD, HDR formats are not mutually exclusive. This means there’s nothing stopping a movie studio from releasing a Blu-ray that contains HDR10, HDR10+, and Dolby Vision metadata on a single disc.

A TV that supports HDR can support multiple HDR formats, and many of today’s TVs do just that. The most common combo is HDR10 and Dolby Vision support on a single TV; however, we’re also just beginning to see the arrival of TVs that support all three, plus HLG, the version of HDR favored by digital TV broadcasters. It’s also possible that some TVs that shipped from the factory with support for just two formats — say, HDR10 and Dolby Vision — could be updated via a firmware upgrade to handle HDR10+.

Blu-ray players and media streamers can also support multiple HDR formats. The challenge is that even though supporting multiple formats is possible, very few TVs, playback devices, streaming video services, or Blu-ray discs actually do. This means that as consumers, we need to pay close attention to the labels to understand the capabilities of the devices and content we own, and the ones we plan on buying.

Many Blu-ray players, for instance, only offer support for HDR10, while some newer ones, like Sony’s UBP-X800M2, add Dolby Vision support. The same considerations apply to set-top streaming boxes. At the moment, the only streaming device we know of that can handle HDR10, HDR10+, and Dolby Vision is the Amazon Fire TV Stick 4K — not surprising, given that Amazon’s Prime Video service also supports all three formats. Others, like the Apple TV 4K, support HDR10 and Dolby Vision, but not HDR10+.

What do I need to get HDR10+?

To summarize, HDR10+ is a new format of HDR, which offers higher levels of brightness and contrast, plus more true-to-life colors and detail. To get it you’ll need:

  • A source of HDR10+ video, such as a Blu-ray movie or Amazon Prime Video (with more to follow)
  • A device that is capable of reading HDR10+ encoded material, like a compatible Blu-ray player or media streamer
  • A TV that is HDR10+ compatible (these may also have built-in apps that let you side-step the need for a playback device)

One more thing: If you’re using a media streamer or a Blu-ray player for your HDR10+ content, and it doesn’t connect directly to your TV (for example, if the signal passes through an A/V receiver or soundbar), every device and HDMI cable in the chain needs to pass the HDR10+ signal along intact. Ideally, use HDMI 2.1-certified gear and cables, or at minimum Premium High Speed cables rated for 18Gbps: a full 4K HDR signal demands more bandwidth than older HDMI cables were designed for, and dynamic-metadata formats like HDR10+ and Dolby Vision rely on newer HDMI capabilities that older equipment may not support.





