[Originally published on March 20, 2016 by Larry Jordan.]
There’s only one big thing still holding back HDR: we can’t display it.
I’ve been giving a lot of thought to HDR (High-Dynamic Range) video recently. And, as we run up to NAB next month, I wanted to share some of these thoughts with you.
First, though, some background.
Wikipedia: “High-dynamic-range imaging (HDRI or HDR) is a technique used in imaging and photography to reproduce a greater dynamic range of luminosity [gray-scale] than is possible with standard digital imaging or photographic techniques. The aim is to present the human eye with a similar range of luminance as that which, through the visual system, is familiar in everyday life. The human eye, through adaptation of the iris and other methods, adjusts constantly to the broad dynamic changes ubiquitous in our environment. The brain continuously interprets this information so that most of us can see in a wide range of light conditions. Most cameras, on the other hand, cannot.
“The two primary types of HDR images are computer renderings and images resulting from merging multiple low-dynamic-range (LDR) or standard-dynamic-range (SDR) photographs. HDR images can also be acquired using special image sensors, like an oversampled binary image sensor.” (Source article link.)
RED website: “High dynamic range (HDR) imaging is a powerful technique that combines multiple exposures into a single frame that encompasses the brightness range of the entire set. This has been an established technique with stills photography, but has only recently emerged as a possibility with motion capture—where the applications are even more expansive.
“A camera’s dynamic range describes how much subject intensity [gray-scale values] can vary before being recorded as featureless black or white. Having a higher dynamic range therefore improves exposure and post-production flexibility, expands the possibilities for dramatic and unevenly-lit scenes, and enhances image quality and detail.
“Dynamic range is typically specified in terms of stops, where each unit increase translates into a doubling of dynamic range.” (Source article link.)
NOTE: The RED article also has some nice images illustrating the technology.
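RED's point about stops can be made concrete: since each stop doubles the dynamic range, a camera's usable contrast ratio is 2 raised to the number of stops. A minimal sketch (the stop counts below are illustrative round numbers, not manufacturer specifications):

```python
# Each stop of dynamic range doubles the ratio between the brightest
# and darkest values a camera can record, so the contrast ratio is
# simply 2 raised to the number of stops.

def contrast_ratio(stops: float) -> float:
    """Return the brightest:darkest ratio for a given number of stops."""
    return 2.0 ** stops

# Illustrative (not manufacturer-verified) stop counts:
for label, stops in [("consumer camcorder", 10),
                     ("modern cinema camera", 14)]:
    print(f"{label}: {stops} stops = {contrast_ratio(stops):,.0f}:1")
```

So going from 10 to 14 stops is not a 40% improvement, it is a 16x wider range of recordable brightness.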
Many cameras today can already capture an HDR signal, provided they record 10-bit or greater video in a RAW or Log format. The problem is that we can't monitor it.
MONITORING IS THE KEY
Currently, neither the Mac nor the Windows operating system supports more than 8-bit gray scales. (The 5K iMac has 10-bit display technology built in, but it isn't turned on.)
We are all familiar with the 256 gray-scale values in 8-bit video. Why 256? Computers store video – and everything else – as combinations of 1s and 0s. Two (for the 1s and 0s) raised to the 8th power (for the bit depth of the video) equals 256 discrete gray-scale values, and slightly more than 16 million colors.

10-bit video provides up to 1,024 gray-scale values, with more than a billion different colors!
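That arithmetic is easy to verify: 2 raised to the bit depth gives the levels per channel, and cubing that (red × green × blue) gives the total color count. A quick sketch:

```python
# 2 ** bits gives the discrete gray-scale levels per channel;
# cubing that (R x G x B) gives the total number of colors.

def levels_per_channel(bits: int) -> int:
    return 2 ** bits

def total_colors(bits: int) -> int:
    return levels_per_channel(bits) ** 3

for bits in (8, 10):
    print(f"{bits}-bit: {levels_per_channel(bits):,} gray-scale values, "
          f"{total_colors(bits):,} colors")
# 8-bit  -> 256 levels, 16,777,216 colors ("slightly more than 16 million")
# 10-bit -> 1,024 levels, 1,073,741,824 colors ("more than a billion")
```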
However, the problem is deeper than just shooting higher bit-depth video. There are three core issues at play here:
- The operating system and related video applications need to support more than 8-bit video.
- Display hardware needs to support more than 8-bit video.
- There needs to be an industry-wide standard on how HDR video will be displayed.
While I don’t want to minimize the challenges of writing operating system software or display drivers, these are known problems with fairly easy solutions. (And, in fact, current display cards from both AJA and Blackmagic Design support HDR video signals.)
The hard issue, because it's part technical and part political, is picking the right display standard. Forbes writes: “While High Dynamic Range (HDR) picture technology has been a ‘thing’ since Samsung unveiled its debut SUHD TVs this time last year, its usefulness as a home entertainment technology has been severely undermined by a lack of any sort of serious standardization.
“Aside from a broad consensus that HDR should deliver an expanded luminance range to deliver brighter, more contrast-rich images along with enhanced colour performance, there have been no hard and fast rules on how far a display or content source needs to go to actually deliver an HDR performance worthy of the name.
“In other words, HDR has essentially been operating for 12 months in a near wild west environment in which nobody has really had a clue as to what sort of specifications constitute either a true HDR-capable television or a true HDR video source.” (Source article link.)
Currently there’s a wrestling match between multiple standards:
- Ultra HD Premium
- Dolby Vision
- Rec. 2020 combined with SMPTE ST 2084
Not to mention proprietary capture formats like HDRx from RED.
Robert Cringely, professional rumorist, wrote in his Feb. 9, 2016 column: “I think Cupertino has finally figured out a way to grab an important and profitable part of nearly all TVs, controlling the future of video entertainment in the process…. Cupertino has found success over the years by proposing new technical standards and also by taking the lead in product categories that were already considered mature.
“…The technical area I think Apple wants to dominate is High Dynamic Range (HDR) video — technology for making displays of all types brighter and display more colors. Right now there are two major video advances entering the market — 4K and HDR — and tests show HDR is a much more obvious improvement than 4K for viewers.
“…Apple could be preparing its own 4K HDR OLED TV for sale but I don’t think so. A better move for Apple would be to take command of the whole HDR video wave by acquiring a dominant position in HDR intellectual property….
“By controlling HDR Apple could come to control the entire video ecosystem. And the best way for Apple to control HDR would be by acquiring Dolby Labs…. Dolby Vision (Dolby’s HDR product) isn’t a video grading system but an end-to-end video distribution solution.” (Source article link.)
As an Apple-connected friend writes: “Apple’s level of commitment to the Pro Space notwithstanding, HDR is ultimately a consumer visual standard. For that reason, HDR is pretty much compulsory for them in all their displays, in every device. They pretty much have to do this in order to try and move iPhone and iPad sales off of their current sales plateau. Also, they cannot afford to [get left] behind on a standard as fundamental as HDR.
“People need to talk more about this because it’s far too important to allow a mess of standards for something this fundamental. [Image] acquisition for HDR will be a much less painful transition than for postproduction. THAT will be [both] painful and expensive. …The industry needs to be as single-minded and focused as possible to ease the pain.”
HDR trumps resolution. I saw a demo of Dolby Vision more than two years ago at Dolby Labs. Given a choice between watching a 4K image or an HDR image in standard-definition, you’d watch HDR. Compared to HDR, resolution becomes essentially irrelevant.
NAB this year will be very interesting. Unless I’m totally off-base, you’ll see a ton of HDR monitors at NAB – which is great. The question you need to ask, though, is which display standard they support, and whether any two manufacturers support the same one.
4K may be the buzz, but HDR will change our lives – if only we can figure out a consistent way to display it during capture, editing and distribution.
Apple has the capability to take the lead here – because they control the entire display process from hardware to operating system to software. The questions are: “Will they?” and “How?”
As always, let me know your thoughts in the comments.