Current Issue - December 2016
Hot Button Discussion
Inside the HDR Spec
By Michael Goldman
As the complex ultra-high-definition (UHD), high dynamic range (HDR) ecosystem continues to evolve on the broadcast front, there are some simple concepts that should be considered as these issues are addressed, in the opinion of Tim Borer, Lead Engineer for Immersive and Interactive Content for the BBC’s Research and Development Division and a SMPTE Fellow. The first concept to understand is that what used to be known as “broadcast television” has evolved into “the Wild West when it comes to viewing environments,” meaning that broadcasters cannot possibly have any idea what viewing device or environment consumers will be using to watch their content at any given moment. Secondly, when addressing the issue of improving dynamic range and brightness values on consumer televisions and the content being viewed on those televisions, manufacturers and broadcasters would do well to keep the first concept in mind. They should avoid thinking in terms of a single, one-size-fits-all solution when it comes to distributing programming for the new generation of HDR displays.
Borer is the co-developer of the BBC/NHK Hybrid Log-Gamma (HLG) HDR standard, along with the BBC’s Andrew Cotton and engineers from NHK. As discussed last summer in Newswatch, HLG is one of two major HDR standards to widely impact the industry in recent years, following Dolby’s PQ Curve, or Perceptual Quantizer (PQ), approach. PQ subsequently served as the basis of the SMPTE dynamic range electro-optical transfer function now known as ST 2084, while HLG was standardized by the Association of Radio Industries and Businesses (ARIB) as STD-B67. This past July, the next step in the industry’s pursuit of a single, over-arching HDR standard took place when these two approaches were brought under a single umbrella by the ITU, which ratified the recommendation known as ITU BT.2100. The ITU followed that up with an accompanying report, BT.2390-0, “which gives a lot more background and detail and informative text and explanation about the systems and how they work,” according to Borer.
He suggests, however, that if one examines the history and background of PQ and HLG, and the way the ITU brought them together, one will discover not only why the industry needs to accommodate the subtleties of competing approaches to high dynamic range, but also why this reality is emblematic of a wider industry trend where many video standards are concerned these days. That trend, in Borer’s words, revolves around the fact that “we are continually being pushed by the consumer electronics manufacturers, who are always looking to introduce new products to consumers. I think that means the process of standardization has had to move faster [in recent years] to keep up with advances in consumer electronics technology. That has certainly been the case with HDR standardization.”
Borer explains that the ITU’s creation of a detailed HDR standard was hardly the end of the discussion, but rather the start of a process of “frequent review” aimed not at creating any new standards or approaches, but rather at continually fine-tuning what already exists. In fact, he points out that once the ITU agreed on the new standard in February 2016, even before it was ratified, “because these technologies were so new, the ITU agreed that it would revisit it in October. There was another meeting in Geneva, where some editorial changes were suggested. There was nothing major, but a number of administrations wanted to consider further editorial changes, so any changes to the recommendation were postponed until 2017. These will be clarifications and modifications to the way the specification is written.”
“So you see we have an ongoing process to clarify [the specification] further, so we can make any kind of minor changes that might be necessary. Of course, any changes have to be done in a backward compatible way. We are not talking about having a fluid standard. We are just talking about ensuring we have the best way of specifying it.”
Borer states that BT.2100 includes new features such as the Dolby-developed ICtCp color representation, which can be applied to either PQ or HLG, as well as both narrow and full signal range characteristics. “That means the standard now takes into consideration the narrow signal range of conventional broadcast television, which gives you a bit of headroom and footroom above and below the signal to allow for processing overshoots, and makes it easier to set up displays using a PLUGE signal. It also considers the fuller signal range of the feature film industry, which typically does not accommodate headroom at the top and bottom,” he says.
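To make the headroom/footroom distinction concrete, the following minimal Python sketch (not from the article; the function name and structure are illustrative assumptions) shows how a normalized signal maps to 10-bit code values in the narrow "legal" range used by conventional broadcast, compared with full range:

```python
def quantize_10bit(e, narrow=True):
    """Map a normalized non-linear signal e in [0, 1] to a 10-bit code value.

    Narrow (legal) range reserves footroom below code 64 and headroom
    above code 940 for processing overshoots; full range uses 0-1023.
    """
    if narrow:
        code = round(64 + 876 * e)   # black at code 64, nominal peak at 940
    else:
        code = round(1023 * e)       # black at code 0, peak at 1023
    return max(0, min(1023, code))   # clip to the 10-bit container

print(quantize_10bit(0.0), quantize_10bit(1.0))                # 64 940
print(quantize_10bit(0.0, False), quantize_10bit(1.0, False))  # 0 1023
```

The spare codes above 940 and below 64 are what allow transient overshoots from filtering or processing to survive without hard clipping, and the defined black code is what makes PLUGE-based display setup straightforward.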
Borer emphasizes that accommodating characteristics unique to both broadcast and feature film formats is crucial “simply because the industries are converging, and so, we have to weave that through the data and the specification.” That is why a part of the ITU specification even deals with signals in a floating-point format, although that format is not typically applicable to broadcast television. “But it is now part of the Academy’s ACES color management format, which makes it important to content creators and distributors,” he adds.
In fact, he emphasizes, the entire reason that PQ and HLG are both factored with equal weight into BT.2100 is the issue of convergence between the broadcast and cinema industries. The history of each format, he says, illustrates why this is important.
The BBC, he recalls, decided to pursue development of what eventually became HLG a few years ago after being “blown away seeing the quality of the pictures” produced using Dolby’s then brand-new PQ system. But officials eventually thought that specific approach “would be difficult to use in the context of a complicated broadcast infrastructure like the BBC’s.”
“In particular, we felt that the PQ solution required extensive metadata, because it evolved out of the movie industry, and was looking at the problem from that industry’s perspective, meaning the images were made for viewing in controlled environments, like cinemas,” he says. “They designed the PQ technology’s signal to represent what you would see on the final production screen—the final grading monitor’s brightness of each pixel on that monitor. That means their system has to transmit the details of that monitor, and what the viewing conditions were at that time in order to reproduce the artistic intent for that picture; this requires metadata to be part of the signal. We felt that would be acceptable for a controlled environment, if you were in a cinema or home theater, or otherwise in ideal conditions, which is fine for packaged media or for OTT delivery. Some [monitor] manufacturers use Dolby’s proprietary Dolby Vision system to increase and manage the brightness of the signal, but of course that is a proprietary system and not everyone will have that. When trying to view pictures in a wide range of conditions, we didn’t think that was ideal for conventional, linear TV broadcast channels like the BBC and others,” Borer states.
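Borer's point that a PQ signal "represents what you would see on the final production screen" follows from the transfer function itself: the ST 2084 EOTF maps a non-linear signal value directly to an absolute luminance, up to 10,000 nits. A minimal Python sketch of the published PQ decoding curve (the constants come from the specification; the function name is an illustrative choice):

```python
# PQ (SMPTE ST 2084) EOTF constants, expressed as the exact rationals
# given in the specification.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(e):
    """Decode a PQ non-linear signal e in [0, 1] to absolute
    display luminance in cd/m^2 (0 to 10,000 nits)."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))  # 0 nits: signal floor is absolute black
print(pq_eotf(1.0))  # 10000 nits: signal peak is an absolute luminance
```

Because the decoded value is an absolute luminance rather than a fraction of whatever the display can do, a receiver whose display cannot reach the mastering monitor's peak needs to know about that monitor, which is where the metadata Borer describes comes in.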
“This is why the BBC got involved in developing our own standard—to find something less dependent on metadata that would work better for our production environment and translate better in diverse and perhaps less-than-ideal viewing conditions, such as we see in the linear, broadcast world. Eventually, we met with our colleagues at NHK, and they were like-minded because they have a broadcast infrastructure similar to our set-up, and so, their perspective was similar to ours. We amalgamated our efforts with theirs and came up with a common solution for conventional, linear TV broadcasters, and that is how we came up with HLG. Our approach was to encode what you have coming directly out of the camera, so that the HLG signal does not represent the picture on a grading monitor, but rather the signal from the camera, more like conventional television. Therefore, we do not need to convey metadata about a grading monitor, so the process involves less metadata to transmit to the display, and that is better for our production infrastructure. Modern consumer UHD displays often have ambient light sensors and measure the brightness of the viewing environment, so they try to do the best possible job of rendering the incoming signal for the actual viewing environment and the capabilities of the display. The production standards provide a guide to consumer manufacturers about how they may wish to render the signal for particular displays, but that is only a guide. Most of the newer consumer TVs that we have measured perform fairly closely to the actual production specs. But in any case, the guide is not a requirement, because the ITU spec is a specification for television production, not specifically for distribution. Distribution specifications are produced by organizations like DVB [the Digital Video Broadcasting consortium] and, of course, the ATSC.”
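The curve that encodes "what you have coming directly out of the camera" is the HLG OETF, which maps relative scene light to a signal value: a square-root segment for dark and mid tones, much like conventional camera gamma, and a logarithmic segment for highlights, hence "hybrid log-gamma." A minimal Python sketch using the constants published in BT.2100 (the function name is an illustrative choice):

```python
import math

# HLG (ITU-R BT.2100) OETF constants
a = 0.17883277
b = 1 - 4 * a                  # ~0.28466892
c = 0.5 - a * math.log(4 * a)  # ~0.55991073

def hlg_oetf(e):
    """Encode normalized relative scene linear light e in [0, 1]
    to an HLG non-linear signal in [0, 1].

    Square-root segment below 1/12 (like conventional gamma),
    logarithmic segment above it for the extended highlight range.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return a * math.log(12 * e - b) + c

print(round(hlg_oetf(1 / 12), 3))  # 0.5 at the segment crossover
print(round(hlg_oetf(1.0), 3))     # 1.0 at peak scene light
```

Because the signal describes relative scene light rather than an absolute luminance on a particular grading monitor, the display can scale it to its own capabilities and viewing environment without per-program metadata, which is the operational advantage Borer describes for linear broadcast.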
As the ITU standardization process moved along, according to Borer, it eventually became clear that “a somewhat unusual rationale” was needed for the ITU standard due to the reality of “the convergence of the TV industry, the movie industry, and the Internet via OTT distribution.” Thus, the decision was made “not to end up with a single technology from one side or the other [as the basis for the standard].” Instead, both PQ and HLG were incorporated into the ITU standard and “were recognized as having somewhat different applications.”
Borer says a variety of recent developments illustrate that, “as we move into 2017, the HDR landscape is evolving rapidly to accommodate both formats of HDR—PQ [HDR10] and HLG [HLG10].” One example of this trend is how consumer electronics standards are evolving, he suggests, as far as HDMI is concerned. He points to the Consumer Technology Association’s DTV profile for uncompressed high-speed digital interfaces, known as CTA-861-G, which incorporates both PQ and HLG, and to the announcement that HDMI 2.0b, which is based on CTA-861-G and is backward compatible with earlier versions of the HDMI specification, will support HLG.
There is also the question of whether the industry has reached the limit of where dynamic range ultimately needs to go, or whether there will eventually come a time when merely reviewing and tweaking the standard won’t be enough, when a whole new approach will be needed. Borer doubts that day will come any time soon, largely because in the era of 4K/UHD/HDR viewing, “we are already approaching the limits of human vision for appreciating dynamic range anyway.”
“The eye only has a capacity to see a dynamic range of about 10,000 to one in any given picture,” he says. “To put that in context, the eye can see a phenomenal range of brightness in actuality. You can see a very dark scene under starlight and very bright scenes under a mid-day tropical sun, and there is a vast difference in brightness between those two examples. The reason you can see that wide range is that your eye adjusts itself. The eye adjusts very differently on a dark night, under moonlight, than it does in the middle of the day in the tropics. But in any one particular scene, you can only see a difference of about 10,000 to one.”
“So if you have a 1,000 nit display, which is quite typical for a new consumer display—and this year some are getting brighter, to about 2,000 nits, and even higher in some cases—and then you talk about 10,000 to one dynamic range, that means that the blackest black level that you can see would be down at about 0.1 nits, or 0.1 candelas per square meter. Those are the kind of black levels that we are achieving in ordinary living rooms today. Thus, if you increase the brightness of the screen, you also increase the minimum black that you can see, and you do not actually get an improvement in the viewed picture. The current HDR standards are therefore approaching the limits of human vision. However, this does not mean it isn’t possible to have a better standard in the future, but it would be a question of diminishing returns. It would be hard to appreciate subjectively any increase in quality.”
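Borer's arithmetic can be checked directly: dividing a display's peak luminance by the eye's roughly 10,000:1 simultaneous contrast gives the darkest level a viewer can still distinguish in the same scene. A trivial illustrative sketch (the function name is an assumption, not a standard term):

```python
def min_visible_black(peak_nits, contrast=10_000):
    """Darkest luminance (nits) still distinguishable in a scene,
    given the eye's ~10,000:1 simultaneous contrast ratio."""
    return peak_nits / contrast

print(min_visible_black(1000))  # 0.1 nits for a 1,000-nit display
print(min_visible_black(2000))  # 0.2 nits: a brighter peak raises the floor
```

This is the diminishing-returns argument in miniature: doubling peak brightness also doubles the usable black floor, so the perceived dynamic range within a scene does not improve.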
Still, Borer suggests that advances with dynamic range can continue to alter the landscape in ways never originally anticipated when the train first left the station. He points out, for instance, that dynamic range is also applicable to cinematic viewing, which “is a different beast,” because cinemas are “dark, controlled environments.” Thus, modern projection systems are “far less bright” than state-of-the-art modern consumer displays. Outside of brightness issues specific to 3D programming—which are currently being addressed by the industry in its attempt to transition to laser projection systems—the issue of brightness has not generally been as large a debate as it relates to average movie theaters. However, he predicts that debate could someday heat up, particularly as modern cinematic viewing environments move beyond traditional settings.
Borer states that movie theaters are dark and are likely to remain that way because of the limits of projector brightness. However, they have an interesting property: the dynamic range of a cinema is limited somewhat by reflections from the auditorium and, in particular, the audience. Therefore, for any particular scene, you can only achieve a certain dynamic range. “Artistically, we have seen people reduce brightness of movies for dramatic reasons, because they know the audience’s eyes will adapt in the dark theater. The same cannot be applied to television, because people change channels, and there are interstitials and advertisements and such. Thus, we have to maintain a more constant brightness all the way through. This is because film and TV are very different media, even though some of the technologies are converging together,” he says.
“However, people are showing movies outside and in conditions where there is ambient light or less than perfect conditions, so some people may want to see cinema [screens] get brighter as a result. In the future, cinemas may need to move toward emissive displays for this purpose. We have already seen this on cruise ships, where they have open-air cinemas. They use screens with LED arrays, which is an emissive cinema display. Until recently, high resolution was not used, but that is starting to change.”
“Domestic TVs are getting a lot brighter. It is quite common with modern HDR TVs to have a brightness of 1,000 nits, and they are starting to increase to 2,000 and beyond. So if you can have a consumer display with a brightness of 1,000 nits while cinemas are only at 50 to 100 nits, you might wonder whether, for some people, it will someday be hard for cinemas to compete with home theaters. If they want to address that issue, maybe they will want the cinema screen to get brighter, and projectors are not going to get them there. Therefore, they may need to change technologies with self-emissive displays—OLED or LED or some kind of quantum dot display. I don’t think the industry is really thinking that way right now, but I do think it is something that could happen in the future. So maybe we will see television technology converge toward cinema, and cinema technology converge toward television,” Borer says.
As 2017 dawns, the HDR topic shows no signs of abating, and much industry coverage is being devoted to demonstrating and explaining where HDR currently fits in the larger broadcast content creation, delivery, and viewing paradigm, and where it might be heading next. Among the developments, on the topic of streaming UHD/HDR footage, the BBC is currently streaming a limited, four-minute 4K UHD trial of Planet Earth II footage in its UHD/HLG format, which the network is calling “the highest quality the BBC has ever broadcast.” The footage is available through the BBC iPlayer platform, and the network is promoting the notion that the “extreme high contrast” footage in the demo illustrates an ability to use dynamic range to show the natural world in a way that was simply never possible in the past. Meanwhile, the Creative Planet Network recently posted a comprehensive, end-of-year look at some of the creative possibilities for how HDR delivery and display technologies can be utilized for different applications.
Broadcast Look, Forward and Back
TV Technology recently published a roundup from its columnists summarizing major broadcast technology trends as we exit 2016 and head into 2017. Among the topics highlighted as having a major impact on the industry were the over-the-air terrestrial spectrum auction, which thus far has fallen far short of expectations; the rapid progress of A/V-over-IP transport technologies; the rise of social media platforms for video carriage; immersive audio; the completion of the ATSC 3.0 standard; and much more. While those trends involve looking ahead, another TV Technology article takes a look back to highlight the opening of a new museum in Texas dedicated to preserving the history of television and radio broadcast equipment. The article points out that the new Texas Museum of Broadcasting & Communications, located in Kilgore, Texas, is one of several new private institutions that have opened recently for the purpose of preserving and showing rare artifacts from broadcast history. The Texas museum’s collection includes a wide range of videotape recorders, telecine equipment, video and waveform monitors, camera and lighting equipment, and much more, including its “crown jewel”—a restored 1949 DuMont telecruiser, a vintage broadcast truck used by Texas’ first TV station, KBTV.