Newswatch e-newsletter

Current Issue - May 2016


Hot Button Discussion

How Best to Interface HDR into 4K Displays
By Michael Goldman  

When recently asked to identify a pressing issue in the display-manufacturing world, veteran broadcast industry analyst Pete Putman pointed to the challenge of incorporating high-dynamic-range capabilities into the next generation of consumer displays as efficiently and seamlessly as possible. As Putman, a member of SMPTE’s Annual Technical Conference Committee, discussed last year in a Newswatch article, the overall industry trend remains firmly toward phasing out large-screen high-definition televisions and replacing them with Ultra HD (3840 x 2160) TVs. His point, however, was that this transition is still in its early stages, meaning that some image-quality improvements are not yet available in the first generation of large-screen UHD televisions, or are available only in limited or less efficient forms. And none of those improvements, he adds, is more important than the industry’s recent push into the world of high dynamic range (HDR).

Putman therefore advises consumers who are considering a UHDTV to “wait to pull the trigger” until they can be sure the set truly supports HDR before putting their money down.
 
“I did a presentation at the [2016 NAB Broadcast Engineering Conference in April], and the topic was next-generation display interfaces,” Putman says. “I asked how many people had a 4K TV that was about a year or two old, and a bunch of hands went up. I then stated that next year they would have to buy a new one, because their [current] TV likely does not support high dynamic range, unless they managed to purchase one of a handful of LCD TVs that feature quantum dot backlights. The next question I asked was how many people were considering buying a 4K TV right now, and more hands went up. I advised them to wait until about a year from now. By then, I expect a lot of the issues with HDR—display interface compatibility issues, metadata issues, and everything else—to be largely ironed out, and they could then get a UHD TV that would be reasonably future-proofed. But if you buy one right now, I’m not so sure it could display HDR content. First, the TV must use an illumination system that can reproduce a high dynamic range signal. And second, the TV’s display interface—most likely HDMI—must support HDR metadata [HDMI version 2.0a]. This means that when CTA-861.3 HDR metadata extensions are present in the HDMI stream, coming from whatever the content source is, and passing through to the TV, then the TV will configure itself to reproduce HDR video.”

Putman elaborates that only two HDR formats are currently available for consumer displays, and that first-generation HDR televisions are not necessarily future-proofed. The first format, HDR10, is supported by Ultra HD Blu-ray discs and the UHD Alliance, but is currently limited to 10-bit color depth and supports only static HDR metadata. The second format, Dolby Vision, masters content with a 12-bit color depth, supports dynamic metadata, and has a range of other features that make it attractive to many leading display manufacturers, but it requires Dolby Vision software built into the display.
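For readers wondering what that “static metadata” actually contains, the minimal Python sketch below illustrates the kind of values HDR10 carries: the SMPTE ST 2086 mastering-display color volume plus the CTA-861.3 content light levels (MaxCLL and MaxFALL). The field names and example numbers are illustrative only, not the actual byte-level InfoFrame layout.

```python
# Illustrative sketch of HDR10-style static metadata (descriptive field names,
# not the real CTA-861.3 InfoFrame encoding; example values are hypothetical).
from dataclasses import dataclass

@dataclass
class StaticHDRMetadata:
    # SMPTE ST 2086 mastering-display color volume:
    # CIE 1931 (x, y) chromaticities of the primaries and white point.
    red_primary: tuple
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_luminance_nits: float   # peak luminance of the mastering display
    min_mastering_luminance_nits: float
    # CTA-861.3 content light levels:
    max_cll_nits: int    # maximum content light level over the whole program
    max_fall_nits: int   # maximum frame-average light level

# Example: content mastered on a 1,000-nit BT.2020 display.
example = StaticHDRMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
    max_mastering_luminance_nits=1000.0, min_mastering_luminance_nits=0.005,
    max_cll_nits=1000, max_fall_nits=400,
)
print(example)
```

The limitation Putman alludes to is that these values are fixed for an entire program; dynamic-metadata systems such as Dolby Vision can instead update the mapping scene by scene or frame by frame.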
 
“The problem, though, is not all [future content] will be coded in Dolby Vision,” Putman says. “There are other proposed HDR systems out there—a dual HDR/SDR single-stream proposal from Technicolor, consolidated with a similar idea from Philips, for instance. And at NAB, Samsung demonstrated its Dynamic Metadata proposal for a tone-mapping method [dubbed ‘Scene-based Color Volume Mapping,’ which permits the reproduction of the creative intent of HDR content on displays with more limited dynamic range capabilities], which Samsung has submitted to SMPTE [the ST 2094 working group] for consideration as an HDR standard. And then there is the Hybrid Log-Gamma proposal [to the ITU] from NHK and the BBC, which is really an open system suited to bandwidth-limited delivery and is intended to be a way to transmit HDR over broadcast airwaves. Ideally, I would like my new Ultra HDTV first to display HDR, and second, I would like it to support all of these HDR formats.”
 
(This 2015 SMPTE Study Group Report on the HDR Imaging Ecosystem examines the Samsung proposal along with the original Technicolor, Philips, Dolby, and BBC/NHK proposals.)

All of which leads to this question: What is the primary remaining issue to be resolved before the next generation of UHD consumer displays can support whatever flavor of high dynamic range comes down the pike? As Putman alluded to earlier and explained in his NAB presentation, it involves transitioning to faster display interfaces. It’s his view that, in the long run, HDMI interface technology “really isn’t fast enough” for an HDR world. Thus, he has been evangelizing the notion that the industry should consider transitioning to the alternative technologies he discussed last year with Newswatch—DisplayPort™ and superMHL™.

“HDMI 2.0 has a maximum data rate of 18 Gb/s, which is enough speed to transport a 4K Ultra HD signal with 8-bit color depth at 60 Hz using 4:4:4 RGB color resolution,” Putman explains. “It can also handle a 10-bit, 4K signal at 60 Hz, but only with 4:2:0 color. So a potential problem arises with 4K video content at 60 Hz that originates from a computer or computer-like media player in the RGB signal format—this signal can’t pass through an HDMI 2.0 connection if it is encoded with 10 bits per color.
 
“Therefore, the immediate problem I have with it is, since we don’t know how the content is going to get into the TV in the future—what if I’m a gamer and I want to view 4K video that I’m playing off a computer? Or, what if I have a future media appliance that is streaming in a 4:4:4 format, and I want to do true 10-bit, RGB? That is too fast for HDMI—it can’t pass the signal.”
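Putman’s HDMI figures are easy to sanity-check with back-of-the-envelope arithmetic. The Python sketch below is a rough model, not an exact link-layer calculation: it assumes standard CTA-861 4K timing (4400 x 2250 total pixels per frame, including blanking) and the TMDS coding overhead of 10 transmitted bits for every 8 bits of pixel data.

```python
# Rough check of which 4K formats fit within HDMI 2.0's 18 Gb/s aggregate rate.
# Assumes CTA-861 4K timing (blanking included) and 10/8 TMDS coding overhead.
CTA_4K_TOTAL_PIXELS = 4400 * 2250     # active 3840 x 2160 plus blanking
HDMI_20_LIMIT_GBPS = 18.0             # maximum aggregate TMDS bit rate

def tmds_rate_gbps(frame_rate_hz, bits_per_component, chroma):
    """Approximate aggregate TMDS bit rate (Gb/s) for a 4K signal."""
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    data_bits = CTA_4K_TOTAL_PIXELS * frame_rate_hz * bits_per_component * components
    return data_bits * 10 / 8 / 1e9   # add TMDS 10-bits-per-byte overhead

for frame_rate, bits, chroma in [(60, 8, "4:4:4"), (60, 10, "4:4:4"),
                                 (60, 10, "4:2:0"), (30, 10, "4:4:4")]:
    rate = tmds_rate_gbps(frame_rate, bits, chroma)
    verdict = "fits" if rate <= HDMI_20_LIMIT_GBPS else "exceeds HDMI 2.0"
    print(f"4K {frame_rate} Hz, {bits}-bit {chroma}: {rate:.1f} Gb/s ({verdict})")
```

The 10-bit, 60 Hz, 4:4:4 case works out to roughly 22 Gb/s, well past the 18 Gb/s ceiling, which is exactly the gamer and media-appliance scenario Putman describes; the same signal at 30 Hz, or at 60 Hz with 4:2:0 color, drops comfortably under the limit.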
 
Putman believes this is “a big problem” that will only grow exponentially as more high-end 4K content and gaming applications are produced for consumers to view and interact with using home displays.

“The current version of DisplayPort [v. 1.2] is faster at 21.6 Gb/s, and can pass 4K at 60 Hz with 4:4:4 RGB, 10-bit color,” he elaborates. “So, in my opinion, that should be the minimum requirement for any interface that is passing UHD content. Although HDMI 2.0 can pass 10-bit UHD 4:4:4 video content at 24 Hz and 30 Hz, it won’t work at 60 Hz. Dropping the color resolution to 4:2:0 is the only way to make this work. 4:2:0 color is the same resolution we use currently to deliver 8-bit color through broadcast, cable, satellite, DVD, Blu-ray, and streaming video. 10-bit UHD content should really be delivered in a 4:2:2 color format, at the very least.”
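A similarly rough sketch shows why DisplayPort 1.2 clears the bar. DisplayPort packetizes video, so blanking need not be carried at full pixel depth; counting only active pixels and allowing for the link’s 8b/10b channel coding, the 21.6 Gb/s link leaves roughly 17.3 Gb/s of payload, more than a 4K, 60 Hz, 10-bit 4:4:4 signal requires. The figures are approximations for illustration.

```python
# Approximate payload check for DisplayPort 1.2 (active pixels only).
LINK_RATE_GBPS = 4 * 5.4                  # four HBR2 lanes at 5.4 Gb/s each
PAYLOAD_GBPS = LINK_RATE_GBPS * 8 / 10    # 8b/10b coding leaves ~17.3 Gb/s

def active_video_gbps(width, height, fps, bits_per_component, components):
    """Data rate of the active picture alone, ignoring blanking."""
    return width * height * fps * bits_per_component * components / 1e9

for chroma, comps in [("4:4:4", 3.0), ("4:2:2", 2.0), ("4:2:0", 1.5)]:
    need = active_video_gbps(3840, 2160, 60, 10, comps)
    verdict = "fits" if need <= PAYLOAD_GBPS else "does not fit"
    print(f"4K 60 Hz, 10-bit {chroma}: needs {need:.1f} Gb/s of "
          f"{PAYLOAD_GBPS:.1f} Gb/s available ({verdict})")
```

On these numbers, even full 4:4:4 at 10 bits needs only about 15 Gb/s, which is why Putman treats DisplayPort 1.2 as a reasonable minimum bar for UHD interfaces.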
 
Therefore, in Putman’s opinion, “we should move to existing alternatives” rather than continually waiting around for HDMI updates. In the year since he last discussed this issue with Newswatch, what surprises him, however, is that “no Ultra HDTV manufacturers have yet adopted superMHL,” and there has been only limited utilization of DisplayPort with the first generation of consumer UHD televisions.

“superMHL technology is owned by the same parent company [Lattice Semiconductor] that owns the intellectual property and patents behind HDMI, and it solves the problems I outlined with HDMI for [HDR-capable displays],” Putman says. “Currently, it is the fastest interface out there, but no one is using it [for televisions]. I was told a year ago that they hoped to have design wins out by [last] Christmas, but there weren’t any. Everyone seems to be sticking with HDMI for now.

“The other contender, DisplayPort, is usually thought of as a computer interface. We now have DP version 1.4, which supports Display Stream Compression [DSC, a low-latency, lightweight compression system for signals interfacing with displays, also supported by superMHL™] with forward error correction, unlike HDMI. In fact, you might need only 2:1 or 3:1 compression to get an 8K signal through DisplayPort 1.4 or superMHL interfaces. All the recent demos I’ve seen of 8K displays have used either DisplayPort or superMHL with Display Stream Compression. And DisplayPort does not require licensing—VESA [the Video Electronics Standards Association] continues to offer it royalty-free.
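Putman’s “2:1 or 3:1” estimate for 8K is consistent with simple arithmetic. The sketch below compares the active-pixel data rate of an 8K, 60 Hz, 10-bit 4:4:4 signal against the approximate payload of a DisplayPort 1.4 link (four HBR3 lanes at 8.1 Gb/s each, less 8b/10b coding overhead); the numbers are rough approximations for illustration.

```python
# Approximate DSC compression ratio needed to carry 8K over DisplayPort 1.4.
DP14_PAYLOAD_GBPS = 4 * 8.1 * 8 / 10   # four HBR3 lanes, less 8b/10b coding (~25.9 Gb/s)

# Active-pixel rate for 8K (7680 x 4320), 60 Hz, 10 bits per component, 4:4:4.
uncompressed_gbps = 7680 * 4320 * 60 * 10 * 3 / 1e9    # ~59.7 Gb/s

ratio = uncompressed_gbps / DP14_PAYLOAD_GBPS
print(f"Uncompressed 8K/60, 10-bit 4:4:4: {uncompressed_gbps:.1f} Gb/s")
print(f"DP 1.4 payload: {DP14_PAYLOAD_GBPS:.1f} Gb/s")
print(f"Minimum compression needed: ~{ratio:.1f}:1")
```

A required ratio of roughly 2.3:1 sits well within the light, visually lossless compression DSC is designed for, which is why the 8K demos Putman mentions can rely on it.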
 
“Oddly, the current version of DisplayPort [1.2] has been [available] for a few years and is faster than HDMI 2.0, because it can carry 10-bit RGB Ultra HD at 60 Hz. I know a couple of manufacturers have added it to their TVs, labeling it as a computer connection or something like that. But in my view, the whole point of next-generation TV, if we pull back to the 10,000-ft. view, is that we want it to be a significant improvement over the current version of TV, which offers 1080p resolution with a maximum frame rate of 60 Hz and can show deep color, but cannot show real high dynamic range.

“Therefore, as we move to UHD, my feeling is that we should be discarding all the notions that we’ve had about televisions being backward compatible. Every time we’ve had to upgrade technology, going all the way back to moving from black-and-white to color, we’ve had to be backward compatible. We stepped up to HD, but kept an interlaced video format and essentially the same color space. But now, we have an opportunity to adopt a new TV system based on Ultra HD, and say, we’ll add dynamic range—which is definitely not backward compatible—and wider color gamuts and eventually higher frame rates. That means we should adopt a new display interface. That’s the beauty of the Ultra HD system—it can give us a viewing experience a lot better than what we had before. So why use an interface [HDMI] that is not fast enough to pass anything beyond 60 Hz, 10-bit, 4:2:0 color? It’s my view there is no reason to convene a group to talk about HDMI 3.0—we already have the necessary speed in [these other interfaces].”

Putman suggests that this transition also makes the eventual “sunset for 8-bit color” inevitable for consumer displays as the UHD era moves forward, a point he discussed last year in Newswatch. In fact, he says, “there has been progress in that regard” over the past year, which he considers a good thing.
 
“For a long time, flat panels were only 8-bit addressable, so it didn’t matter what content was coming in,” he says. “10-bit content was quantized to 8 bits, and depending on the panel, it sometimes looked really nice. But native 8-bit content [on a modern UHDTV] can reveal contouring and picture artifacts—finite gradations of color where there shouldn’t be any. 10-bit color gets rid of that, and now, we finally have 10-bit addressable display panels coming to market from factories in Asia. I’d like to think that pretty soon all panels going into 4K TVs will be 10-bit natively addressable—I know many already are from [many major manufacturers]. That means 10-bit content coming into that display—even content that was originally authored at 12 or even 16 bits per color, and then quantized down to 10 bits—will look good [once they have the interface issue addressed]. This is important for live UHD video coming in at 60 Hz or 59.94 frames/sec, because we are basically moving away from interlaced video. We’ll eventually be going to all-progressive video, coming in at 59.94 frames/sec, or 60 Hz, progressive scan, with 10-bit color, and it could be in the 4:2:2 format, not 4:2:0.
 
“With all that said, in my view, 8-bit color should be stamped out. There is no reason to have it any more in these displays.”
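The contouring Putman describes is easy to see numerically: a smooth gradient quantized to 8 bits has only a quarter as many steps per channel as one quantized to 10 bits, so the individual gradations are four times larger and far more likely to be visible on a bright, wide-gamut panel. A minimal illustration:

```python
# Count the distinct code values a smooth gray ramp lands on at 8 versus 10 bits.
SAMPLES = 100_000
ramp = [i / (SAMPLES - 1) for i in range(SAMPLES)]   # an ideal, continuous 0.0-1.0 ramp

for bits in (8, 10):
    levels = 2 ** bits
    codes = {round(v * (levels - 1)) for v in ramp}   # quantize to the panel's bit depth
    print(f"{bits}-bit panel: {len(codes)} distinct steps across the ramp "
          f"({levels} code values per channel)")
```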

The “fly in the ointment,” as Putman calls it, regarding this goal is over-the-air broadcast content, which would likely continue to be transmitted in the 4:2:0, 8-bit color format due to bandwidth limitations. “That comes down to transmission demands because, right now, if you are going to encode 4K with H.264 compression, the bit rate is going to be pretty high. But there were a lot of demos at NAB of encoding and decoding with H.265 [HEVC] encoders that were pretty reasonably priced. And from the demos I saw, H.265 can encode 2160p Ultra HD at 60 Hz at data rates as low as 15 to 16 Mbits/sec. In fact, the Fraunhofer Institute showed a coding demo of 2160p HDR content with BT.2020 10-bit color at 16 Mbits/sec, and it was beautiful.
 
“My point is: you can deliver UHD to the home, you can do it with HDR content, and such demos show you can do it at about 16 Mbits/sec. Even if it required higher rates, say 20 to 30 Mbits/sec, that is not unusual for today’s broadband connections. So I think the key to UHD really taking off with all these enhancements and with 10-bit color is going to be the adoption of H.265 for both encoding and decoding. We have the native 10-bit panels, and we will have fast enough interfaces if we make the changes [away from HDMI], so we really just need to do that and get rid of 8-bit color formats, in my view.”
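To put those bit rates in perspective, a quick calculation (approximate, counting active pixels only) shows how much work the HEVC encoder is doing: uncompressed 2160p60 with 10-bit 4:2:0 color runs to roughly 7.5 Gb/s, so a 16 Mbit/s delivery stream implies a compression ratio on the order of 470:1.

```python
# Approximate compression ratio for the 2160p60 HDR-over-HEVC demo Putman describes.
uncompressed_bps = 3840 * 2160 * 60 * 10 * 1.5   # 10-bit 4:2:0, active pixels only
delivered_bps = 16e6                             # ~16 Mbit/s HEVC stream

print(f"Uncompressed: {uncompressed_bps / 1e9:.2f} Gb/s")
print(f"Compression ratio: ~{uncompressed_bps / delivered_bps:.0f}:1")
```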

You can read more of Pete Putman’s post-NAB 2016 thoughts on these and other display trends in his HDTV Magazine column here.

News Briefs
ATSC 3.0 Update

Now that the FCC has put out a joint petition for comment on the proposal for voluntary adoption of the new broadcast media transmission standard, ATSC 3.0, interest in the new standard has been heating up. As reported by Deborah McAdams in a TV Technology column this month, the Advanced Television Systems Committee held its annual meeting recently to report on the progress and promise of ATSC 3.0. The standard is expected to be completed and ready for implementation within a year, with a future-proofing capability built in to accommodate unknown future data formats, described at the ATSC meeting as “generic data” that is “extensible.” Officials at the meeting laid out the overall structure of ATSC 3.0: its foundation and “bootstrap” signaling layer; its transmission methodologies using orthogonal frequency-division multiplexing (OFDM); its protocol layer, which defines broadcast content as data or IP files; its presentation layer, which accounts for 4K UHD, immersive audio, and other high-bandwidth image or sound upgrades; and its software applications layer, which permits an Internet-style viewing experience on certain types of screens. Attendees also heard tutorials, updates on events designed to test various aspects of the specification, and other presentations intended to further the industry’s knowledge of the new standard’s capabilities.

AES Examines VR Audio
The 141st AES Convention in Los Angeles this year will include, for the first time, a conference dedicated to the topic of audio for virtual reality applications. Sound & Video Contractor recently reported that the AES International Conference on Audio for Virtual and Augmented Reality will be held during the AES Convention at the Los Angeles Convention Center, 30 September to 1 October. The main thrust of the conference will be to examine “the creative and technical challenges of providing immersive spatial audio to accompany virtual-reality and augmented-reality media,” according to the article. Topics will include Object-based Audio Mixing for VR/AR Applications, Immersive Audio in VR/AR Broadcast, Live VR Audio Production, Streaming Immersive Audio Content, and several others. You can find out more about the conference here.

Surviving the Death of Moore's Law
A recent report in the MIT Technology Review suggests that technology companies are searching for new ways to increase the efficiency of computers now that the long-predicted demise of the venerable Moore’s Law appears to be happening. Moore’s Law, of course, is named for Intel co-founder Gordon Moore, who postulated in 1965 that the number of transistors on a computer chip would continue to double approximately every two years. For decades, that prediction proved largely accurate, but now, according to the MIT article, Intel itself has lengthened the interval between releases of new transistor technologies, including delaying the 10-nanometer chips it had been expected to introduce later this year, and industry pundits generally suggest that silicon transistors will be able to keep shrinking for only about another five years. The article reports that technology companies are therefore examining new ways to improve chip designs and increase efficiency, particularly by accelerating or improving certain algorithms. For example, some companies are developing “deep learning” chips, an outgrowth of the artificial intelligence field, that process the underlying algebraic computations more efficiently so that machines can learn faster. Others, according to the article, are working on “reconfigurable chips,” and there is also work under way to redesign basic computer and data-center architecture.

SMPTE Centennial - A Growing Movement!
As we mark the 100th anniversary of SMPTE, we are pleased to add YouTube, Fotokem, Blackmagic Design, and Bud Mayo to the list of The Next Century Fund donors. The fund, which has raised more than $1.8 million in committed gifts toward a $4 million goal, supports the Society’s advancement in standards, membership, and education initiatives. The campaign's success is among the highlights of the centennial year and we are grateful for the support of The Walt Disney Company, Panasonic, Dolby, Technicolor, Blackmagic Design, Fotokem, Ross Video, YouTube, Aspera, and individual donors Bud Mayo, Michelle Munson and Serban Simu, Leon Silverman, Wendy Aylsworth, Peter Wharton, Bill Miller, Ellen Sontag-Miller, and Andy Setos.  
 
The 2016 centennial celebration will culminate with the Centennial Gala, which takes place on Friday, 28 October, in the Ray Dolby Ballroom at the Hollywood & Highland complex in Hollywood, California, as the finale to the SMPTE 2016 Annual Technical Conference & Exhibition.