April 2013


Hot Button Discussion

Compression Trends: HEVC and More 

By Michael Goldman 

 

The July 2012 SMPTE Newswatch covered how the Main Profile of the new High Efficiency Video Coding (HEVC) scheme, technically known as ITU-T H.265|ISO/IEC 23008-2, was on the verge of being finalized--a potentially significant leap forward in helping broadcasters achieve meaningful bandwidth savings in the distribution of high-quality video over the existing H.264|MPEG-4 AVC standard. This past January, as expected, ITU-T Study Group 16 reached consensus on the standard, and MPEG approved it for publication. According to industry experts, HEVC has the potential to reduce data bit rates by as much as 50% compared with the current AVC standard. The new standard includes a Main profile (8-bit support) and a Main 10 profile (10-bit support).
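
To put those figures in concrete terms, here is a minimal Python sketch of the arithmetic. The 8 Mb/s AVC service rate is purely an illustrative assumption, not a published benchmark; the 50% savings is the industry estimate cited above.

```python
# Illustrative arithmetic only: the AVC service rate below is an
# assumed figure; the 50% savings is the industry estimate for HEVC.

avc_1080p_mbps = 8.0                      # assumed 1080p AVC service rate
hevc_1080p_mbps = avc_1080p_mbps * 0.5    # ~50% bit-rate reduction

print(f"AVC  1080p: {avc_1080p_mbps:.1f} Mb/s")
print(f"HEVC 1080p: {hevc_1080p_mbps:.1f} Mb/s")

# The Main vs. Main 10 split is a sample bit-depth distinction:
for profile, bits in (("Main", 8), ("Main 10", 10)):
    print(f"{profile}: {bits}-bit samples, {2 ** bits} code values per component")
```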
 
In line with this development, according to Walt Husak, a longtime SMPTE member and director of image technologies at Dolby Laboratories, some manufacturers were already showing off HEVC decoding hardware tools at the Consumer Electronics Show (CES) and the National Association of Broadcasters (NAB) show earlier this year. All of which raises the questions: what will HEVC's role be moving forward, how will it be integrated into existing business and technical strategies, how might it be extended and improved over time, and is it reasonable to expect it to become a practical global standard anytime soon?
 
"HEVC was originally driven by two things--one was bigger, faster images like 4K and higher frame rates, and the other was being able to do that at really low bit rates for portable devices," Husak says. "Those were two big driving requirements originally. But, as time goes on, other tools and applications will get folded in. When you design something like this, you always lay out your use cases, your application spaces, and then develop a set of requirements [for the standard], and those are used to define which technologies are useful enough. There are all sorts of tools and various methodologies being proposed. Some will be useful to meet the requirements and other ones won't be. So the industry will have to use the requirements to decide what tools go in, or what tools do not go in, or if certain tools go into one profile and not into another profile."
 
Thus, Husak says, "there will be a demarcation between the actual HEVC standard and the implementation of HEVC." By that, he means that HEVC, while largely viewed as "a successor" to AVC, will most likely grow and evolve by fitting itself into evolving industry models as a piece of an ever-growing pie for the foreseeable future, rather than serving as some kind of ubiquitous, one-size-fits-all solution. Proprietary codecs, systems, and strategies--open and closed--abound, and HEVC will need to fit into that reality. What its proper role will be, and what regulatory issues it will pose for the industry, remain to be seen, Husak suggests.
 
"You have open and closed systems," Husak points out. "For a closed system, it is easy to specify whatever you want whenever and however you want, because at the end of the day, you are telling your encoder and decoder vendors what you are going to buy. An example of that would be direct-to-home [DTH] service that controls both ends of the content value chain. Within a closed system, the hurdles for specifying and deploying a new codec such as HEVC are low. An open system is where there is a big disconnect between the broadcaster and the consumer. These systems present some challenges for the deployment of a new codec. For instance, there may be some regulatory issues that need to be addressed in order for all the various devices in the chain to be interoperable. For an over-the-air broadcaster to deploy HEVC as its main service, it would have to replace the MPEG-2 service, which is nearly 20 years old. So, in order to deploy another codec, you can do it off of your main service, which was done for AVC, but then your penetration is really low. Or, you can change your main service, but that typically means you have to go back to the regulatory space for that, so that does make it a bit of a challenge in [fully implementing] HEVC."
 
Additionally, there are a host of new creative possibilities for imagery that interest content creators; HEVC, or any new codec, will have to accommodate those possibilities as time goes on. The advantages as they relate to spatial resolution at lower bandwidths are obvious, Husak points out, but what about "other conventions like higher frame rates? More pixels? Wider color gamuts? Multi-view displays?"
 
Future additions to HEVC, he suggests, could go in the direction of scalable extensions.
 
"For past codecs like MPEG-2 or AVC, the core codec comes out first, and then new profiles for different application spaces, and then MPEG and ITU-T begin working on extensions," he says. "In AVC, there were scalable extensions and multi-view extensions--scalable for more frame rates, bigger images, deeper bit depths, and things like that. Scalability was not successful for AVC; however, the multi-view extensions enjoyed some success in 3D Blu-ray. Now, there are efforts for scalable extensions for HEVC--making it scalable and containing multi-view all in one package."
 
"With so many things going on, as an industry, we have to ask ourselves if [in future development] we should go to each dimension separately, or bundle all the improvements and roll that out? That's a big question. In other words, do we do more resolution this year, higher frame rates the year after that, wider color gamut the year after that? Or do we say, wait a minute, let's collect them all together and give the consumer everything all at one time."
 
According to Husak, one of the tool-set extensions the industry is working on for the codec is range extensions for professional applications. Range extensions, he says, are about allowing HEVC to deliver appropriate quality at different bit rates and operational points, depending on the application.
 
HEVC, of course, is not the only new development in the world of compression. Husak points out that a lot of HEVC's core technology comes directly from AVC and that, at its most basic level, HEVC is just AVC "with more variety of tools," meaning some decoder implementations designed for AVC can, under certain circumstances, be adapted to handle HEVC streams. Moreover, with infrastructure changes costing what they do in the current economy, many broadcasters' AVC-based systems are unlikely to depart wholesale anytime soon; they are more likely to evolve gradually into HEVC-based systems.
 
Other important compression approaches and improvements are also in play, such as JPEG 2000. Husak, a liaison to SMPTE on the JPEG standard, says there is "a lot of discussion inside SMPTE" about expanding that codec's broadcast profiles to give it a wider range of applications and make it an even stronger mezzanine compression choice inside the Interoperable Master Format (IMF). He expects movement on that issue in the coming months.
 
Husak adds that, more generally, a lot of the industry's focus currently revolves around analyzing the strengths and weaknesses of different codecs--comparing 2K and 4K imagery compressed through different schemes under different conditions for different kinds of applications. He also points to industry research into using JPEG 2000 in new ways for light compression work, its incorporation into the digital cinema world, and many other developments.
 
The rise of HEVC and its potential to help make broadband transmission of 4K images an efficient reality is clearly the biggest current step forward in the compression arena, particularly now that it is on the verge of being formally published as a standard. As discussed in the July 2012 issue of SMPTE Newswatch, the Future of Broadcast Television (FOBTV) initiative and other entities continue to study HEVC's potential as a global compression standard that could be the foundation for terrestrial broadcast systems of the future.
 
"It's not hard to draw the conclusion that HEVC will be targeted [by FOBTV] because it is the newest codec and could be fully developed in time for use by the feature television services of the near future," Husak says. "At least in the U.S., the FCC defined the main broadcast service as being an MPEG-2-based service long ago, so the industry would have to go back to the FCC to get that changed. But it makes sense if you think about it. If there was a 2:1 bandwidth savings between MPEG-2 and AVC and there is another 2:1 savings between AVC and HEVC, in the same space that you currently have an MPEG-2 service, you could have four or more HEVC services."
 
"Maybe someday, little or no compression might become feasible. They are working on that--ways to move large amounts of data around using both fiber and copper and banded copper interfaces. A SMPTE working group [24tb-uhdtv] is currently trying to figure out what a 4K broadcast plant would look like. There are a lot changes happening, and what is possible keeps evolving."
 
News Briefs
Flexible JPEG 2000
Speaking of JPEG 2000, Broadcast Engineering recently published an interesting analysis on its website about the codec's potential as a leading mastering compression format. The column, by Jean-Baptiste Lorent, product manager at image data technology company intoPIX, emphasizes that JPEG 2000's "openness and flexibility to provide lossy and lossless compression, progressive and parseable code streams, error resilience, region of interest, proxies, random access" and more, "in one integrated algorithm" make it an excellent fit for production workflows. That's because, frequently, such work needs to address imagery on a frame-by-frame basis throughout what he calls "several encoding/decoding cycles." Lorent also calls the scalable codec "future proof" in the sense that it can handle all resolutions, every kind of color depth, and limitless components and frame rates. 
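
As a minimal illustration of the lossy/lossless flexibility Lorent describes, the sketch below uses Pillow's JPEG 2000 support (which wraps OpenJPEG). The file names and the 20:1 target rate are arbitrary assumptions; the keyword arguments follow Pillow's documented JPEG 2000 save options.

```python
# Minimal sketch of JPEG 2000's lossy/lossless flexibility using
# Pillow with OpenJPEG. File names and the 20:1 target compression
# ratio are illustrative assumptions.

from PIL import Image

frame = Image.open("frame.tif")

# Reversible (lossless) wavelet: decodes bit-exact.
frame.save("frame_lossless.jp2", irreversible=False)

# Irreversible (lossy) wavelet at a ~20:1 target rate.
frame.save("frame_lossy.jp2", irreversible=True,
           quality_mode="rates", quality_layers=[20])
```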
 
Spectrum Woes   
The TV Technology website reported in April on the struggles audio professionals are encountering with wireless systems on major projects. These stem from the loss of available spectrum following the FCC's 2008 wireless spectrum auction (known as Auction 73), which took away bandwidth that television broadcasters had previously used for analog signals. The article suggests the wireless revolution has eaten into the remaining spectrum in certain geographic regions, sometimes causing problems for audio crews or venues with audio infrastructures. As a result, the art of frequency coordination has become critical for staging and broadcasting major sports, entertainment, and political events, and coordination professionals are now in high demand around the industry. The article states that professional coordinators worked to strategically acquire about 7,000 RF channels for the 2013 Super Bowl in New Orleans.
 
Virtual Pioneer
Technology journalist Harry McCracken recently published a fascinating piece in his Time Magazine Technologizer column about a conversation he had with computer graphics pioneer Ivan Sutherland. Sutherland, while still a Massachusetts Institute of Technology (MIT) student in 1963, invented Sketchpad, an interactive drawing-and-design program that served as a foundation for much of today's computer graphics work--all the more remarkable for the era. The program featured an interactive drawing methodology using a light pen and an oscilloscope display.
 
Sutherland patented numerous other inventions and co-founded the pioneering computer graphics/virtual reality technology company Evans & Sutherland with University of Utah professor David Evans. In 2012, he was awarded the prestigious Kyoto Prize for Advanced Technology. To celebrate that honor, he chatted with McCracken about his accomplishments and the role they played in helping to build a foundation for the computer graphics revolution. McCracken pointed out that Sutherland's work deeply influenced mouse inventor Douglas Engelbart and Xerox PARC researchers, among others. The modest Sutherland told McCracken he is currently conducting research on how to speed up computer circuitry by circumventing its reliance on a processor's clock. Most fascinating, however, is the 20-minute video link in the story showing clips, produced at MIT, of researchers demonstrating the Sketchpad interface in an era when the very concept was hard for most people to fathom.