Hot Button Discussion
By Michael Goldman
Since SMPTE Newswatch last spoke with him last fall about ongoing efforts to develop an international framework of standards for ultra high-definition television (UHDTV), David Wood suggests a key development on the UHDTV landscape has been a sharpening of several basic "world views" about how the next generation--or rather, generations--of broadcast viewing technology should evolve in the coming years. Wood is the Chairman of the International Telecommunication Union (ITU) Working Party 6C and of the group DVB CM-UHDTV, both of which have been examining how to structure a UHDTV rollout in a practical manner.
As explained in the Newswatch last October, that rollout is based on the parameter values given in ITU-R Recommendation BT.2020, which essentially proposed that UHDTV should have two levels. The first is UHD-1 (2160p or 4K resolution), which Wood calls "the simpler level of UHDTV--essentially it is four progressively scanned 1080p HD pictures put together. This 2160p system in itself looks likely to have two forms in practice. The first will have fewer features, apart from static resolution, than the second. There are different views about whether other features are needed for consumer success."
The second UHDTV level is UHD-2, or 4320p, called Super Hi-Vision in Japan, which can be thought of as having 16 1080p pictures as building bricks for an 8K image.
"In a sense, therefore, for practical purposes, we have a three-step UHDTV staircase--the first two steps are 2160p, with and without additional features, and the third step is 4320p," Wood elaborates.
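Wood's "building brick" arithmetic is easy to verify with a quick calculation (a simple illustration of the resolution ladder he describes, not drawn from any standards document):

```python
# Pixel counts for the UHDTV "staircase" Wood describes, expressed as
# multiples of a single progressively scanned 1080p HD picture.
HD_1080P = 1920 * 1080  # one "building brick"

formats = {
    "1080p (HD)": (1920, 1080),
    "2160p (UHD-1 / 4K)": (3840, 2160),
    "4320p (UHD-2 / 8K)": (7680, 4320),
}

for name, (w, h) in formats.items():
    pixels = w * h
    bricks = pixels // HD_1080P
    print(f"{name}: {pixels:,} pixels = {bricks} x 1080p")
```

Running this confirms the ratios in the text: UHD-1 is four 1080p pictures put together, and UHD-2 is sixteen.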
UHD-1, of course, has started to become a reality as manufacturers are currently selling the first generation of such televisions to consumers around the world. But as Newswatch recently discussed, that initial transition is taking place now largely because it makes business sense to make and sell such monitors from a manufacturing and distribution point of view, and not because the industry is ready to broadcast UHD content far and wide to the people who purchase them. Therefore, these early UHD televisions are arriving in a largely high-definition world, where their initial role will be to enhance the HD viewing experience long before UHD content becomes ubiquitous.
Today, Wood says, the question therefore revolves around a debate among those different "world views" over how, when, and to what degree to roll out the three steps of UHDTV in a meaningful way. What "quality factors," as Wood puts it, "need to be part of what stage?"
By quality factors, Wood means combinations of technological enhancements ranging from increased static picture resolution, to higher frame rates, to better dynamic range, to improved sound systems--things that "directly impact the viewer's sense of immersion in the viewing experience." That question, he suggests, is far from settled, because the answer depends on the point of view one brings to the debate.
"The set makers--the manufacturers--tend to believe that fewer quality factors need to be added to make UHDTV saleable to the public right now," Wood explains. "It is perfectly logical that they would believe this, because it gets more expensive to add more of the quality factors, and TV sets will be more saleable the less they cost. For them, the need is for a relatively simple UHD system, and that is what we are starting with the TVs in stores today."
"At the other end of the spectrum are the Hollywood movie studios, and they believe UHDTV needs to offer a really big quality jump. Yes, you need the added resolution and 2160p is OK, but also better frame rates, better colorimetry, three-dimensional sound systems, and of course, the big talking point of the moment--the possibility of higher dynamic range. So that is the world today--different views of what we need to make UHD saleable, what we need to add in terms of quality factors."
With the first generation of UHDTV sets already available and the technology for future generation monitors in development, Wood emphasizes that the discussion naturally moves back to the issue of content--not so much how best to create it, since ultra high-resolution acquisition technology is now becoming well established, but rather, how best to package and broadcast it, and via which medium, in a multiplatform world. Wood further breaks down the issue of what is "saleable" to consumers, based on the reality that we live in a multiplatform, multiformat universe. Thus, he emphasizes that the aforementioned "quality factors" needed are logically likely to end up varying, based on, among other things, how and where the content is going to be delivered.
"The issue before us is what formats should be broadcast to which type of display, what formats should be available on the Internet, and what formats should be available on packaged media--probably principally the successor to the Blu-ray disc," he says. "These are important discussion points at the moment, and there are different views about which quality factors will be necessary for each one. But you can play a hunch for certain things. For instance, it stands to reason that the Hollywood movie industry will be influential in how Blu-ray formats will end up, since that medium is largely shipping their product. So it makes sense that we can expect Blu-rays to have a higher number of important quality factors apart from static resolution, like high-dynamic range improvements, for instance."
"For broadcasters (satellite and earth-bound), the discussion is more about what is needed to broadcast their particular type of content and when. For Pay TV operators, things may be different than for the free-to-air broadcasters. Pay TV broadcasters are likely to be the first ones to broadcast UHDTV regularly, and we should not be surprised to see this in the next year to 18 months from broadcast and cable service companies like Sky Germany, BSkyB, and DIRECTV. They will probably be first up to bat, simply because they already have the bandwidth over satellite and fiber to deliver the signals, with only a need to get a new set-top box to the viewer to make it feasible, provided the viewer buys a 4K television."
That is the key point. No matter the quality of the content and the image data being delivered by such services, Wood adds that at the end of the day, the UHD experience will return to the issue of the capabilities of the monitor the viewer is using. In particular, the issues of extended dynamic range and higher frame rates sit at the heart of the discussion. These issues are crucial because they impact both the technological side of the debate and the content creation discussion, he suggests.
"One school of thought says the TV sets of the future will be far brighter than they are right now," Wood says. "Right now, modern TV sets are around 100 to 200 nits in terms of brightness. Some people believe it won't be long until we see them go far beyond that, as much as 1,000 or even 2,000 nits to allow much brighter elements in the picture. That would permit far higher dynamic range images if there is adequate bit depth--you could see the glint in someone's eye or the detail in a lump of coal, details that are limited in televisions today. Dynamic range could be extended dramatically."
"But then, there is discussion about how valuable this is, what it would bring to the table creatively, and how much it is worth. That is probably the principal discussion in UHDTV today--what will be the peak brightness in TV sets in the near future, and what should we do about it creatively? If it happens, it will obviously improve the viewing experience. It could feel like you stepped into the sunlight, or details would be much clearer in black areas of the image. Some people think this kind of improvement would actually be more valuable for viewers than extra static resolution of the picture," Wood says.
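The scale of the brightness jump Wood describes can be put in photographic terms. The sketch below is a back-of-the-envelope illustration using the peak-brightness figures he cites; each photographic "stop" represents a doubling of luminance:

```python
import math

# Extra highlight headroom, in photographic stops, gained by moving from
# today's roughly 100-200 nit sets to the hypothetical 1,000-2,000 nit sets
# under discussion. Each stop is a doubling of luminance.
def extra_stops(current_nits, future_nits):
    return math.log2(future_nits / current_nits)

print(f"100 -> 1,000 nits: ~{extra_stops(100, 1000):.1f} extra stops")
print(f"200 -> 2,000 nits: ~{extra_stops(200, 2000):.1f} extra stops")
```

A tenfold increase in peak brightness works out to a bit more than three additional stops of highlight headroom, which is why proponents argue it could matter more to viewers than extra static resolution.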
The other main issue involves frame rates. The ITU has recommended a standard that would allow up to 120 frames/sec for quality UHD, but of course, as Wood points out, "those kinds of numbers mean sharper images, but you also need more bandwidth, and the TV becomes more complicated. So there is this tradeoff of trying to know where the affordability point of the set is compared to how much quality improvement you need to make UHDTV successful. At its core, this is what part of the UHD discussion has become."
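The bandwidth side of that tradeoff is easy to sketch. The figures below are uncompressed baseband rates for a 2160p signal; the 10-bit, 4:2:0 sampling assumption is for illustration only, and actual delivery rates depend entirely on the compression codec used:

```python
# Rough uncompressed data rates for UHD-1 at the frame rates under discussion.
# Assumes 10-bit samples and 4:2:0 chroma subsampling (1.5 samples per pixel);
# real broadcast delivery would be heavily compressed.
def raw_gbps(width, height, fps, bits_per_sample=10, samples_per_pixel=1.5):
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

for fps in (60, 120):
    rate = raw_gbps(3840, 2160, fps)
    print(f"2160p at {fps} fps: ~{rate:.1f} Gbit/s uncompressed")
```

Doubling the frame rate doubles the raw data rate, which is the bandwidth cost Wood is weighing against the quality improvement.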
The ITU recommendation for UHDTV is essentially a starting baseline for these debates. But as Wood states, "the questions of which of those options make commercial sense to broadcast--the jigsaw of the cost of equipment compared to the additional quality benefit that makes it more saleable--are the issues that will be in discussion for quite some time. But hopefully, we will eventually arrive at some kind of world consensus about what the second stage of 2160p UHDTV will be."
"We could anticipate, as noted, that in a year to 18 months, we will find pay TV services putting the first stage 2160p UHDTV content on air. That will be phase one--up to 60 frames/sec without higher dynamic range. During 2015 we can hope to define more specifically what the second phase of 2160p should be--can it be the bells and whistles that Hollywood wants? And having that specification, by 2018 or so, we can imagine some of those services would be on air in some form."
"The next step after that would be our third--Super Hi-Vision or 8K, with four times more resolution. We will have to see if that could be a commercially interesting service. A few months ago, I might have said that I didn't see any signs of [manufacturers] showing interest in making Super Hi-Vision TV sets. But now, several of them are investigating how it could be done. By 2020 or so, they might actually be commercially available. All this raises the question, how long will the life of 2160p UHD-1 be? Is it a short-term system? Should we go for it now, and then have to upgrade to a 4320p 8K system in six years or so? These are still difficult things for the industry to know for sure--if I knew, I would soon be very rich."
Remote World Cup
A recent TV Technology article suggests that despite all the hoopla in recent years about both 3D and 4K broadcast experimentation for live sporting events, the 2014 World Cup soccer tournament in Brazil, which takes place in mid-June, will be notable from a broadcast point of view more for live remote production content streaming innovations than anything else. The article states that host broadcaster HBS has strategically installed a total of 234 remote broadcast servers for the event at the 12 soccer venues across the country. This will permit licensed media entities, of which hundreds are expected, to access, edit, and distribute streamed content coming from the venues in as state-of-the-art and comprehensive a way as the industry has ever seen for an event of this size and scope. The article also states the remote streaming content solution being used by HBS for the tournament is designed to be integrated into FIFA's official mobile app and many other mobile apps belonging to licensed broadcasters around the world. By contrast, 4K broadcast tests will be conducted at the World Cup for only three soccer matches, including the championship match, as compared to some 25 matches that were broadcast in stereoscopic 3D during the last World Cup. Still, each of those three matches will be covered with 12 Sony F55 cameras and dozens of specialty cameras, with standard HD content upconverted and added to those broadcasts, according to the article.
Rise of Virtual Reality
Variety's David Cohen penned an interesting analytical piece this month about Hollywood's struggles to figure out how best to use various virtual reality (VR) technologies to create entertainment product. By "virtual reality," Cohen is referring to the creation of artificial, 3D environments that users can enter and, in some fashion, interact with via computer graphics, using some combination of handheld devices, goggles, or earphones. He points out the concept has been around for a while in the form of military and medical simulation technology and, in a limited way, for video games, but only with the advent of smaller, more powerful computer chips in recent years has the entertainment industry started to examine VR more closely for how it can be sold as an immersive product to consumers. Cohen points to Facebook's purchase of VR gaming company Oculus for $2 billion, Sony's development of its own VR gaming system, Project Morpheus, and the rise of dozens of smaller companies interested in this space. He says that Hollywood studios like 20th Century Fox are also exploring ways to use VR to create supplemental content for movie and video game franchise marketing promotions. Cohen suggests that with recent technological advances that can both improve the experience and hold down the price simultaneously, the entertainment industry may be on the precipice of a virtual-reality breakout in the near future. SMPTE's upcoming Entertainment Technology in the Internet Age (ETIA) conference at Stanford University is featuring an interactive exploration of the topic of VR. Cohen is scheduled as a panelist for the 16 June special evening session, "The Holodeck: Entertainment for the Next Generation."
Greener Post Production
Post Magazine recently surveyed several post-production facilities and manufacturers about procedures and policies related to reducing their carbon footprint and being more environmentally conscious as a business practice. Several companies highlighted in the article pointed to policies or technologies that slowly seem to be gaining a foothold in an industry which, according to the article, produces tons of daily waste from electronic gear that can be highly toxic if not disposed of properly. The article discusses a New York-based business called Tekserve, for example, that specializes in offering recycling programs and events to technology companies. A number of facilities featured in the piece discuss a variety of strategies they have used to reduce their footprint in recent years. Such solutions range from going all tapeless; to increased reliance on Cloud storage; redesigning storage area network environments with smaller, more efficient technologies; using energy-efficient organic light-emitting diode (OLED) monitors; making architecture and design choices that incorporate recycled materials and LED lighting; switching to Ecofonts (print fonts that use less ink); providing bicycles to employees; and fundraising and selling gear for environmental organizations, among many other interesting ideas.