Newswatch July 2011


SMPTE Industry News - Monthly Tech-Focused Newsletter of the Society of Motion Picture and Television Engineers

Hot Button Discussion
Quest for Greater Bit Depth
By Michael Goldman

Deep inside the interconnected layers of the Academy Image Interchange Framework color encoding specification (IIF-ACES) sits the issue of how best to achieve greater, more consistent bit depth in imagery as a crucial step toward improving and preserving dynamic range from capture through post and final exhibition. Like color gamut and spatial resolution, long-held views about what is possible with bit depth have shifted considerably in recent years as sophisticated digital imaging sensors have pushed image capture far past the capabilities of the venerable 10-bit Cineon framework. Some systems now capture data at 14 bits linear; Sony's new F65 4K digital imaging system, in fact, was built from the ground up to be compatible with IIF-ACES standards and can output 16-bit linear RAW data. The industry's challenge, then, is to find the most logistically feasible way to replace the 10-bit log-based system with a new standard that remains consistent and robust all the way down the post-production path and well into the future.

"Everyone in this industry wants more dynamic range, and thus, digital cameras with more F-stops," says Ray Feeney, Chairman of the Academy Sci-Tech Council. "To cover more dynamic range at even the same sized digital increments, you need more bits. It just makes sense, so that is why we want greater bit depth down the post-production chain."

The issue has been under examination for several years by the Science and Technology Council as an important part of the larger IIF-ACES initiative. It also connects to the work done on the Interoperable Master Format (IMF) architecture for moving data during all phases of the digital mastering process: IMF was created as the wrapper format for exchanging and transferring data, while IIF-ACES has been proposed as the encoding standard for the data placed inside that wrapper.

Feeney says great clarity has emerged on the two key aspects of the bit-depth issue. The first is that the industry clearly needs to upgrade from 10-bit Cineon; the second, according to Feeney, is that the 16-bit floating-point OpenEXR format is the correct replacement.

Feeney and other industry experts discussed these conclusions and their implications for the industry in April at an NAB panel discussion. He recently elaborated in a conversation with SMPTE Newswatch.

"The 10-bit Cineon system was brilliant when it was proposed in 1988, but it was designed with film as one of the carrying steps of data, with digital being an intermediate form before the images were recorded out to film, and so, it depended on the nature of the film grain and other factors unique to a film negative," Feeney explains. "Since then, we have basically had only two choices for digital post workflows—10-bit Cineon, a digital intermediate stage in a film-style workflow, or working in video space at REC 709. We feel that this reality has hindered the proper use of digital cinema equipment that has been designed since then. Today's film stocks and digital cameras are all more capable than they were in 1988, when the Cineon design decisions were made. Today's systems already exceed the capability of legacy 10-bit systems, and so, the goal is to preserve and extend that original dynamic range down the line. Everyone studying this universally agrees that the current 10-bit legacy system out there is not allowing (original bit depth captured by) the current generation of image capture devices to make it all the way through the (post-production process). And there are even better systems currently in prototype stages that will only exacerbate the situation."

"Therefore what we decided as part of the IIF-ACES initiative was that if we needed to have a change, we should look at making a large enough leap that is not just a Band Aid," he adds. "We wanted to come up with something with more legs to it. It therefore became clear that if you go up from 10 bits, then 16 bits made the most sense as the next step. We have been working on 16-bit systems that can make use of existing developmental work out there, rather than designing it all from scratch. And that is why we decided that the 16-bit OpenEXR format should be the file container for (data in an ACES workflow)."

16-bit OpenEXR is nothing new, of course. It has been around for more than a decade since its creation at Industrial Light & Magic as an open-source, high-dynamic-range encoding format for visual effects files. Folding OpenEXR into the ACES initiative could allow the industry to standardize the encoding of the reams of data it moves from facility to facility without having to develop a new format from the ground up. In doing so, it can theoretically eliminate much of the "wasted effort," as Feeney describes it, that facilities expend daily transcoding and re-encoding that data over and over as it moves around to produce a typical project's many deliverables.
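
One way to appreciate the leap from 10-bit integer codes to OpenEXR's 16-bit "half" floats is to inspect the format's numeric limits. The Python sketch below leans on numpy's float16, which shares the half layout (one sign bit, five exponent bits, ten mantissa bits); it is an illustration of the encoding's reach, not Academy reference code:

    import numpy as np

    # numpy's float16 has the same bit layout as the OpenEXR "half" type.
    info = np.finfo(np.float16)
    print("largest value:  ", info.max)    # 65504.0
    print("smallest normal:", info.tiny)   # ~6.1e-05
    stops = np.log2(float(info.max) / float(info.tiny))
    print(f"~{stops:.0f} stops between smallest normal and largest value")

Because the exponent floats, relative precision stays roughly constant across those thirty stops, whereas a linear integer encoding spends fully half of its code values on the single brightest stop.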

"This is an unambiguous way to provide materials from one facility to the next," Feeney says. "This way, we hope to allow productions to utilize as much capability as (manufacturers) can put into new digital capture devices, and preserve it all the way through post and into the archive."

The new system is currently working its way through the SMPTE standardization process and has been "pretty well tested in a robust way," according to Feeney. That includes invaluable production trials on shows such as FX Network's "Justified," which produced an entire season of episodes this year using an IIF-ACES workflow. (A detailed account of "Justified's" workflow and methodology appears in the digital edition of American Cinematographer magazine.)

As often happens with evolving technologies, the ability to work in an IIF-ACES workflow, and thus carry extended dynamic range and greater bit depth through the entire process as "Justified" has done, is arriving long before the new standard becomes official or ubiquitous. And while the IIF-ACES format encompasses far more than 10 bits of dynamic range, the ability to see that extended range in realtime on the display side remains limited for now. The industry will therefore continue to see much work displayed at 10 bits for the foreseeable future, notes Dave Schnuelle, Senior Director of Image Technology at Dolby Laboratories, who also participated in the April NAB panel discussion. He explains that there are many practical reasons why conversion to a full-on standard will take time, even once the review process ends and IIF-ACES becomes a formal standard.

"Much of the hardware out there is limited to 10 bits, and we are still working to get more equipment capable of dealing with 12 bits over an HD-SDI cable," Schnuelle explains. "One thing you have to remember with these file formats, whether DPX or OpenEXR, is that they don't flow over an HD-SDI cable into a monitor. There is way too much data there, and so, it has to be transformed into something else that can run down those cables in realtime. We can translate it to a 12-bit logarithmic signal to run down the cable, but at some point, you need to see the realtime image without playing around on a display with DVI connectors. So, for that reason, I think we are not yet done with 10 bits in the short term. However, we are at an inflection point where we are trying to improve the quality of everything in the post-production chain, and these new standards can permit that now, and then spread over time. But in terms of the OpenEXR format and ACES encoding, we don't have a lot left to do to make it work—the format works and the limitations, for now, are at the display end."

All of which raises the question of why improving color gamut, bit depth, and overall dynamic range from capture through post is so important in a world where the ultimate results of that work won't necessarily be viewable by the current end user or consumer. The answer, of course, is that the improvement is crucial for archiving media for any and all future versions and display methodologies that will eventually show up.

Schnuelle calls this "a huge issue."

"Archiving data is why the IIF process and standard will be so useful for many decades to come," he says. "Primarily, it exceeds what we are currently able to see on a display device or even fully use. Right now, the standard probably carries more information than we can hope to use any time soon, in fact. But if you archive something that has more potential than we can currently take advantage of, you do that because we haven't provided displays yet to match the capability of our eyes, and there is room for us to improve there. But when you get those improvements, we'll have preserved the information that will allow those future displays to operate at their maximum potential."

Feeney agrees, and suggests that achieving the IIF-ACES standard, including far richer bit depth in imagery, will be central to the rising industry of data archiving.

"There are two distinct issues about archiving," says Feeney. "The first is to successfully save the ones and zeroes and be able to get them back in the future, and the second is to actually do something meaningful with them once you do get them back. This unambiguous global encoding mechanism can serve that goal in multiple ways because we are talking about preserving the original digital master."

Thus, at the end of the day, having higher-dynamic-range data to begin with, and being able to preserve that dynamic range as the media is stored and moved around over time, matters greatly to content creators. Therefore, both Feeney and Schnuelle suggest, finalizing and implementing the IIF-ACES workflow as soon as possible is good business for studios today and going forward, and will creatively benefit both content creators and the audiences who view their work in the long run.

"With higher dynamic range upon image capture, you can capture a more diverse spread of light levels, allowing for greater creativity for the director of photography, and then seamlessly adjust them for delivery purposes during the actual (digital intermediate process)," Feeney says. "So the filmmaker will benefit and the studio will benefit because they will finally be able to create more deliverables for less work. After all, in recent years, we have had an explosion of required deliverables ranging from theatrical and broadcast to computers, iPads, mobile devices, and much more. A benefit of the IIF-ACES system is that transcoding to all those different things will be easier. Rather than just trying to emulate the theater screen on an iPad, it allows someone to actually re-direct the material in a way that is more optimized for the iPad viewing environment. The net result is higher quality to the consumer at the end of the day."

Thus, Feeney suggests, as with the digital cinema specification before it, the timeline for implementation and proliferation of the IIF-ACES system will largely depend on how long it takes for the major studios to get together to officially sew the system into larger strategies for archiving digital assets on a long-term basis.

"If you recall, the DCI spec did not go easily," Feeney recalls. "It required studios to finally say, 'if you want our material, you have to do it this way.' There was reaction to the DCI spec from manufacturers who wanted to build simpler systems at that time, but fundamentally, studios control what happens and they accomplished something great for the industry when they committed to the DCI spec. I believe this spec will come down to the same sort of thing. If studios see value in their archive and value in re-purposing that material in the future, and recognize and support the complexities of doing post-production today, they will require this format from their suppliers, and in turn, their suppliers, like the DI houses, will require the equipment manufacturers who supply them to support the new system, as well. The studios are what made digital cinema happen and I expect they will do the same thing here."

Opinions expressed in SMPTE Newswatch do not necessarily reflect those of SMPTE. References to specific products do not represent an endorsement, recommendation, or promotion.


News Briefs
Broadcasting from the Cloud

As cloud-based computing continues to revolutionize various industries, few stand to use the concept as innovatively as broadcasting, an industry that desperately needs the time savings, reduced render times, security, backup, and instant communications the cloud provides. Greg Lennon of Chyron Corporation recently penned a comprehensive review of the history of cloud computing and its introduction and proliferation in the broadcast industry for Broadcast Engineering magazine. Lennon suggests that the cloud's ability to integrate into existing workflows, its ease of use, and its fit with a broadcast operation's specific requirements and personnel are among the factors broadcasters need to consider when working out how to integrate cloud technology with existing infrastructure.


Google+ Privacy

A recent article in TechNewsWorld suggests that the newest social media phenomenon, Google+, is maturing at a truly startling rate, one far faster than the industry's ability to work out how to protect consumers from the privacy questions that come with such technology. The article cites research showing that Google+ grew from launch to 20 million unique worldwide visitors in just 21 days (between 29 June and 19 July 2011), and that the number of Google+ members in the United States rose more than 80% week-over-week in mid-July. All this success and potential for the first big rival to Facebook has led, the article says, to indications that Google will move aggressively into the social gaming market. The article suggests this means user privacy will once again be a major concern for the service's users, as it has been with Facebook. The analysis illustrates a major, largely unanticipated social concern emerging from the remarkably fast rise of compelling new consumer technologies.


3D Smart Phones

The stereoscopic craze has now spread to smart phones, but a recent analysis at the CNET News site questions whether this is a significant development or a quick flash in the pan. The article points out that LG Electronics' Thrill 4G phone is the latest to offer stereo viewing of certain videos, following on the heels of the HTC Evo 3D, with others on the way. However, the article notes that some users, reviewers, and industry experts say the 3D features in such devices are less than mature, sometimes less than pleasing to the eye, and short enough on available content that they may not be worth a consumer's while.


 




SMPTE Events

8-13 September 2011
IBC Conference
Amsterdam, The Netherlands

24-27 October 2011
SMPTE Annual Technical Conference and Exhibition
Hollywood, CA

 

 
