Hot Button Discussion
Higher Frame Rates
By Michael Goldman
When James Cameron showed up at CinemaCon 2011 in Las Vegas earlier this year to give a high-frame-rate 3D demonstration promoting the notion that the venerable 24 fps (frames per second) standard for feature film exhibition needs to be re-examined in the digital cinema era, his was hardly the first voice on the topic. Indeed, legendary visual effects guru and filmmaker Douglas Trumbull has been trying to interest the industry in higher frame rates since he debuted his 60 fps Showscan process in the 1980s. As far back as 1983, he planned to exhibit his movie Brainstorm at 60 fps, a plan that simply was not feasible in that era for even limited exhibition and was scaled back. Brainstorm was eventually released as a traditionally exhibited feature, and Showscan went on to be used over the years to create and exhibit realistic action in large-format ride films, but it failed to catch on as an alternative for conventional theatrical exhibition.
But the comparison test presented by Cameron's Lightstorm Entertainment at CinemaCon, combined with additional presentations soon after at NAB 2011 by both Cameron and Trumbull, who is now touting a modern Showscan Digital process, has brought the high-frame-rate conversation front and center once again. A new industry-wide 48 fps or 60 fps exhibition "standard" remains a long way off for a wide range of reasons, of course, but the point of raising the volume on the topic in recent months has been to demonstrate what is possible in terms of an enhanced viewing experience.
Geoff Burdick, VP of production services and technology for Lightstorm and a close colleague of Cameron's, was deeply involved in the CinemaCon presentation. He suggests, in fact, that a substantive industry-wide conversation is what is most needed right now about whether major features can, or should, be captured and/or exhibited on big screens at frame rates higher than 24 fps.
"There have been a number of improvements made across all different types of verticals in this entire industry—picture, color, sound, audio, visual effects, presentation, everything," Burdick says. "But frame rate has remained effectively a constant for feature film exhibition, notwithstanding some variations that existed in the infancy of exhibition.
"One drawback of 24 fps is the strobing and motion artifacts which are simply innate at that limited number of frames," he adds. "If you have a static shot, it is something that might not be noticeable (to the viewer). But if you have a moving camera and a lot of vertical elements in your shot, and you are doing a whip pan or a diagonal pan-and-tilt, then it can be very distracting."
Therefore, Burdick adds, "folks have been looking at potential solutions in terms of applying different kinds of algorithms to the content. But, really, it comes down to acquisition and exhibition—upping the frame rate for them."
In other words, Burdick feels that as the ability to capture and exhibit imagery at higher frame rates becomes easier with technological advances, the industry's level of interest in addressing this issue today should logically be greater than in the past. That's because, much like 3D, higher frame rates are something that viewers can and will immediately notice "and gauge the benefit right away," adds Burdick. "It's not something that has to be explained to them—they'll get it (when they see it). And they'll get an enhanced viewing experience."
Thus, Lightstorm brought its high-frame-rate 3D demonstration to CinemaCon. Imagery was captured for the test using Arri Alexa, Red Epic, and Vision Research Phantom Flex cameras configured in Cameron-Pace rigs, all digital cameras capable of capture at multiple frame rates. Material was shot at 23.976, 24, 47.952, 48, 60, and 120 fps at a wide range of shutter angles and then screened at 24 fps, 48 fps, and 60 fps per eye. Playback relied on four Christie 2230 2K DLP Cinema projectors, with content encoded as JPEG 2000 and played through four Doremi V1 HD players. The basic concept was to configure the playback equipment in two matched groups: two projectors, two processors, and two 3D polarizers in each. RealD provided the 3D projection system, using four customized static-polarizer versions of its RealD XL system, and supplied RealD polarized glasses for the audience to view images on a 70-ft. screen.
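The relationship between the shutter angles and frame rates mentioned above is worth making concrete. Shutter angle and frame rate together determine the exposure time per frame: exposure = (angle / 360) / fps. The sketch below uses a conventional 180° shutter as an example; the specific angles used in the Lightstorm demo are not stated in the article.

```python
# Shutter angle + frame rate -> per-frame exposure time. The 180-degree value
# here is a conventional example, not a figure reported from the demo.

def exposure_time_ms(shutter_angle_deg: float, fps: float) -> float:
    """Per-frame exposure time in milliseconds for a given shutter angle."""
    return (shutter_angle_deg / 360.0) / fps * 1000.0

for fps in (24, 48, 60):
    print(f"180° shutter at {fps} fps -> {exposure_time_ms(180, fps):.2f} ms")
```

This is why shutter angle had to be varied alongside frame rate in the test: holding the angle fixed while raising the frame rate shortens each exposure, changing the amount of motion blur in every frame.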
The presentation was designed to let people see material—sometimes the same camera angle—in different frame rates, one following the other. Burdick insists the exercise illustrated that exhibition at higher frame rates is not only possible, but visually discernible by the average viewer, who could, he says, see a generally more realistic image, minus any visible strobing. He says the whole thing was meant as "a wakeup call, a flag-waving, and not for us to suggest there is a specific business model for deployment. We're not even saying that one particular frame rate is the answer—we demonstrated 48 and 60, but people talk about many others. The demonstration was about showing how great the end user experience could be. Once people see that, then they can have a discussion about it and figure out the best approach toward how to achieve it. My take on it was that a lot of folks went into the demo thinking it wouldn't be as noticeable as 3D, and they were not sure what the benefit would be. But they later left the demonstration convinced."
There are, of course, a raft of obstacles to moving beyond the serious discussion phase on higher frame rates to a meaningful standardization process, if in fact one is needed at all. Burdick points out that high-frame-rate acquisition is nothing new, and a wide range of manufacturers are now producing digital camera systems capable of multiple frame rates, with such technology improving rapidly. Indeed, specialized use of that technology to help create visual effects, action, and stunt imagery for modern feature films has been commonplace for some time. And, as the CinemaCon demonstration illustrated, exhibition tools such as data processors and projectors are also capable of higher-frame-rate processing and are likewise improving.
But the business model may prove elusive. Burdick suggests the situation is akin to the modern rollout of 3D. Will the industry deem it a worthwhile business endeavor? If so, who would pay for converting projection technology in theaters? And what impact would a change to, say, 48 fps or 60 fps have on the post-production chain, on the infrastructures of companies that process imagery for theatrical viewing, and, for that matter, on the structure of the DCI specification itself? The current DCI spec, for example, limits the compressed data rate to 250 Mbit/s, while 3D at higher frame rates might well need bandwidth beyond that limit. And, for that matter, why 48 fps or 60 fps?
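The data-rate pressure on the DCI spec can be sketched with a simple scaling argument. Assuming, for illustration only, that bits per frame are held constant, the required bitrate grows linearly with frame rate and with the number of eyes served; the baseline below is the spec's 250 Mbit/s ceiling itself, which is an assumption of this sketch rather than a measured figure from the demo.

```python
# Rough scaling sketch: if a 2D 24 fps stream sits near the DCI ceiling of
# 250 Mbit/s, keeping the same bits per frame while raising the frame rate
# and adding a second eye multiplies the required bandwidth accordingly.
# Assumed baseline, for illustration only.

DCI_CEILING_MBITS = 250.0

def required_rate(base_rate_mbits: float, fps: float, eyes: int = 1,
                  base_fps: float = 24.0) -> float:
    """Scale a baseline bitrate linearly with frame rate and eye count."""
    return base_rate_mbits * (fps / base_fps) * eyes

for fps, eyes in ((24, 1), (48, 2), (60, 2)):
    rate = required_rate(DCI_CEILING_MBITS, fps, eyes)
    status = "over" if rate > DCI_CEILING_MBITS else "within"
    print(f"{fps} fps x {eyes} eye(s): ~{rate:.0f} Mbit/s ({status} the ceiling)")
```

Under these assumptions, 3D at 48 fps would want roughly four times the current ceiling and 3D at 60 fps five times, which is why the article flags the DCI spec's structure as one of the open questions.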
Not to mention the more basic question of what to do in the meantime, since the ability to capture and/or process imagery in higher frame rates currently outpaces, and will continue to outpace, the ability of the majority of theatrical venues to show imagery at those frame rates in a meaningful way.
"Different groups are looking at how we can approach this in terms of standards and how to make it work with the DCI specification," Burdick says. "That doesn't mean that every theater would be able to do this immediately, but like 3D, they could start offering it (in limited venues) as an enhanced exhibition experience. It can't be something that is universally adopted from day one, but perhaps it can be something that, once they see it, consumers will seek it out and more and more exhibitors will adopt it. In the meantime, as our test showed, you can take content at 48 fps or 60 fps and interpolate it back down to 24 fps for standard exhibition or 35mm filmout, for as long as that is necessary. When will there be a realistic business model (for a wider rollout)? I don't know because that comes down to a lot of factors, but we're just saying it is something the industry needs to seriously discuss."
Some major industry players on the manufacturing side are taking it seriously enough to assist people like Cameron, Trumbull, and others in developing, adopting, and utilizing tools like cameras, image processors, servers, and projectors to make tests like the one Cameron brought to CinemaCon viable. Christie, Doremi, DVS, Red, Arri, Vision Research, RealD, Texas Instruments, Reliance MediaWorks, and Modern VideoFilm, for instance, all teamed up with Lightstorm to make the CinemaCon test work, and, no doubt, meaningful time and cost were involved in their participation.
Burdick calls it "a group effort" and an illustration that "these organizations see the potential benefits in doing this."
Certainly, the original evangelist of higher frame rates sees the benefits, and, like Cameron, Trumbull is still quite actively promoting the notion. At the NAB 2011 SMPTE Digital Cinema Summit, Trumbull pitched Showscan Digital, which is designed to capture imagery at a whopping 120 fps and then interpolate that rate down to whatever looks best and is practical for projects using the process. Indeed, Trumbull told Daily Variety around NAB time this year that he had two feature films in development, to be shot in the virtual production style of Avatar, that he hopes will be exhibited at a frame rate higher than 24 fps. Cameron is talking about higher frame rates for future Avatar sequels as well.
Whether that comes to fruition remains to be seen. But the efforts of Cameron, Trumbull, director Peter Jackson, who has also experimented with higher frame rates recently, and others to promote the notion, along with the help they are getting from major manufacturers, suggest that, at a minimum, the industry "is getting excited" about higher frame rates, in Burdick's words.
"And that's what we want—for everyone to get excited and start talking about it," Burdick says. "Once that happens, we can figure out a game plan and come up with a way to move forward as an industry."
Opinions expressed in SMPTE Newswatch do not necessarily reflect those of SMPTE. Reference to specific products does not represent an endorsement, recommendation, or promotion.
Sony formally debuted its much-ballyhooed F65 CineAlta digital camera system, along with an explanation of what Sony is calling a true end-to-end 4K workflow for the camera and a pricing scale for the F65 and its various potential components (selling for between $65,000 and $85,000, depending on how you rig it). As of mid-September, the camera was bowing for visitors to the IBC 2011 trade show, following a formal introductory event at the Directors Guild of America in Hollywood for invited members of the American Society of Cinematographers and the media. Buzz about the camera's potential has been circulating in the industry for months. It will first be available from Sony and rental house Otto Nemenz International, and will no doubt be appearing on sets across the industry in the coming months. Here's the original release about the camera's official debut, and industry veteran Jon Fauer's analysis.
When the British Office of Communications (Ofcom) recently announced a plan to make the UK the first European country to boost mobile bandwidth through the use of white-space frequencies, it was a significant deal. White space, of course, refers to currently unused portions of the broadcast spectrum, particularly bands that sit in gaps between signal paths used or reserved for analog television and radio. It's an important development, since other data paths, such as 3G, are already well on their way to being maxed out across the globe. Analysts suggest the UK plan could help extend existing data networks and pave the way for new, faster, deeper ones, possibly coming close to doubling the bandwidth currently available in that country. Here is the original announcement from Ofcom and some analysis from Computerworld UK.
Analysts have been debating in recent months whether the 3D phenomenon will have a major impact on home viewing and, correspondingly, on the viability and sales of new 3D-capable flat-panel TVs in the U.S. Despite the ongoing sluggish economy and other barriers, Panasonic, among others, thinks this rollout is inevitable, to the point where the company is moving full speed ahead with the second generation of its 3D televisions, with plans for those new models to hit the market in force in 2012. Panasonic's chief technology officer, Eisuke Tsuyuzaki, recently spoke with Marketing Daily about these plans and Panasonic's view of the 3D home-viewing market, and he had some interesting thoughts on where the trend is heading. Read the interview here.