Newswatch e-newsletter

Current Issue - November 2015 #2


Hot Button Discussion

Cinematically Immersive Environments
By Michael Goldman  

One of the less well-defined concepts percolating through the media industry right now is the notion of the so-called multi-screen or multi-view environment. The term can be applied in different directions, depending on whether one is discussing home viewing environments, cinematic environments, or virtual environments. It can refer to the use of multiple devices to view and digest content, or to supplement primary content with secondary content, or to the use of multiple screens that form a single image for an immersive viewing experience in a cinematic setting.

Indeed, Peter Ludé, senior vice president at visual technology company RealD, former SMPTE President, and a SMPTE Fellow, suggests a better term than “multi-screen environments” might simply be “immersive environments,” and in that context, cinematic settings are an exciting new area.

“Today, there is a lot of talk about immersive cinema,” Ludé explains. “Content creators are now considering how a theatrical cinema might work with, or compete with, virtual reality or augmented reality technology.”
 
In terms of historical context, Ludé points out that the pursuit of immersive cinematic experiences is nothing new, dating back to the original stereoscopic 3D experiments of the 1930s and to the early 1950s, when the Cinerama theatrical format briefly rose to prominence as an amazingly complicated endeavor for its day.

“That technology had a curved, semi-ellipsoidal screen, and it worked by using three film projectors that were synchronized mechanically via shafts under the floor, connected to the sprocket reels,” he says. “And there was also a separate sound system for Cinerama, which was one of the first surround sound implementations. So somebody sitting in that auditorium would see a very wide horizontal field of view filling his or her peripheral vision from left to right, creating a highly immersive visual experience, coupled with immersive sound. It was very complicated, because it had to be shot with three synchronized cameras and projected with three synchronized projectors, and it needed a crew of six people in the theater to operate the system. They also had technical challenges with keeping the three projectors at the same brightness, synchronized, and blended so the edges were not visible. There was also the problem that when you curve a screen, light reflecting off one side will bounce back onto other parts of the screen, washing out the image and greatly reducing contrast. That was addressed by using louver systems on the screen that prevented light from hitting the wrong areas, but that also created some artifacts when you were trying to watch from certain seating positions.”
 
Today, similar principles are back with a handful of innovative developments applied to the digital landscape, according to Ludé. Among these developments is the Barco Escape system, which debuted a little over a year ago with limited presentations of Fox’s The Maze Runner, followed this past September by Maze Runner: The Scorch Trials as part of the launch of a five-year, multi-film deal between Barco and Fox to release films featuring extended sequences in the Escape format. Ludé suggests that system offers an experience somewhat similar to Cinerama’s, and exemplifies the notion that immersive visuals are something the movie industry is now clearly considering.

“They started by installing those systems in about a dozen or so theaters around the country,” Ludé says. “The idea is somewhat similar to Cinerama in that they are using three screens side-by-side, with the two side screens at oblique angles about 100 degrees off the center screen, and one digital cinema projector imaging each screen.
 
“Of course, nowadays, with digital technology, it is far easier to create the content and to synchronize and balance and operate such systems, so it is much more practical than it was in the 1950s. You still have some of the optical challenges, with light coming off the side screens and reflecting onto the main screen, so they have been working carefully to optimize the performance. The idea is not something that has been widely accepted yet, but it has certainly been experimented with, and that is notable in itself. And it definitely is immersive—the idea is that you are greatly increasing the horizontal field of view.”
 
Indeed, Ludé makes the point that such presentations are built around the notion of pumping more visual information into the viewer’s peripheral vision, which “responds differently, neurologically, than your central vision. The eye is concentrating ahead, and that is where you see fine detail. But on the sides of your field of vision, you get more of a subconscious feeling that you are actually in the environment, and you are much more sensitive to motion cues in your peripheral vision. Right now, filmmakers are experimenting with how to use this wide field of view, but there is no right or wrong answer. It’s really an artistic language that needs to be explored.”

Ludé adds that the idea of such systems is to present a single image, even though multiple screens are required to achieve the illusion. “The purpose is to immerse the user by erasing the screen edges,” he emphasizes. “That’s why the term ‘multi-screen’ doesn’t do it justice. You are really seaming together three projectors, but that is just an implementation detail of today’s technology. Over time, there will likely be other display technologies in the theater. IMAX is already using a pair of high-luminance, 4K projectors [as part of a new system known as IMAX with Laser] to fill up to a 100-degree horizontal field of view.”
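
The field-of-view figures Ludé cites follow directly from viewing geometry. As a rough illustration, the sketch below computes the horizontal angle a single flat screen subtends at a given seat; the screen width and seating distances are hypothetical and not taken from any of the systems described.

```python
import math

def horizontal_fov_deg(screen_width_m: float, viewing_distance_m: float) -> float:
    """Horizontal angle (in degrees) subtended by a flat screen,
    viewed from a seat centered on the screen at the given distance."""
    return math.degrees(2 * math.atan((screen_width_m / 2) / viewing_distance_m))

# Hypothetical auditorium numbers, for illustration only:
print(round(horizontal_fov_deg(12.0, 15.0)))  # rear seat:      ~44 degrees
print(round(horizontal_fov_deg(12.0, 6.0)))   # mid-hall seat:  ~90 degrees
```

Angled side screens, as in Escape, push that angle well beyond what a single front screen can reach, which is the whole point of filling the viewer’s peripheral vision.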
 
He adds that other approaches are raising interest as well. One of them, dubbed Screen X, comes from the Korean conglomerate CJ Group and has been installed in a couple hundred theaters worldwide, mostly in Asia. It provides a primary image on the main, central screen in an auditorium, but also projects specially authored and color-timed supplemental images onto the side walls.

“Screen X appears to work best with medium gray or beige walls, with the surround speakers positioned as unobtrusively as possible,” Ludé explains. “But fundamentally, the additional image content is just projected on the sidewalls. Screen X uses a bank of projectors on the sidewalls, with proprietary image stitching to create one seamless projection. Multiple projectors near the ceiling on the left wall project onto the right wall, and vice versa. Between four and six smaller projectors are typically needed on each wall, so that you might have a main digital cinema projector illuminating the screen, and 12 additional, smaller projectors—six on the left and six on the right—filling the whole auditorium with images. These are seamed together, so once again, the viewer is not thinking of them as separate projections—it’s all one big image.

“So far, they have not used this approach so much for full-length movies, but primarily for creating a memorable surround experience for the pre-show material. While you concentrate on the screen, during the pre-show advertising, you are enveloped in all this visual stimulation all around you, which is intended to enhance the experience. I know they have been working on a couple of Korean feature film titles, so it will be interesting to see how this technology develops.”
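
Screen X’s stitching is proprietary, so the description above is all that is public. As a minimal sketch of the general technique of edge blending, the NumPy function below cross-fades two horizontally adjacent projector images over an assumed overlap region so the seam has no hard edge; a real system would also apply geometric alignment and color matching per projector.

```python
import numpy as np

def blend_adjacent(left: np.ndarray, right: np.ndarray, overlap_px: int) -> np.ndarray:
    """Join two side-by-side projector images (H x W x 3 arrays) with a linear
    cross-fade across their overlapping columns, hiding the visible seam."""
    ramp = np.linspace(1.0, 0.0, overlap_px)[None, :, None]   # 1 -> 0 across the overlap
    left_zone = left[:, -overlap_px:, :].astype(np.float64)
    right_zone = right[:, :overlap_px, :].astype(np.float64)
    blended = ramp * left_zone + (1.0 - ramp) * right_zone
    return np.concatenate(
        [left[:, :-overlap_px, :], blended.astype(left.dtype), right[:, overlap_px:, :]],
        axis=1,
    )
```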
 
Providing multiple screen configurations designed to offer the viewer an immersive cinematic experience inside an actual movie theater, however, is not the only development under the broader heading of a “multi-view environment.” Attempts to combine the cinematic experience, either literally or figuratively, with the virtual reality experience are also under way, according to Ludé.
 
Some of these attempts involve gear already on the market, while others involve early experiments and theories that are now filtering into the industry consciousness, he adds. “There is some general discussion about combining augmented reality headgear with a conventional display,” he says. “For example, a television or a projected image in a movie theater might be supplemented through additional images rendered in AR eyewear.”
 
“The concept is that you could watch a standard movie, but when viewed through your AR glasses, you would see additional characters or ghosts or subtitles, or whatever else the content creator came up with to supplement the main image. As you move your head around, the virtually augmented images would remain totally fixed in relationship to the main screen. Another approach would be to use the AR eyewear to fill out your peripheral vision a bit more, just for you individually. In this case, you might see characters or other elements approaching from the side that are not being projected on the screen. Instead, they are being rendered in your augmented reality glasses. I’m not aware of any public demonstrations for this, but there is experimentation going on in labs, reports in academic papers, and discussions in Hollywood in this regard.”
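
The theatrical AR scenario Ludé describes has no public implementation yet, but the underlying rendering idea is well understood: overlay content is anchored to a fixed pose in the room (the main screen), and each frame it is re-expressed in the viewer’s head frame so it appears to stay put as the head moves. The sketch below illustrates only that transform step, with hypothetical 4 x 4 pose matrices standing in for real tracking data.

```python
import numpy as np

def anchor_in_view_frame(head_pose_world: np.ndarray, anchor_pose_world: np.ndarray) -> np.ndarray:
    """Express a world-fixed anchor (e.g., the cinema screen) in the viewer's
    head coordinate frame. Both arguments are 4x4 rigid-transform matrices in a
    shared auditorium frame; the result is what an AR renderer would use to
    draw screen-locked overlays for the current frame."""
    return np.linalg.inv(head_pose_world) @ anchor_pose_world

# Each frame: re-query head tracking, recompute this transform, redraw the overlay.
# Because the anchor never moves in the world, the overlay stays glued to the screen.
```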

More than theoretical, however, is an application called Oculus Cinema, built for today’s best-known high-end consumer virtual reality technology, Oculus Rift. That application, Ludé points out, is not multi-screen in nature at all; rather, it simulates an immersive cinematic environment by using the VR headset’s strengths to place the user into a familiar setting—a movie theater—where the viewer can watch an actual movie they have downloaded or are streaming, bringing the cinema experience into the home, so to speak.
 
“And if that is not confusing enough, they are also turning the tables by building out immersive environments from existing standard cinematic movies for virtual reality experiences connected to those movies,” he adds. “This was done [late last year] by [Fox Searchlight and partners] for the Reese Witherspoon film, Wild, in which they created a short immersive experience based on the main protagonist in the film joining you, the viewer, in the virtual world [based on an environment in the movie]. That is an example of filmmakers leveraging cinematic scripted stories we are used to seeing on the big screen, and making them into more interactive experiences using virtual reality eyewear. One can argue that is totally different than cinema, and not multi-screen, and not a movie theater, but it is movie content reauthored to be an interactive, virtual experience.”

Such developments are requiring content creators to think about new or additional ways of mastering their content, and how to render it onto new VR platforms. Ludé points out that such work will likely, eventually, force new post-production requirements upon filmmakers that go beyond what the Interoperable Master Format (IMF) was designed for. And the use of new 360-degree, immersive camera systems on set, such as the Jaunt One and the Lytro Immerge now being used by some productions, or simpler systems available from companies like Nokia, GoPro, and others, is likely to become more common.

Ludé says what makes these developments potentially germane to so-called multi-screen cinematic experiences is the fact that they could, in the near future, be used to capture imagery more conducive to creating content for systems like Escape or Screen X, among others.
 
“A lot of this work in immersive image capture is currently driven toward virtual reality headsets, but there is nothing to say they couldn’t also go into an Escape or a Screen X, or some future Cinerama-type surround theater, where you might walk into the theater and, from floor to ceiling, experience screens all around you, or at least, 180 degrees around you,” he suggests. “That could lead to eventually watching motion pictures in a way that is very novel and immersive.”
 
And that possibility, in turn, has implications for the standards community, Ludé adds, simply because “the genie is out of the bottle. Research and development into these applications won’t stop as long as content creators find them potentially compelling to audiences, and for luring new revenue.”

“The good news is the fact that some of the standards are already out there, such as DCI System Specifications and the SMPTE 428 [series of standards for D-Cinema production and post], which could be leveraged to work with immersive imaging [in a cinema setting],” he says. “For example, to project three images, as Barco Escape does, you could leverage all existing SMPTE standards by synchronizing multiple media blocks off a common standardized protocol. In that scenario, there are three images going to three projectors and immersive sound, all synchronized under existing SMPTE standards.
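
The internals of any particular multi-projector playback server are proprietary, but the synchronization idea Ludé describes can be sketched simply: each output derives the frame it should be presenting from one shared reference clock rather than free-running. The following is a minimal illustration under that assumption; the frame-presentation call is a hypothetical placeholder, not a real API.

```python
import time

FRAME_RATE = 24.0                 # frames per second
FRAME_DURATION = 1.0 / FRAME_RATE

def current_frame(shared_clock_s: float, show_start_s: float) -> int:
    """Frame index every synchronized output should be presenting right now,
    derived from the same shared clock and agreed show start time."""
    return int((shared_clock_s - show_start_s) / FRAME_DURATION)

def playout(screen_name: str, show_start_s: float, total_frames: int) -> None:
    """One 'media block': left, center, and right each run this loop against
    the same clock and start time, so their frames stay in step."""
    while (idx := current_frame(time.monotonic(), show_start_s)) < total_frames:
        # present_frame(screen_name, idx)   # hypothetical output call
        time.sleep(FRAME_DURATION / 4)      # poll faster than the frame rate
```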
 
“However, if I am trying to author one large image that wraps all the way around you spherically, those technologies are generally proprietary right now. There are quite a few computer scientists now working to develop improved algorithms for stitching images together, and for re-warping geometry needed for immersive displays. I would expect that, eventually, the question of how you distribute this immersive content, and how you render it to preserve frame rate and color gamut, how you map the image into different display geometries, and many other attributes, will need to be standardized.”
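
As a concrete, if simplified, example of the re-warping Ludé mentions, the sketch below builds a sampling map from a cylindrical wraparound screen back into an equirectangular 360-degree panorama. A production pipeline would add per-projector keystone correction, edge blending, and color management on top of this, and none of the parameter names come from any existing standard.

```python
import numpy as np

def cylindrical_sampling_map(out_w: int, out_h: int, h_fov_deg: float,
                             v_fov_deg: float, pano_w: int, pano_h: int):
    """For each pixel of a cylindrical screen section (viewer at the cylinder's
    center), compute which pixel of an equirectangular panorama to sample.
    Returns (u, v) float arrays that a resampler such as cv2.remap can consume
    after casting to float32."""
    yaw = np.radians(np.linspace(-h_fov_deg / 2, h_fov_deg / 2, out_w))   # longitude per column
    half_h = np.tan(np.radians(v_fov_deg / 2))                            # half screen height / radius
    height = np.linspace(half_h, -half_h, out_h)                          # normalized height per row
    pitch = np.arctan(height)                                             # latitude per row
    yaw_grid, pitch_grid = np.meshgrid(yaw, pitch)                        # (out_h, out_w) grids
    u = (yaw_grid / (2 * np.pi) + 0.5) * (pano_w - 1)                     # panorama x coordinate
    v = (0.5 - pitch_grid / np.pi) * (pano_h - 1)                         # panorama y coordinate
    return u, v
```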