SMPTE Newswatch - August 2018

Hot Button Discussion

A Foundation for Immersive Entertainment
By Michael Goldman  

The ongoing expansion of “immersive” or “inclusive” entertainment of various types marks one of the more unusual game-changing technology trends the entertainment industry has seen in recent years, because the trend is running on multiple tracks, all with enormous potential. New tools for creating virtual reality (VR), extended reality (XR), augmented reality (AR), and hyper-reality experiences, along with new forms of immersive theater, matter on two fronts: the challenge of transforming that content into meaningful consumer entertainment experiences, and the significance of the tools themselves for making content in the first place. Both tracks appear to be growing in importance for studios seeking to create new kinds of entertainment and new ways to produce traditional forms of it. As a consequence, although it is still early, and much is still unknown about how such content and tools will change the media landscape, major studios are ratcheting up infrastructures designed to make sense of it all in a healthy, experimental way.

“Virtual reality is both a consumer product and a creative tool, because what VR fundamentally does is put you inside the world whose story you are creating or experiencing,” explains Alice Taylor, Director of Disney’s StudioLab, a new technology center on the Disney lot designed to allow all Disney creative units to experiment with new and immersive storytelling technologies. “So [for content creators], VR can be, in some cases, a parallel to being on a set. You can stand in the middle of the environment, look around, see your set, and make changes. In terms of consumer VR products, the in-home market is still new, and consumer hardware continues to change year by year. Then, we have hyper-reality and things like the VR arcades that have been popping up in the past year. So it is still pioneer time. Whether it is a full-body or big-room experience, or whether you are in a chair or on a more mobile device, these experiences are still going to be emerging in coming years. This is a fascinating space to watch, both for consumer entertainment and as a creative medium for storytelling.”
 
StudioLab is one of a handful of recent studio initiatives in a similar vein, put together to explore such technologies and their capabilities. In the past several months alone, besides Disney, Fox and Sony have announced such projects, including the Fox Innovation Lab’s FoxNext VR Studio and Sony’s Innovation Studios.

Much of the content emerging from such entities so far has focused on transmedia programming: plugging a narrative concept into multiple forms of media, currently being designed across the industry to extend existing feature film franchises onto new platforms. Examples include Pixar’s Coco VR experience, the first official entertainment product in which Disney’s StudioLab has participated, partnering with Pixar to offer a VR experience built around Pixar’s award-winning animated film, Coco; and Fox’s Alien: Descent, a VR arcade-style experience based on the most recent film in the Alien series.
 
Disney also recently debuted a so-called hyper-reality, full-body immersive experience for fans under a partnership with The Void, which offers experiences built both on existing film franchises and on properties that exist only in the new medium. The Void is one of several standalone entertainment initiatives supported by the Disney Accelerator program, which allows selected companies to develop ideas and new kinds of entertainment products in partnership with Disney.

“StudioLab is a place on the Disney lot, and it is also a program inside the company,” Taylor explains. “The idea is to have a lab that can experiment with new forms of cinematic storytelling via new technology. We are interested in all types of cinematic entertainment within the studio. In Disney’s case, that includes theatrical content, and also Broadway, music, and more. With that in mind, we are looking at immersive experiences to support existing movies, and there is an appetite across the studio for further experimentation.
 
“A lot of this brings up new ways of [looking at entertainment]. If you have a concept that starts with an idea that wasn’t specific to any platform, then, given the advances in technology, especially various forms of 3D technology or real-time virtual technology, what happens to that story universe? How does it manifest itself? So we have lots of experiments going on and are talking about that kind of thing all the time. Generally, that’s because we are at a point in the industry where what we see on screens is increasingly built in 3D using game engines.”
 
And that, of course, is why the rapid development and evolution of the tools used to create these kinds of immersive entertainment experiences is so important. In recent years, VR technology of some type has been used as a significant production tool for filmmakers making animated and 3D films, such as the Avatar films, The Jungle Book, and currently, 2019’s reimagining of The Lion King, among other projects. Filmmakers are putting themselves “inside” the animated worlds they are imagining and making sophisticated creative choices there, ranging from lighting to lens, composition, and production design decisions.
 
Indeed, the trend of including real-time visualization in production is advancing so rapidly that “probably some aspect of virtual production will be used in almost every film production within the next decade,” predicts industry veteran Girish Balakrishnan, virtual production supervisor for the visual effects facility MPC Film, who has worked in recent years to help major filmmakers configure and use VR tools on various studio features. Based at MPC in Los Angeles, Balakrishnan is currently coordinating the rollout of MPC’s proprietary virtual production workflow platform, called Genesis, which was introduced at SIGGRAPH 2018.
 
Balakrishnan suggests this phenomenon has been made possible by “a crossroads between the real-time benefits of game engines and the power of cinema,” a trend that is accelerating rapidly. 
 
“We look forward to the next decade-plus, when we feel all filmmaking is going to be real-time. What has started to happen over the past year or two is a new trend where people expect more things to be real-time and interactive,” he says. “What that means is that filmmakers want to be able to see what their picture is going to look like sooner than ever, at an earlier stage. In the past, visual effects became more complex for the creatives conceiving the effects, for the artists working on them, and for all the filmmakers. So the divide between creative and visual effects started growing.

“Now, with the buzzwords ‘virtual production’ and ‘immersive media,’ there has been a resurgence, a drive within the industry to bring the creative back into the fold [throughout the entire process]. Whether it is a virtual camera, game engines for real-time feedback, or film hardware in a digital environment, the goal is the same: to give creative control back to filmmakers by giving them a creative sandbox to work in, rather than [having to wait for shots to be visualized later].
 
“With virtual production for tech scouting and world building, we realized this is a great application for immersive mediums like virtual reality. We can digitally scan a physical location, bring filmmakers into an air-conditioned room, put immersive headsets on them, and let them ‘walk’ around that location or space, making all sorts of creative decisions. This is possible because of the [speed and efficiency of new real-time game engines], which make it a digital, real-time world. Filmmakers don’t even have to be physically located in the same place. Someone in Los Angeles and someone in London can put on headsets together and really ‘feel’ like they are next to each other on a set that has yet to be built. So that’s an example of creatives using the technology to enable the creative sensibilities they have developed throughout their careers, rather than letting the technology dictate how they should be shooting their film.”

That’s why MPC developed Genesis, he adds, an initiative he calls essentially “a series of workflows and technologies brought together as a core foundation for building interactive methodologies for creatives.” MPC brought in filmmakers, engineers, and game-industry experts to develop the platform “to help us understand working in real-time in filmmaking,” he elaborates. 
 
Taylor also emphasizes that “real-time everything is the trend” in content creation across all mediums. “Real-time graphics, real-time production, real-time rendering,” she notes. “More cloud networking, changing workflows and processes so that, maybe in the future, we get to a point where you have a 3D model and [create it once] and use it on your TV program, your VR experience, your AR, your movie, whatever else. That is coming, for sure.”

Other industry breakthroughs, such as the use of Artificial Intelligence (AI) algorithms, are combining with real-time game engines and VR interfaces to make all this take off. AI can help automate or make certain processes more efficient, but the trick there, Balakrishnan suggests, is making sure AI is being used to fuel, rather than stifle, creativity.
 
“AI can be a touchy subject because we don’t want to allow AI to take creative decisions away [by automating them]; rather, we want to use it to enable a number of creative options,” he says. “There is a lot of [fascinating] work in the AI space, but we want to dispel the notion that having AI will remove the need for actual actors or cinematographers. Rather, it’s about opening options for them, and bringing those options to them more easily.”
 
So many new and exciting technical developments are driving this technology that one looming concern is how and when to standardize it all as immersive content creation and distribution become more commonplace. Balakrishnan suggests that, in a sense, it is simply too early to know where or how standardization for this sector of the industry could work. One thing he does know, however, is that it will be driven by vendors creating new systems like MPC’s Genesis as they calculate the best way to make their approaches interoperable with other approaches across the industry.

“Major vendors want to be industry leaders and show this is what you need to build a real-time system,” he says. “As other vendors adopt these workflows, the translation [of data] between vendors will eventually become easier. But right now, it is painful to work in this space across multiple companies because standards don’t exist. Do we need a standard for how to build a [virtual] set, how to write a character, how to design a character, how to shade or light an object so that it feels realistic, how to calculate the proper depth of field, and many other things? In time, yes. We need to make sure the translation from the real-time visualization of virtual production to post-production is as seamless as possible. We are continually making strides toward that goal, but it’s early. The entire industry will benefit as we make more of these advances.”

That said, he adds that MPC has incorporated important new formats into the foundation of Genesis, and he expects more to come. Among them is the Universal Scene Description (USD) format, developed by Pixar to make the translation or interchange of assets between multiple mediums more seamless. Another is MaterialX, an open standard originally developed by Lucasfilm and designed to aid the transfer of shading, lighting, and other data between different kinds of systems, applications, and renderers.
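As a concrete illustration of the kind of interchange USD is meant to enable, here is a minimal sketch using the Python bindings (the pxr module) that ship with Pixar’s open-source USD distribution; the file name, prim paths, and values are purely illustrative.

```python
from pxr import Usd, UsdGeom, Gf

# Author a simple "set" asset once; any USD-aware tool (a game engine,
# a renderer, a VR scouting build) can then open the same file directly.
stage = Usd.Stage.CreateNew("virtual_set.usda")
UsdGeom.Xform.Define(stage, "/Set")

# A placeholder prop: a sphere two units in radius, offset along X.
prop = UsdGeom.Sphere.Define(stage, "/Set/Prop")
prop.GetRadiusAttr().Set(2.0)
UsdGeom.XformCommonAPI(prop.GetPrim()).SetTranslate(Gf.Vec3d(5.0, 0.0, 0.0))

stage.GetRootLayer().Save()

# A downstream application reopens the same asset with no conversion step.
reopened = Usd.Stage.Open("virtual_set.usda")
print(reopened.GetPrimAtPath("/Set/Prop").GetTypeName())  # "Sphere"
```

Because the shared artifact is the scene description itself rather than a renderer-specific export, the “create it once” model Taylor describes becomes a file-format question rather than a chain of one-off conversions.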
 
Taylor reminds us that this “forward march of technological progress” is still in its early stages, building conceptually, at least, on the explorations and innovations of the Machinima phenomenon of the early 2000s, when fans took the lead in creating simple cinematic content with early game engines and sharing it on the Internet. That “forward march” is already producing remarkable, ongoing improvements in the stability and fidelity of image and sound. The next development, she expects, will be for these breakthroughs to become ubiquitous across different platforms.

“If we fast-forward ten years, I don’t think it is unreasonable to expect to see a lot of this kind of media, originated in a real-time engine, playing out on screens large and small, because the fidelity is growing exponentially better each year,” she relates.
 
“When Machinima started, the graphics were very basic; it was just a fun little thing,” she adds. “The phenomenon became popular on YouTube, and if you Google around, you will see lots of people making web and TV content with game engines now. Obviously, with real-time game engines like Unity and Unreal Engine both pushing their real-time filmmaking capabilities, it is likely that we are going to see more real-time content. Both companies have already stated their intent to become content-generation engines, and not just ‘game’ engines, going forward.”

 

News Briefs
ETC Celebrates a Quarter Century

This summer marked the quarter-century anniversary of the Entertainment Technology Center at the University of Southern California (ETC at USC), a celebrated think tank and research center where Hollywood technology and creative thought leaders come together. In late June, the ETC themed its annual Studio Technology Leader’s Dinner around the 25th anniversary, with a wide range of senior studio executives, creative service providers, and technology companies joining well-known filmmakers in honoring Elizabeth M. Daley, Dean of the USC School of Cinematic Arts (SCA), with the ETC’s Bob Lambert Technology Leadership Award. The event included panel discussions among technology experts from most of Hollywood’s major studios and across the industry. In accepting the award, Daley emphasized that technology and creativity can succeed in the entertainment world only by melding seamlessly together. “This is a team sport: it’s the most exciting time I can imagine in this industry, and we need to keep it going,” she said. Among the ETC’s key accomplishments over the past 25 years was its participation in the development of the Interoperable Master Format (IMF) specification, which was eventually standardized by SMPTE.

Open Framework for Programming Quantum Computers
Among the many hard-to-understand complexities of quantum computing is the question of how to program such computers. Since most programmers do not have a background in quantum physics, programming quantum circuits in terms of so-called “qubits,” rather than the standard digital bits (ones and zeroes) of traditional computing, can be intimidating, to say the least. A recent article in the MIT Technology Review, however, reports that Google has approached this problem with its recent release of a new programming software toolkit called Cirq. According to the article, qubits can essentially reside in between one and zero, thanks to the phenomenon known as “superposition.” Qubits can influence one another even when not connected in any physical sense, and they stay in their quantum state “for no longer than the blink of an eye.” Writing such software is therefore a highly specialized skill. Google’s solution, the article reports, is to make Cirq an open-source initiative that allows quantum algorithms to run on simulators for now, letting developers modify software using existing programming techniques and, eventually, do so on a wide range of systems. Google, the report adds, also recently released OpenFermion-Cirq, a toolkit for creating algorithms that “simulate molecules and properties of materials.” Google expects such advances to have wide-ranging applications across dozens of industries.
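To give a flavor of what programming with Cirq looks like, here is a minimal sketch based on the library’s public Python API; the circuit shown (a textbook Bell-state example) is illustrative and not drawn from the article.

```python
import cirq

# Two qubits arranged on a line, mirroring Google's grid-based hardware.
q0, q1 = cirq.LineQubit.range(2)

# A Hadamard gate puts q0 into superposition ("in between" one and zero);
# a CNOT then entangles q0 with q1, so the two influence one another.
circuit = cirq.Circuit(
    cirq.H(q0),
    cirq.CNOT(q0, q1),
    cirq.measure(q0, q1, key="m"),
)

# Run on the built-in simulator; no quantum hardware is required.
result = cirq.Simulator().run(circuit, repetitions=100)

# Expect roughly even counts of 00 and 11, the signature of entanglement.
print(result.histogram(key="m"))
```

As the article notes, running on a simulator lets developers use familiar tools and workflows today, with the same program intended to target real quantum processors as they become available.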