
October 2019 - Pursuing The Ideal Media Cloud

October 31, 2019

When Tomasz Witkowski, a senior workflow engineer for UK-based Sundog Media Toolkit, Ltd., a company that specializes in media workflows built on cloud platforms, and his Sundog colleague Richard Welsh, the company’s co-founder, set to work on a SMPTE Journal article about “Media Cloud Fundamentals,” they realized that cloud computing was no longer a new concept for business professionals generally. But, Witkowski says, the idea of moving media production and distribution processes “from the ground into the cloud” remains a confusing one across the media world.

“The cloud is out there, and everybody is using it in some way,” Witkowski says. “But we said, what about media in the cloud? It’s different—in media, people still do not use the cloud for all processes. We realized that while people knew about the cloud, they didn’t know many basic things about the cloud and how it works. The biggest point, we discovered, was the human factor—that many people avoid using the cloud because they don’t want to change [their infrastructure], they don’t understand the costs, and they don’t understand how to connect the different content creation processes together in the cloud. Media people are used to sitting together, working [under one roof], whereas the cloud allows you to collaborate from anywhere, anytime. So we concluded there will be changes in media processes in the future to work with the cloud.”

According to Witkowski, the goal of having “every single media process happen in the cloud” is at the heart of those changes. He describes achieving this as “the ideal media cloud.” He and Welsh pointed out in their article, “Media Cloud Fundamentals” (published in the October 2019 issue of the SMPTE Motion Imaging Journal), that “tools and technology exist now that allow the entire media lifecycle to tap into the cloud as a central resource. In principle, it is possible to do everything in the traditional media lifecycle in the cloud today.”

And when they say “everything,” they mean everything: pre-production, image capture and production, live broadcast, post-production, mastering, versioning, delivery, and more. All of this is possible today, at least in theory, Witkowski suggests, because the modern cloud as we know it “is defined by storage. If you have access to the storage, you can do the processes [remotely]. You simply need to get permissible access with the respective security, and then you can complete all required processes in one place.”
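To make that storage-centric idea concrete, here is a minimal sketch, assuming an AWS S3 bucket reached through the boto3 Python SDK, of granting a collaborator time-limited access to one stored file. The bucket name and object key are hypothetical placeholders, not anything from Sundog’s workflow.

```python
# Minimal sketch, assuming AWS S3 and boto3: storage plus scoped access is the
# starting point for doing the work remotely. Bucket and key are placeholders.
import boto3

s3 = boto3.client("s3")

# Generate a presigned URL so a remote collaborator can read the file for one
# hour without being handed long-lived credentials.
url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={
        "Bucket": "example-media-archive",
        "Key": "show-01/masters/ep101_prores.mov",
    },
    ExpiresIn=3600,  # seconds
)
print(url)
```

The point is the one Witkowski makes: once the asset sits in shared storage, “access with the respective security” becomes a configuration question rather than a shipping question.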

However, he elaborates, “there are challenges” involved in how soon, how seamlessly, and how completely the industry will transition to the “ideal cloud,” even if most tools, resources, and techniques exist today to make it possible. Witkowski points to a recent white paper published by Motion Pictures Laboratories Inc. (MovieLabs) in advance of IBC 2019, called “The Evolution of Media Creation,” which offers an argument that most of this transition could well be completed within the next 10 years—by 2030. Included in that timeline is the notion that, as the white paper states, “all assets [will be] created or ingested straight into the cloud and [will] not need to be moved.”

Witkowski believes that timeline is reasonable, assuming the industry overcomes current challenges by merging how media workflows have traditionally worked with IT-based methodologies. Among the most fundamental of these challenges is the fact that historically, media workflows are what he calls “step-by-step processes,” and “the tools are not connected to each other. Different departments have different needs.”

Further, in terms of tools, Witkowski points out that “people [currently] don't have the same software and services in the cloud that they're used to having on the ground, in their facilities.”

Additionally, he adds, methods for approaching many tasks are “culturally” different when doing media work in the cloud, such as the need to identify and map assets, which is an entirely different way of managing things from what media professionals are typically accustomed to. As discussed in the August 2019 issue of Newswatch, impressive standardization work has enabled media companies to find great solutions for this challenge. Still, it remains a fundamentally “new” way for them to approach the process.

And then, in a sense, certain business fundamentals in the media world need to reverse their traditional path to adequately accommodate moving workflows into the cloud. Because the media cloud is built on a foundation of recurring storage costs, it upends the traditional balance between operational expenditure (OPEX) and capital expenditure (CAPEX), he suggests. The cloud, Witkowski reminds us, “is an operating expense in the sense that you don’t own it—you are paying for file storage. So now, in your operating budget, OPEX goes up.”
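A back-of-the-envelope example shows why that OPEX line keeps growing. The storage volume and the per-gigabyte rate below are assumed, illustrative figures, not quotes from any provider’s price list.

```python
# Back-of-the-envelope OPEX sketch. Both numbers are assumed, illustrative
# figures, not any provider's actual pricing.
assets_tb = 120              # total assets parked in object storage, in TB
rate_per_gb_month = 0.023    # assumed standard-tier object-storage rate, $/GB-month

monthly = assets_tb * 1024 * rate_per_gb_month
print(f"~${monthly:,.0f} per month, ~${monthly * 12:,.0f} per year")
# Unlike a purchased SAN (a one-time CAPEX item that depreciates), this charge
# recurs every month for as long as the data stays in the cloud.
```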

Then there is the fact that media content is not typically native to a cloud environment, and increasingly high-resolution, complex (4K, 8K) data is working its way into the mix. Indeed, as Witkowski and Welsh point out in their SMPTE Journal article, even “a single piece of media in all its forms from RAW, uncompressed intermediates, masters, mezzanines, proxies, deliverables, and localized versions represent a huge amount of complexity both in terms of understanding the technical links between these forms and also the contextual links.”

They add that “there are further challenges with the underlying architecture of cloud storage. For instance, the sequential nature of file names for frame ranges of uncompressed material can in itself trip up cloud storage.”

For example, the way storage arrays work means that reams of similarly named uncompressed frames can “cluster” in the storage array, as Witkowski puts it, slowing down a system because a small portion of the array is “suddenly saturated with requests” related to content from the same scene or similar topics.
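One common mitigation for that clustering, sketched below under the assumption of an S3-style object store, is to stop writing frames under one long sequential prefix and instead prepend a short hash so requests spread across partitions. The scene and file names are hypothetical.

```python
# Sketch of a common mitigation for the hot-partition problem described above:
# sequential frame names (frame_000001.dpx, frame_000002.dpx, ...) share one
# key prefix, so reads can pile onto a single partition of the object store.
# A short hash prefix spreads them out. Paths are hypothetical.
import hashlib

def spread_key(scene: str, frame: int) -> str:
    name = f"{scene}/frame_{frame:06d}.dpx"
    prefix = hashlib.md5(name.encode()).hexdigest()[:4]  # 4 hex chars = 65,536 buckets
    return f"{prefix}/{name}"

for f in range(1, 4):
    # Each consecutive frame lands under a different, pseudo-random prefix.
    print(spread_key("reel1/scene12", f))
```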

Witkowski says the industry is innovating a variety of solutions for these challenges, such as the use of asset-management software layers, but such obstacles continue to vex those trying to make the transition. At the same time, as noted, much of the software typically used for media processes on the ground has not yet been woven into web browsers, he adds.

Therefore, he explains, the industry frequently relies on so-called “forklift,” or “lift-and-shift,” upgrades, as opposed to native cloud software applications. This means that media companies make major changes to their existing IT infrastructures, essentially routing newer software from local hardware onto virtual machines in the cloud, which they then use to migrate their data, rather than modifying the data to work in the cloud in the first place. Witkowski calls this a process of “spinning the machine to be able to use newer software that is not part of the Web browser.”
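As an illustration of that lift-and-shift pattern, the sketch below launches a cloud virtual machine that could host the same desktop tooling a facility already licenses. It assumes AWS EC2 via boto3, and the machine image ID is a hypothetical placeholder.

```python
# Lift-and-shift sketch, assuming AWS EC2 and boto3: run the familiar desktop
# software on a cloud VM next to the data, rather than a browser-native tool.
# The image ID stands in for a pre-built image with the licensed application.
import boto3

ec2 = boto3.client("ec2")
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical pre-built workstation image
    InstanceType="g4dn.xlarge",       # GPU-backed type often used for media tooling
    MinCount=1,
    MaxCount=1,
)
print("launched instance:", resp["Instances"][0]["InstanceId"])
```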

But it can literally take years for media companies to fully alter their IT infrastructures to make this an efficient alternative for the kind of work they do. “The software is evolving [to create more native cloud applications], but the industry is working hard to encourage people to create more automated processes,” Witkowski says. “It is much easier [for people] to use the files if an automated process has prepared the files for them. People don’t want to use a machine on the ground—they want to use business systems to drive the process, working through an API.”

Thus, one of the goals is to eventually get software companies to “improve their APIs to connect between different ones in an interoperable fashion, creating a much better system that has all the different software tools all in one place,” he elaborates.
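The sketch below illustrates what “driving the process through an API” can look like from a business system’s side: submitting one transcode job over HTTP. The endpoint, payload fields, and token are entirely hypothetical and do not describe any real vendor’s API.

```python
# Hypothetical sketch of an API-driven workflow step: a business system submits
# a transcode job to a cloud service over HTTP instead of someone queuing it on
# a local machine. Endpoint, fields, and token are invented placeholders.
import requests

job = {
    "source": "s3://example-media-archive/show-01/masters/ep101_prores.mov",
    "preset": "h264_1080p_streaming",
    "notify": "https://example-mam.internal/webhooks/transcode-complete",
}

resp = requests.post(
    "https://api.example-transcoder.example/v1/jobs",
    json=job,
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
resp.raise_for_status()
print("job accepted, id:", resp.json().get("id"))
```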

And so, with the rise of the media cloud inevitably comes automation as a key factor. This is hardly a trend unique to the media industry, but Witkowski says it is particularly crucial in the media world. That’s because a sophisticated, automated cloud-based workflow basically eliminates the need to prioritize jobs based on the limitations of your infrastructure—companies can use virtually unlimited processing power through a cloud service simply by requesting and paying for more resources.

“With no more reliance on queuing of individual jobs and prioritization, automation can deliver huge scale and exponentially reduce turnaround times on traditionally serviced processes,” he and Welsh wrote in their article.
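To illustrate that scale point, the fragment below fans a set of deliverable versions out in parallel rather than queuing them one at a time on a fixed farm; submit_transcode() is a stand-in for the kind of hypothetical API call sketched above.

```python
# Illustrative fan-out: every deliverable version is submitted at once and the
# cloud service absorbs the load, instead of jobs waiting in a local queue.
# submit_transcode() is a placeholder for the hypothetical API call above.
from concurrent.futures import ThreadPoolExecutor

versions = ["uhd_hdr", "hd_sdr", "proxy_h264", "es_dub", "fr_dub"]

def submit_transcode(preset: str) -> str:
    # In a real workflow this would POST one job per version (see earlier sketch).
    return f"submitted {preset}"

with ThreadPoolExecutor(max_workers=len(versions)) as pool:
    for result in pool.map(submit_transcode, versions):
        print(result)
```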

It is particularly encouraging, according to Witkowski, that new developments rocking the IT and communications industries have the potential to make media cloud adoption more feasible.

For example, he points to the looming advent of high-speed 5G mobile networks, about 100 times faster than 4G networks, as a development particularly well-suited to the real-time movement of camera data from remote field locations to the cloud during production, without the need for physical connectivity. Until that becomes ubiquitous, he points out, high-capacity physical disk data-transport solutions like Amazon’s Snowball (a petabyte-scale storage technology for transferring large amounts of data in and out of Amazon’s AWS cloud service) remain highly useful for many different kinds of projects.
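Some rough arithmetic explains the trade-off; the payload size and link speeds below are assumed, illustrative figures.

```python
# Rough arithmetic behind the network-versus-appliance trade-off. The payload
# size and link speeds are assumed, illustrative figures (TB = 10**12 bytes).
payload_tb = 80  # assumed camera originals from a location shoot

for label, mbps in [("100 Mbps uplink", 100),
                    ("1 Gbps uplink", 1_000),
                    ("assumed 5G uplink", 2_000)]:
    hours = payload_tb * 8 * 1_000_000 / mbps / 3600
    print(f"{label:>17}: ~{hours:,.0f} hours ({hours / 24:,.1f} days)")

# At the low end the transfer runs into weeks, which is why shipping a physical
# appliance such as Snowball still makes sense for bulk moves today.
```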

Meanwhile, we continue to dwell in a sort of hybrid world in which many media processes are done in the cloud, but not all of them, at least not on a consistent basis across the industry. For now, production workflows in particular still typically rely on on-set hardware and offline delivery. However, Witkowski and many others feel it is simply a matter of time until even that part of the equation starts to change. He points to streaming giant Netflix as an example of how a cloud commitment, once started on a large-scale basis, can march forward rapidly.

“The quick progress of Netflix and [other major streaming services] is due to the cloud,” he says. “Without the cloud, they would not be progressing to where they are today. Yes, it is true they still have files on the ground and use facilities [for production of original content], but then they send the masters as soon as possible to the cloud, and they are ready for distribution. And then, for future-proofing, they are saying they want all assets used on the ground to be sent as files to be processed to the cloud, as well. Right now, they still use facilities on the ground, but masters and other assets are all kept in the cloud. In fact, from the masters, they produce down-streams as quickly as they can, and they might distribute those to different facilities for a QC or a check, but then it goes right back to the cloud. For them, the cloud is already the center for all processing. So they have already moved [most processes] to the cloud, and are [positioned to eventually] move others there, as well.”

In any case, Witkowski suggests the only real limitations on how soon and how completely the media cloud concept becomes fully and efficiently realized have nothing to do with the serviceability of the cloud itself. Instead, the limitations revolve around bandwidth and compression improvements across the industry, particularly now that 4K and 8K data have entered the equation. Once those issues are more fully resolved, it becomes a simple question of whether it makes business and creative sense for media companies to shift virtually all their workflow processes to the cloud, and when and how to do it.

“Most of it is already in place,” he says. “To a large degree, it is only about a [change] in your way of thinking. If you want to stream RAW uncompressed files, then you need at least 500 or 1,000 Mbps to stream. So the limitation is not on the cloud—it’s on your receiving network. But the machines now on the cloud are so powerful that you can stream. I can stream to many different machines, and easily create more machines—a batch of machines—and they would all stream the same. And the same at home: if you want to stream 8K, you need the proper bandwidth. In an ideal world, you might need at least a one-gig connection, at least for streaming [high-resolution content] to smart TVs, but for now, [advances in compression] allow for high-end streaming. That might take up data at home, but in general, there should be no limits. And now there is such a demand that people are putting more effort into solving these things.”
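The bandwidth figures Witkowski cites follow from simple arithmetic; the sketch below computes uncompressed bitrates for a few assumed frame geometries and rates (real camera RAW formats vary, so these numbers are illustrative only).

```python
# Worked figures behind the bandwidth point: raw bitrate of uncompressed video,
# assuming 10-bit 4:2:2 sampling (about 20 bits per pixel) at 25 fps. Real RAW
# camera formats differ; these numbers are illustrative only.
formats = [
    ("HD  1920x1080", 1920 * 1080),
    ("UHD 3840x2160", 3840 * 2160),
    ("8K  7680x4320", 7680 * 4320),
]
BITS_PER_PIXEL = 20  # 10-bit luma plus 10 bits of shared chroma per pixel (4:2:2)
FPS = 25

for label, pixels in formats:
    mbps = pixels * BITS_PER_PIXEL * FPS / 1_000_000
    print(f"{label}: ~{mbps:,.0f} Mbps uncompressed")
# Even HD lands around 1 Gbps uncompressed, so the receiving network, not the
# cloud side, is usually the bottleneck Witkowski describes.
```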

News Briefs

Live Streaming from Space

In late October, NASA and Amazon Web Services (AWS) conducted an important workflow test during the annual SMPTE 2019 Technical Conference, designed to prove the viability of a cloud-based streaming alternative to long-haul satellite distribution of video originating from the International Space Station. According to SMPTE’s announcement, the demonstration was part of a special SMPTE program honoring the 50th anniversary of the Apollo 11 moon landing. It involved a live interview from the Space Station, some 250 miles above Earth, with astronauts Christina Koch, Jessica Meir, and Andrew Morgan. The project was part of an initiative by NASA to find new ways to reduce the costs and technical complexity of beaming live video from space by shifting 90 percent of the video transmission workload from satellite long-haul transport to the cloud. According to TV Technology’s coverage of the story, the SMPTE test used AWS Elemental Media Services to transport content, produce the live interview, and deliver it through Amazon’s cloud services, relying on AWS Elemental MediaLive for live transcoding, AWS Elemental MediaPackage for content origin and packaging, and Amazon CloudFront for content delivery. The demonstration, which was broadcast live on NASA TV, remains available to stream online.
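For readers curious how those pieces fit together in code, here is a minimal, read-only sketch (not the actual NASA/SMPTE configuration) that simply lists what exists in each of the three named services, assuming boto3 and configured AWS credentials.

```python
# Hedged sketch only, not the NASA/SMPTE setup. The signal path described above
# is MediaLive (live transcoding) -> MediaPackage (origin and packaging) ->
# CloudFront (delivery); this just lists resources in each service.
import boto3

medialive = boto3.client("medialive")
print("MediaLive channels:",
      [c.get("Name") for c in medialive.list_channels()["Channels"]])

mediapackage = boto3.client("mediapackage")
print("MediaPackage channels:",
      [c["Id"] for c in mediapackage.list_channels()["Channels"]])

cloudfront = boto3.client("cloudfront")
print("CloudFront distributions:",
      cloudfront.list_distributions()["DistributionList"]["Quantity"])
```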

Red’s Jim Jannard Retires

In a recent post on the H4Vuser.net site, famed digital cinema camera pioneer Jim Jannard abruptly announced that he is retiring and shutting down his long-percolating Hydrogen cell phone and holographic camera capture device project as a result. According to Studio Daily’s coverage, Jannard said simply that he was dealing with unspecified health issues and had decided to retire. He added that Red Digital Cinema, the digital cinema camera company he founded in 2005, will continue moving forward under the leadership of President Jarred Land, Executive VP Tommy Rios, and President of Marketing Jamin Jannard. The promise and struggles of the intriguing Hydrogen project, which launched three years ago with the Hydrogen One phone (a device that will continue to be supported for the time being), were the subject of a recent blog post on the Pro Video Coalition site.


Michael Goldman
