
August 2019 - Reinventing Media Workflows

August 31, 2019

As media companies battle to keep up with the industry’s ongoing transition to IT-based workflows, the questions of what, exactly, a workflow is on this new landscape, and how best to implement the most efficient ones, sit at the root of that transition. That’s the view of workflow expert Bruce Devlin, SMPTE’s Standards Vice President and, through his company Mr. MXF, a provider of a wide range of services to companies seeking to improve workflows and file-based operations generally.

After all, Devlin notes, the concept of a media workflow is nothing new. At its core, he says, “a workflow is simply chaining together a bunch of processes to find a solution to an immediate business need in the most efficient way. The problem is, today, those needs can change minute by minute, second by second.”

In an editorial he wrote recently for KitPlus Magazine, Devlin discussed this issue. He says that while the word “workflow” may only have become ubiquitous in the media industry around the late 1990s, “we always had some kind of workflow. Only, back then, it was Bob and Kevin and Joanne sitting next to each other, doing the job in their tried and tested way. The rules of interaction were not written, but this was, in fact, a workflow. Those workflows back in the day were dynamic and changed depending on the people you hired because each person had their own style. Therefore, the repeatability of workflows wasn’t that great, because of the variation in the humans involved in the process.”

What has fundamentally changed, Devlin emphasizes, is the fact that automation has been inserted aggressively into the process. The goal of this paradigm shift is to meet business needs in the most efficient way possible, “with humans doing the good human jobs and machines doing what they do best—all the drudgery,” he says.

However, things haven’t totally worked out that way—sometimes, things get more, not less, complicated in the age of automation. Devlin points to the rather straightforward concept of “ingest” to illustrate this point.

“Ingest means different things to different people, and it can be [more complex] in the automated world,” he says. “Typically, ingest involves bringing material into a facility and prepping it so you can work on it. But today, that can be difficult. The first question is, have I got the right stuff? Once upon a time, Bob and Kevin and Joanne, sitting in the ingest department, would probably read a sheet about what had been bought, and they would put a tape in the machine and hit play and check it out. If they were purchasing, say, an old episode of ‘Dallas,’ they would make sure it was the right series, the right season, and the right episode, and they would put a tick on the purchase order—the first business step in the workflow.

“But as soon as we became automated, we don’t have a tape anymore. We have a file. How do you know it’s the right file? Well, you hit play on the player, right? But first you must get the file into the player. But you can’t do that until you make sure it isn’t riddled with viruses. So you have to run an anti-virus. Before you can do that, you need to make sure it came from the correct location. And so, before we even begin our automated workflow, we are now looking at a supply chain problem to make sure that the file comes from the right place, isn’t riddled with viruses, and is in a format we can play in our facility.

“So just the act of checking the episode of ‘Dallas’ is now significantly more complicated than it used to be, even though, in theory, files have made our lives easier.”
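To make the pre-flight problem concrete, the checks Devlin describes (did the file come from the right place, is it free of viruses, is it in a format the facility can play) can be sketched as a small gate that runs before any automated ingest begins. The Python below is a minimal illustration only; IngestCandidate, TRUSTED_SOURCES, and the checker functions are hypothetical placeholders, not any real product’s API.

```python
from dataclasses import dataclass

# Hypothetical list of hosts the facility accepts transfers from.
TRUSTED_SOURCES = {"distributor-a.example.com", "distributor-b.example.com"}

# Hypothetical list of container formats the facility can actually play.
PLAYABLE_FORMATS = {"mxf", "mov", "mp4"}

@dataclass
class IngestCandidate:
    file_path: str
    source_host: str        # where the transfer came from
    container_format: str   # e.g. "mxf"

def came_from_trusted_source(candidate: IngestCandidate) -> bool:
    return candidate.source_host in TRUSTED_SOURCES

def passes_virus_scan(candidate: IngestCandidate) -> bool:
    # Placeholder: a real workflow would call the facility's anti-virus scanner here.
    return True

def is_playable_format(candidate: IngestCandidate) -> bool:
    return candidate.container_format.lower() in PLAYABLE_FORMATS

def preflight(candidate: IngestCandidate) -> list:
    """Return the list of failed checks; an empty list means the file may enter the workflow."""
    failures = []
    if not came_from_trusted_source(candidate):
        failures.append("untrusted source")
    if not passes_virus_scan(candidate):
        failures.append("failed virus scan")
    if not is_playable_format(candidate):
        failures.append("unplayable format")
    return failures
```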

And now, Devlin adds, we have a new entrant in the drama—the addition of Cloud-based workflows. Those workflows, of course, are all about the pursuit of working on media files remotely, often neither knowing nor caring where the actual file is located.

“Going back to the example of ingest, in the old days, you were ingesting quite close to where the media was,” he says. “But when you ingest in the Cloud, you have no idea where the content is located. You might ingest into one of the well-known Cloud storage systems like Amazon S3, Azure Blob, or SoftLayer data storage, among others.

“So one of the really important things about Cloud workflows is, how do you identify your assets so that you can find them again? We can’t use tape ID, and we can’t use the file path on a physical server. We could use things like Uniform Resource Locators [URLs], which tell us the route to the asset over a network, but in the Cloud, that doesn’t always help very much. If you’re a sensible company and you’re using the Cloud to its fullest extent, you’ll have duplicated all your stuff to reduce the risk of disasters. You almost certainly want to differentiate between ‘get me the intro asset’ and ‘get me the intro asset stored in Oregon in the AWS S3 bucket with these credentials.’ The first request is what the user wants to do, and the second request is what the software has to do to fulfill that request. Separating those concerns makes life easier for the user, improves automation of Cloud cost control, and is key to defining workflows that can adapt rapidly to business change.

“Given that a simple URL isn’t enough, what you really want is the identity of the asset you’ve placed in the Cloud. Storage systems themselves have good ID tracking, but the IDs they employ are local to each storage vendor—an AWS S3 ID cannot be used, for example, as an Azure Blob ID. So my workflow needs a unique ID that shows it’s the same asset on AWS, Azure, SoftLayer, and any others. If my workflow can ask a management system for the location of an asset in a specific Cloud, then the workflow can determine whether or not assets need to move. Given that a major Cloud cost is moving assets out of their storage locations, efficient workflows need to be smart about whether the content or the software is moved to perform a process.”
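One way to picture the separation of concerns Devlin describes is a small registry that maps a single vendor-neutral asset ID to its copies in each cloud: the user asks for “the intro asset,” and a management layer resolves that to a concrete location and reports whether a costly move would be needed. The sketch below is illustrative only; AssetRegistry and CloudLocation are assumed names, not drawn from any real management system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CloudLocation:
    provider: str   # e.g. "aws-s3" or "azure-blob"
    region: str     # e.g. "us-west-2" (Oregon)
    uri: str        # provider-specific path to the object

class AssetRegistry:
    """Maps one vendor-neutral asset ID to its copies in each cloud."""

    def __init__(self):
        self._locations = {}   # asset_id -> list of CloudLocation copies

    def register(self, asset_id, location):
        self._locations.setdefault(asset_id, []).append(location)

    def resolve(self, asset_id, preferred_provider):
        """Return a copy in the preferred cloud if one exists, else any copy;
        the caller can then decide whether a (costly) move is justified."""
        copies = self._locations[asset_id]
        for location in copies:
            if location.provider == preferred_provider:
                return location
        return copies[0]

registry = AssetRegistry()
registry.register("intro-asset",
                  CloudLocation("aws-s3", "us-west-2", "s3://example-bucket/intro.mxf"))

# "Get me the intro asset" -- the user never sees the bucket or the credentials.
location = registry.resolve("intro-asset", preferred_provider="aws-s3")
needs_move = location.provider != "aws-s3"   # drives the move-vs-stay-put decision
```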

If you do this right and identify your content correctly, Devlin points out, you can work more efficiently and spend less money while manufacturing different versions of your media assets. To wrap one’s head around how this process can work to maximum efficiency, Devlin compares it to the “supply chain” of any business, such as the way package delivery companies “move stuff around, [combining] their invoices and credit notes with specific packages.”

“There is almost no difference in what we are trying to do in a media supply chain other than the fact that there is nothing physical anymore,” he says. “So if you shoot something and push it directly into the Cloud, what comes out the back of the camera hits a network and goes straight into the Cloud. Smart workflows give it an ID at that point. So when you begin editing different shots together to create a program, basically that editing function is part of the supply chain, where the end result of this supply chain segment is your completed, edited program. That edited program then goes into a number of different supply chain segments where it is versioned and delivered to various distribution outlets. What you're doing is creating a number of different supply chains with dependencies between them. If you can figure out and audit the dependencies, you can automate, remove redundant steps, and make each chain more efficient.”
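Once the dependencies between supply chain segments are written down explicitly, ordering and auditing them becomes a routine graph problem. The sketch below, using Python’s standard library and purely illustrative segment names, shows one way a workflow engine might derive a valid execution order from those dependencies.

```python
from graphlib import TopologicalSorter  # standard library in Python 3.9+

# Each segment maps to the set of segments it depends on (illustrative names).
segments = {
    "edit-program":      {"ingest-camera-files"},
    "master-imf":        {"edit-program"},
    "broadcast-version": {"master-imf"},
    "streaming-version": {"master-imf"},
}

# One valid execution order; independent segments could also run in parallel.
order = list(TopologicalSorter(segments).static_order())
print(order)
# e.g. ['ingest-camera-files', 'edit-program', 'master-imf',
#       'broadcast-version', 'streaming-version']
```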

Devlin emphasizes that standardization plays a crucial role in this area. The right standards, applied correctly, can allow media companies to identify and reference content and “start to get the efficiencies and the scale you need, allowing you to spend money on the talent that makes the content better, rather than the logistics of executing manual processes and moving content from A to B without adding value,” he explains. “Our traditional, good, solid standards cover topics like the numerical value for representing white in a file or a stream. The standards we need in the workflow era cover topics such as how do you carry a company’s proprietary identifier in a standard file format? How do you know whose identifier takes precedence when searching for content? Can I carry registered identifiers like EIDR and Ad-ID along with my facility-wide identifiers in a standard container? Can I test the behavior of devices against the standard in a Plugfest? There is a massive need in the workflow world for standards covering topics that are kind of boring and stable and need to stay fixed.

“The trick,” Devlin continues, “is to define stable, boring workflows that can be applied to varied and exciting content.” And that’s why he emphasizes that the media world is highly unlikely to have permanently standardized workflows. “Because, after all, entertainment continues to change and businesses need to react. Some change will be slow and the slow-paced technology will be standardized, like the core definition of a file format, such as MXF. Other needs might be recommended practices for when you have to change behavior rather than values, such as the test criteria for carrying EIDR in broadcast TV. Other workflow technologies will require even shorter time scales, and these might end up as technical specifications, such as the constraints on picture resolution for launching a new delivery system. Finally, some needs are very reactive and are best met by agile software that works within a standardized framework. Consider the program guide that you use to navigate your evening viewing through a streaming service. It may be updated every week, but it depends on standards for everything from the language used to write the app and the format of the still images to the format of the media that is played and the format of the metadata that is transferred from the content owner through a complex supply chain to the app.”
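As a rough illustration of the identifier questions raised above, the sketch below carries a registered identifier such as an EIDR or Ad-ID alongside a company’s own facility-wide identifiers for the same asset, with a simple precedence rule for searching. The structure and field names are hypothetical assumptions, not taken from any SMPTE container standard.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AssetIdentifiers:
    eidr: Optional[str] = None              # registered, industry-wide identifier
    ad_id: Optional[str] = None             # registered identifier for advertising assets
    facility_ids: dict = field(default_factory=dict)  # proprietary, per-company IDs

    def search_key(self) -> Optional[str]:
        """Illustrative precedence: registered identifiers win over facility-wide ones."""
        if self.eidr:
            return self.eidr
        if self.ad_id:
            return self.ad_id
        return next(iter(self.facility_ids.values()), None)

ids = AssetIdentifiers(
    eidr="10.5240/XXXX-XXXX-XXXX-XXXX-XXXX-X",   # placeholder pattern, not a real EIDR
    facility_ids={"acme-mam": "ACME-000123"},     # hypothetical facility identifier
)
```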

By contrast, Devlin points to the development of the Interoperable Master Format (IMF) and the SMPTE ST 2110 suite of IP video standards as two examples of “good, solid, meaty, concrete standards.”

“With IMF, workflow patterns were identified in producing versions of titles where much of the time we would do the same things over and over again,” he says. “We were looking for ways to represent the repetitions and the differences in a standardized way, and that is how IMF came about. IMF gives us massive industry workflow improvements.”

Devlin elaborates that even with IMF, “we discovered there are certain aspects that may eventually change, or should not be part of the standard. Timecode, for instance, has traditionally been used for synchronization. But in IMF, using timecode to synchronize causes more problems than it solves. We’ve found ways to remove timecode from the standard, but to apply it when absolutely necessary for specific business cases. Part of the skill of the IMF community working on the standards is to recognize that some things are important to standardize and some things are best left to individual businesses to figure out on their own.”

Where to draw the dividing line between all these different techniques for achieving interoperability is an interesting philosophical debate. Devlin uses quality control (QC) in a media workflow as an example.

“There are several automated QC tools on the market that measure values in streams and files,” Devlin says. “In an automated workflow, you want to be able to compare results between different tools and from different processes. This means that each tool should consistently name the tests and provide results in common units such as ‘cd/m2’ for brightness, or in pixel offsets from the top-left corner of the image when locating defects. Additionally, the layout of the results should be in a common form so the outcomes can easily be displayed by software. Work in this area was begun by the EBU with its QC project. Effectively, this is a methodology and specification for testing standards.”
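A minimal sketch of what such a common results layout could look like follows. The field names are illustrative assumptions, not taken from the EBU QC specification or from any vendor’s tool.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QcResult:
    test_name: str                  # shared vocabulary across tools, e.g. "max_brightness"
    value: float
    unit: str                       # common units, e.g. "cd/m2"
    defect_x: Optional[int] = None  # pixel offset from the top-left corner of the image
    defect_y: Optional[int] = None
    tool: str = "unknown"           # which vendor's tool produced the measurement

# Results from two different vendors can now be compared field for field.
results = [
    QcResult("max_brightness", 1023.0, "cd/m2", tool="vendor-a"),
    QcResult("dead_pixel", 1.0, "count", defect_x=512, defect_y=9, tool="vendor-b"),
]
```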

“However, in a workflow, simple actions like starting and stopping a [QC] service are important,” Devlin adds. “So, should the API to invoke these services be standardized across vendors? With my SMPTE Standards VP hat on, I can tell you I have spoken to people who believe with religious fervor that they should be standardized, while others with equal fervor state that standardizing that kind of software API should never happen.” Devlin believes that limited API standardization could encourage an automated workflow market for a variety of software models; the idea of standardizing a dialect, protocol, or API to invoke QC software is useful, he says, and might even spawn a new sort of automated QC environment or an ST 2110 monitoring environment.
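As a purely hypothetical illustration of the limited API standardization Devlin describes, a vendor-neutral control surface for a QC service could be as small as start, stop, and status, with each vendor supplying its own implementation behind the same interface. Nothing below reflects an existing or proposed standard.

```python
from abc import ABC, abstractmethod

class QcService(ABC):
    """Hypothetical vendor-neutral control surface; every vendor implements the same calls."""

    @abstractmethod
    def start(self, asset_id: str, test_plan: str) -> str:
        """Begin a QC job and return a job ID."""

    @abstractmethod
    def stop(self, job_id: str) -> None:
        """Cancel a running QC job."""

    @abstractmethod
    def status(self, job_id: str) -> str:
        """Return 'running', 'passed', or 'failed'."""
```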

“SMPTE, as a due process open standards body, is obliged to accept proposals from people willing to come to us and do the work to make a standard. But we have very strict rules and tell them they have to standardize it properly. The point is, our industry is in a complicated state where standards for underlying technologies are vital to building agile apps. But do we standardize the actual apps or the protocols? That’s a new area of debate we are seeing across the industry.”

In other words, Devlin cautions, with all this innovation in a still embryonic landscape, the industry must be cautious not to rush into over-codifying how to structure workflows in an IT world, especially since “we are always going to be in a state of disruption from now on.”

By “disruption,” Devlin means that there are so many new developments, tools, processes, terms, and even new definitions for old terms flying around that many things simply need to be sorted out and properly understood by the wider community before they can ever hope to fit into a neat, industry standard way of doing things.

“Netflix started out as a distributor of physical media, but now, for them and others, distribution often means moving a database record from one database on the production side to a different database on the distribution side while the media stays put in the Cloud,” he says. “So ‘supply chain’ is no longer constrained to the movement of a physical asset, and workflows to deliver media from A to B will continue to change fundamentally as we move further into Cloud environments. I can fully see that, in about a decade from now, we won’t be shipping 100 Terabyte Zip bundles around. We’ll be moving access control records that can be sent by e-mail or even a secure messaging service, because they’re only a few kilobytes in size. But you will still need all kinds of authentication and security tokens to identify and access the material. We’ll have to reinvent every workflow in the industry at least twice before we get there.”
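The lightweight delivery Devlin imagines can be pictured as a small access record that names the asset and carries the credentials needed to read it where it already sits, instead of shipping the media itself. The record below is a hypothetical sketch; its fields are assumptions, not a defined format.

```python
from dataclasses import dataclass

@dataclass
class AccessRecord:
    asset_id: str       # vendor-neutral content identifier
    storage_hint: str   # which cloud/region currently holds the master copy
    access_token: str   # credential proving the recipient may read the asset in place
    expires: str        # ISO 8601 expiry of the grant

record = AccessRecord(
    asset_id="intro-asset",
    storage_hint="aws-s3/us-west-2",
    access_token="<opaque security token>",
    expires="2029-12-31T23:59:59Z",
)
# The record is only a few kilobytes at most; the media itself never has to move.
```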

Until then, since there will be no “one size fits all” workflow approach any time soon, he emphasizes that education is the best tonic.

“The more you learn about this future we are heading into, the more secure you can be about your role in it,” he says. “The two questions that your education needs to answer are ‘what is it’ and ‘how do I fit into it?’ I phrase it that way because I’ve come to understand that for new technology to be successfully deployed, a key piece of any workflow will be change management. How will the technology adapt to fit with the way humans act, and how will humans adapt to suit the technology? It's all very complicated.

“I think that view is healthy for the industry, because the instant we believe we have a one-size-fits-all solution will be the moment that people stop innovating.”

News Briefs

Netflix’s Technology Push

A recent Hollywood Reporter column explains how the rise of streaming content giant Netflix has played an increasingly important role in driving Hollywood’s technology infrastructure. The article points out that digital camera manufacturer Arri recently released a new model of its Alexa camera outfitted with a 4.5K sensor, a development at least partly spurred by Netflix’s insistence that all of its original programming originate on cameras with 4K sensors and above. This had previously kept many leading cinematographers from using the popular Alexa platform for Netflix jobs. In a similar vein, the article elaborates, consumer TVs are now starting to come out with either “Netflix Recommended TV” stickers or “Netflix calibrated mode” options as part of their packages. The article’s author, Carolyn Giardina, says Netflix also launched the Netflix Post Technology Alliance late last year with partners like Sony, Adobe, MTI, and others to share its “technology roadmap” with the manufacturing community. Giardina adds that “the streaming giant has exercised its clout to become the most influential entertainment company in the technology field” with initiatives and guidelines that “touch and influence everything from hardware and software development to industry display standards.”

HPA to Honor Legato

The Hollywood Professional Association (HPA) recently announced plans to present the organization’s Lifetime Achievement Award during the HPA Awards gala in November to pioneering visual effects supervisor Robert Legato, ASC. The award, which will be handed out at the Skirball Center in Los Angeles on November 21, is presented periodically to individuals whose work has directly led to the betterment of the industry. Legato has dozens of film credits, frequently serves as second unit director, and routinely collaborates with leading filmmakers like Jon Favreau, James Cameron, Martin Scorsese, and Robert Zemeckis, among others. His efforts on Cameron’s Avatar (2009) and, more recently, Favreau’s The Jungle Book (2016) and this year’s The Lion King have been central to developing, advancing, and evolving new virtual production and other workflow techniques. He has won Oscars for his work on Cameron’s Titanic (1998), Scorsese’s Hugo (2012), and The Jungle Book (2017), among other honors.

Michael Goldman
