Current Issue - June 2018


Hot Button Discussion

Comprehensive Media Management
By Michael Goldman  

Simply put, the big problem with managing media workflows these days is that there is virtually unlimited data to manage, dozens of platforms to deliver to, and a plethora of sub-categories within the greater content creation and delivery umbrella, according to Jesse Korosi, director of workflow services at Sim Digital and co-chair of the HPA’s Young Entertainment Professionals (YEP) Committee. Korosi emphasizes that any real discussion of how best to collect, manage, and distribute assets and metadata across the entire production and post-production chain needs to be comprehensive and global, starting at the camera head and moving onward from there.
 
“This discussion needs to involve both production and post-production,” Korosi emphasizes. “On the front end, we are no longer talking just about camera information that we need to track—like resolution, codecs, frame rates versus base rates, internal camera settings, and so on. We are also talking about computer vision assets, facial recognition data, object recognition, and wardrobe within a shot. Eventually, tracking and managing all this data has to become a standardized process.”

Korosi expects that the future of media data management will therefore head toward some fairly fresh ground. Among other things, this paradigm shift includes determining how best to incorporate powerful Artificial Intelligence (AI) technologies; pushing past LTO tape as the archival/backup medium by trusting all data to the Cloud; and getting equipment manufacturers to eventually work together to standardize metadata categories and formats, thus making it easier for content creators to assemble centralized databases of all the pertinent metadata associated with their file-based content.

All this is easier said than done. For one thing, Korosi points out that despite the overall efficiency gains and the enormous potential of new approaches to protecting data and making it accessible, there is no pressing demand to implement radical changes at the moment. For another, many of these new steps are hard to achieve quickly, for a wide range of reasons.
 
“I go to shows like NAB, leaving excited about opportunities with new [technology] in this area, and I come home and realize it’s up to me to keep that excitement and engagement going with my team, because none of our clients are asking for this stuff,” he remarks. “Plus, a lot of the technology I see is not yet in a place where it has many real-world applications. At the same time, it’s up to us to lead the charge.”
 
Korosi points to several legitimate obstacles that make content creators hesitant to standardize how they manage metadata. Chief among them is the question of whether it’s worthwhile from a business point of view.

“Even if we manage and collect all this metadata for the studio, is it worth anything to them? Are they paying for it?” he asks. “Technically, all the metadata connected to everything on the screen is their information, so I guess we could provide it to them as a paid service. However, until they can do something with that data, it’s not that valuable to them.
 
“And then, even if we have all this awesome information for them, how do they want it? There is no standard for them to say, OK, place this information here, place that information there—please give it to us as an XML file or a JSON file. There isn’t any standardized way for production or post-production facilities to provide that information. If you don’t have a standard for how they receive the metadata, every company on every project is going to give them a giant bucket of metadata in a different way.”
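To make the problem concrete, a standardized per-shot metadata record, had the industry agreed on one, might look something like the following minimal sketch. Every field name here is an illustrative assumption rather than an existing industry schema:

```python
import json

# Hypothetical example only: one way a standardized per-shot metadata
# record might be structured if the industry agreed on field names.
# Every key below is an assumption, not a published schema.
shot_record = {
    "shot_id": "A001_C003",
    "camera": {
        "manufacturer": "ARRI",
        "resolution": "3424x2202",
        "codec": "ARRIRAW",
        "frame_rate_fps": 23.976,
    },
    "lens": {"focal_length_mm": 35, "t_stop": 2.8},
    "script": {"scene": "42A", "characters": ["LEAD", "DETECTIVE"]},
    "computer_vision": {"objects": ["car", "umbrella"], "faces": 2},
}

# A studio could then request the same record as JSON (or XML) from
# every facility, instead of a different "giant bucket" per project.
print(json.dumps(shot_record, indent=2))
```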

Since there is currently no standardized database or approach, Korosi decided to design a manual approach of his own for Sim to offer clients. That approach, called Metabanq, involves aggregating metadata from the various production files and/or software used on set, such as ScriptE. The idea is to efficiently combine whatever metadata content creators can get their hands on from both production and post.
 
“I’ve found that, typically, production and post never have a centralized place for technical information or metadata for everybody to reference, so I’ve worked with ScriptE to customize an XML export that merges into our database,” Korosi explains. “This XML contains descriptions of action, lens information, character names, location information, and more.”
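A minimal sketch of what that kind of merge might look like follows; the XML tag names and database columns are illustrative assumptions, not the actual format of the custom export Korosi describes:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Sketch of merging a ScriptE-style XML export into a central database.
# The tag names ("shot", "action", "lens", ...) and columns are
# illustrative assumptions about what such an export might carry.
def merge_scripte_export(xml_path: str, db_path: str = "metabanq.db") -> None:
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS shots (
               shot_id TEXT PRIMARY KEY,
               action TEXT, lens TEXT, characters TEXT, location TEXT)"""
    )
    root = ET.parse(xml_path).getroot()
    for shot in root.iter("shot"):
        conn.execute(
            "INSERT OR REPLACE INTO shots VALUES (?, ?, ?, ?, ?)",
            (
                shot.get("id"),
                shot.findtext("action"),
                shot.findtext("lens"),
                shot.findtext("characters"),
                shot.findtext("location"),
            ),
        )
    conn.commit()
    conn.close()
```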
 
Korosi continues to work on Metabanq, adding metadata enhancements and connecting it to master-media and automated-transcoding processes, but as proud as he is of that work, he suggests such a database is currently limited without the use of AI technology to gather most of the data.

“There are a few different products out there, like Microsoft’s Video Indexer, which can upload and scan your finished content,” he says. “And when you scan it, maybe you do facial or object recognition, and then, at that point, information becomes available to you. A lot of the focus on gathering this metadata has been on targeted ads—chasing the idea of trying to sell wardrobe or assets that are within the frame. However, another approach is looking at how we can also help the creative [team] within post-production with additional metadata.”
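The kind of timecoded tags such a scan could return, and how an editor might query them, can be sketched as follows; the result schema is a hypothetical illustration, not the actual output format of Video Indexer or any other service:

```python
# Hypothetical scan results: this schema is an illustrative assumption,
# not the actual output format of any specific vision service.
scan_results = [
    {"tag": "face:ActorA", "start_tc": "01:02:10:00", "end_tc": "01:02:14:12"},
    {"tag": "object:umbrella", "start_tc": "01:05:00:00", "end_tc": "01:05:03:08"},
    {"tag": "object:car", "start_tc": "01:07:22:16", "end_tc": "01:07:30:00"},
]

def find_tag(results, query: str):
    """Return timecode ranges where a tagged face/object appears,
    so an editor can jump straight to candidate shots."""
    return [(r["start_tc"], r["end_tc"]) for r in results if query in r["tag"]]

print(find_tag(scan_results, "umbrella"))  # [('01:05:00:00', '01:05:03:08')]
```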
 
As an example of that creative-side approach, he points to GrayMeta Platform, an Adobe Premiere plugin shown earlier this year at NAB by a company called GrayMeta, designed to offer total metadata searchability across all storage locations from within a Premiere editing interface.
 
“It’s a good approach, but the problem is most scripted episodic and feature jobs are edited on the Avid platform,” he says. “I don’t know if Avid is working on something similar, which is why I ended up designing my database for our clients. The idea is to log in and have access to all information in one central place. For example, a visual effects pull for a VFX vendor—we pull from our metadata database and provide that information along with the pull. The same could be said for delivering editorial dailies. The Avid will receive all of this metadata, as well.”

“But, although they can search our database, or their Avid once they have this data, it is not as accurate as it would be with an AI workflow. The studios pushing the boundaries technologically, however, are a little more focused on centralizing their camera masters than on their metadata or technical information. One step at a time, I suppose.”
 
He adds that a hybrid solution, one that manages the metadata while simultaneously helping studios centralize their negative during production, is something only a few select entities will have the tools, resources, and expertise to offer for the next several years.
 
“More and more, people are shooting Arri RAW,” Korosi says. “It was only a couple of years ago that Arri RAW open gate became a thing [for television production], and now it’s normal. And 8K at a low compression ratio on a RED, or the Arri 65, is now considered large format. This trend will continue to grow every few years. Getting all that content up into the Cloud in an easy-to-scale way, factoring in new remote locations for every job, is tough—that’s a problem that needs to be solved.”
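Some rough, back-of-envelope arithmetic suggests why. Assuming an uncompressed open-gate RAW frame of roughly 3.4K at 12 bits per photosite (assumed figures for illustration, not vendor specifications):

```python
# Rough, back-of-envelope arithmetic with assumed figures:
# an uncompressed ~3.4K open-gate RAW frame at 12 bits per photosite.
width, height = 3424, 2202                    # assumed photosite count
bytes_per_frame = width * height * 12 / 8     # ~11.3 MB per frame
fps = 24
gb_per_hour = bytes_per_frame * fps * 3600 / 1e9
print(f"~{gb_per_hour:,.0f} GB per camera-hour")       # ~977 GB/hour

# At, say, 500 Mb/s of usable uplink from a remote location:
uplink_gbps = 0.5
hours_to_upload = (gb_per_hour * 8) / (uplink_gbps * 3600)
print(f"~{hours_to_upload:.1f} hours of upload per hour shot")  # ~4.3 hours
```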

Korosi notes, however, that Netflix is explicitly pushing all its media into the Cloud these days, and that Amazon has also started to do so for certain jobs, with success. The advantage of having every bit of every project online from the get-go, he suggests, can be substantial.
 
“When Netflix pushes all its negative up into the Cloud, and stores it there, that means that if they ever need to convert that material or provide any process on it, they already have it online,” he says. “They don’t need to go to a warehouse and pull their tapes out of boxes, mount the tapes, pull the files off, and only then figure out what they have. They may also want to use that data for analytical purposes.”
 
Thus, despite cost and bandwidth concerns, he expects “it’s only a matter of time until we don’t need LTO tape anymore, and the Cloud can be our archival storage—and this also means providing that storage for any vendor that needs access. Everyone authorized will have access to cloud-based storage when they want to do a VFX pull or a promo pull, conform for the online edit, or get a copy of the negative for any reason. You gain access through a control system, which is essentially software and access controls. And it will be well managed, all centralized. The studio will finally have access to [all metadata] connected to their negative, whereas, right now, post houses kind of control those assets because the studio can’t keep its hands on all facets of its media.”
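One common pattern for the kind of “control system” he describes is time-limited, per-object credentials. Here is a minimal sketch, assuming the negative lives in an Amazon S3 bucket; the bucket and key names are hypothetical:

```python
import boto3

# Time-limited, per-object access: an authorized vendor gets a URL
# that works only for a set window, so the studio keeps control of
# the negative. Bucket and key names below are hypothetical.
s3 = boto3.client("s3")

def grant_vfx_pull(bucket: str, key: str, hours: int = 24) -> str:
    """Issue a temporary download link for one camera-original file."""
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=hours * 3600,
    )

url = grant_vfx_pull("studio-negative-archive", "show/ep101/A001_C003.ari")
```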

Indeed, central to Korosi’s argument is the notion that this transition, though difficult right now to fully initiate, will eventually happen because it is in the interest of studios and other content creators to take firmer control of all data related to their content. However, he says that none of the metadata aggregation and centralization of computer-vision-scanned content will be viable until one crucial development transpires first.
 
“We need to standardize how we are naming and categorizing this data,” he declares. “Look at cameras as an example. RED, Arri, and other camera companies each have their own column names for the same pieces of data. I have always wished these companies could get together and standardize how they note their metadata. Quite frankly, this is a mess, and computer vision data will be ten times more complex, with just as many or more companies offering the computer vision scanning service. The industry needs to initiate that conversation.”
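Until that conversation happens, facilities end up writing translation layers of the kind sketched below; both the vendor column names and the canonical names here are illustrative assumptions, not published schemas:

```python
# The kind of translation layer every facility ends up writing today:
# map each vendor's column names onto one canonical vocabulary.
# Both the vendor labels and the canonical names are illustrative
# assumptions, not published schemas.
VENDOR_ALIASES = {
    "Sensor FPS": "frame_rate_fps",    # one vendor's label
    "Project FPS": "frame_rate_fps",   # another vendor's label, same value
    "WhiteBalance": "white_balance_kelvin",
    "White Bal [K]": "white_balance_kelvin",
}

def normalize(camera_row: dict) -> dict:
    """Rename vendor-specific metadata columns to canonical names,
    passing through anything that has no known alias."""
    return {VENDOR_ALIASES.get(k, k): v for k, v in camera_row.items()}

print(normalize({"Sensor FPS": 23.976, "WhiteBalance": 5600}))
# {'frame_rate_fps': 23.976, 'white_balance_kelvin': 5600}
```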
 
Meanwhile, Korosi argues that, for the time being, it’s only large post-production corporate entities that have the resources and bandwidth to put together their own soup-to-nuts, centralized data management systems. Therefore, boutique operations may eventually suffer from not having such systems in place. “It’s tough for such companies to put the kinds of resources into a product like that, while [much larger companies] can,” he says.

That said, he elaborates, if camera vendors and the companies behind computer vision services were eventually to cooperate on a process for standardizing metadata, “it would make it easier for the boutiques, and the industry as a whole” to bring such an initiative to fruition. “That’s another aspect of the conversation—post companies having a dialogue with camera companies. The lines have blurred between production and post, so it’s justified for them to come up with a system that translates all the way [through post].”
 
Further, he speculates that if studios themselves pushed for standardization, “they could offer [post-production partners] a breakdown as to exactly how to upload their metadata and media, and it would be up to the post houses to conform, or not take the job, just as they already do with LTO specifications.”

“It’s very common that a lot of [metadata] gets thrown away because nobody has a [centralized] system,” he says. “Yes, it’s true they might have [some of that information] in a master camera file, but how often do [post houses] get to work with a master camera file? When you turn shots over for visual effects, you convert them to DPX or EXR. While doing that, most people are throwing away [metadata]. They don’t even retain the original camera file name, because they convert the file to a new visual effects shot name. Are they passing along lens metadata and all the internal metadata from the camera? 
 
“And what about all of the other logged metadata?  Let’s take, for instance, a LUT-based job whereby certain LUTs were used on set/in dailies. If you need to turn over shots for visual effects, and you have 50 shots, each using one of eight LUTs, how are you delivering that information to that vendor?  You need a centralized database from which any delivery can pull.

“Also, even if you do deliver all of that data to the VFX vendor, how often is any of it being maintained during delivery from VFX back to the online facility? I have personally only ever seen this happen once, and that was working on the movie Warcraft, where we developed a custom solution with ILM to make this happen.  All of these metadata tags I have just mentioned are also just the tip of the iceberg once we get into actually providing a computer vision scan during the dailies process. If we have not been able to standardize the tracking of this simple information, how are we going to deal with the plethora of information soon to come our way?”
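The 50-shot, eight-LUT scenario he describes reduces to keeping a shot-to-LUT mapping, along with the original camera filenames, in the central database and emitting it with every delivery. Here is a minimal sketch, with all shot IDs, LUT filenames, and the CSV layout assumed for illustration:

```python
import csv

# Sketch of the 50-shots / 8-LUTs problem: keep the shot-to-LUT mapping,
# plus the original camera filename for every renamed shot, in the
# central database and emit it with each VFX delivery. All shot IDs,
# LUT filenames, and the CSV layout are assumptions for illustration.
shot_luts = {
    "VFX_0010": "show_day_ext.cube",
    "VFX_0020": "show_night_int.cube",
    "VFX_0030": "show_day_ext.cube",
}
original_names = {
    "VFX_0010": "A001_C003.ari",
    "VFX_0020": "B002_C011.ari",
    "VFX_0030": "A004_C017.ari",
}

def write_pull_manifest(path: str, luts: dict, originals: dict) -> None:
    """Write a sidecar manifest so the VFX vendor retains both the LUT
    and the original camera filename for every renamed shot."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["vfx_shot", "original_camera_file", "lut"])
        for shot, lut in luts.items():
            writer.writerow([shot, originals.get(shot, ""), lut])

write_pull_manifest("vfx_pull_manifest.csv", shot_luts, original_names)
```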

 

News Briefs
Congress' Technology Gap

A recent feature story in the Washington Post suggests the United States Congress faces a dire learning curve when it comes to legislating on new, cutting-edge technologies like quantum computing, Artificial Intelligence (AI), satellite technologies, Internet issues, 5G, driverless cars, drones, and more. The article quotes one Congressman at a recent technology hearing on Capitol Hill telling a witness from a major tech firm that he could understand only “about 50%” of what the witness was saying, and points out that members of Congress were widely criticized recently for their lack of specific knowledge while grilling Facebook CEO Mark Zuckerberg in public hearings. The article reports that there is a drive in Washington to do something about this knowledge gap by bringing back an old institution that the Post refers to as a “science-and-tech think tank,” better known as the Office of Technology Assessment. That office’s role was to advise and educate members of Congress and their staffs on brewing technology matters. However, amid a range of partisan battles, it was defunded and disbanded by Congress in the 1990s. According to the article, California Democrat Mark Takano and Illinois Democrat Bill Foster are pushing to restore the office by reviving its funding.

Online SMPTE Journal - 10 Issues
SMPTE recently announced that, for the first time, it is offering an additional, digital-only, online version of the legendary SMPTE Motion Imaging Journal. That version, available via the SMPTE digital library, means the Journal is now published ten times per year—nine print editions and one online edition. SMPTE officials emphasize that the digital-only version maintains the same quality and the same peer-review process as print for every piece of content. The online version debuted in June, hosted on the IEEE Xplore platform.