SMPTE Newswatch

Hot Button Discussion

Content Security on a Decentralized Landscape
By Michael Goldman  

When asked what has changed the most in recent years when it comes to content security, Marc Zorn suggests it is the need to stand vigilant over what is now a decentralized data chain. Zorn has almost 30 years of experience in information security practices, heads up production and content security for HBO, and is also on the Program Committee for the SMPTE 2018 Annual Technical Conference. He explains that the industry’s transition from a film-based workflow to a file-based workflow has fundamentally changed “the philosophy of how [security professionals] address the digital workflow” as compared to previous practice.
 
“Now, lots of security issues become IT problems or, in some cases, just common sense for how we manage information versus physical assets,” he says. “Because [content] now takes a digital form, someone can make a copy of it, and you may never even detect it. In that sense, it’s possible to both lose your content and still have it at the same time. It’s a new way of looking at traditional workflows.

“That said, we need to handle digital information [from a data protection perspective] from beginning to end. Once data is created, we must determine how we store it, how we share it, and the environment in which it is transmitted, covering all the stops along the way. We have to protect media content by doing all the things you would do if you were managing any other kind of digital information.”
 
Zorn adds that the way the industry handles [media] assets now is very different from the film-based workflow, in which there was a limited number of assets and the challenge was ensuring that the right people were the custodians of those assets. “Now that we are moving from a central workflow to a distributed and collaborative workflow, it must be done differently,” he says.
 
These new processes, he says, largely revolve around “putting different mechanisms in place so that the data is safe wherever it is.”

In moving to a distributed collaborative workflow, IT-based developments like encryption, data authentication, digital certificates, digital fingerprinting, and more come into play. “The idea is that the data is protected, wherever it is along the way, and only those who have the right combination of tools to access it can work on it,” he adds. “So we are moving from protecting centralized data in a controlled environment to the Cloud or very distributed environments where we [set up systems] that allow the data to protect itself.”
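
To make Zorn’s idea of data that “protects itself” concrete, here is a minimal sketch of authenticated encryption using Python’s cryptography package. It is illustrative only, not a description of any studio’s actual pipeline; real deployments layer key management, DRM packaging, and access control on top of this primitive.

```python
# Minimal sketch of "data that protects itself": authenticated encryption
# (AES-GCM) keeps the file confidential and tamper-evident wherever it
# travels. Illustrative only; real pipelines add key management and DRM.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(12)                       # unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                    # ship the nonce with the data

def access(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)  # raises if tampered

key = AESGCM.generate_key(bit_length=256)        # held only by authorized users
blob = protect(b"raw camera footage...", key)
assert access(blob, key) == b"raw camera footage..."
```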

The media industry has absorbed the aforementioned data protection techniques from the IT world. But, Zorn adds, how strategically and successfully they are applied still varies widely, largely because the nature and culture of the media industry are far different from those of other industries that need to manage and protect large volumes of data.
 
The fundamental difference, he says, is that “media and entertainment don’t want to slow down the workflow. They don’t want to make any changes, unless they are essential. The tools may now be available, but that does not mean people have adopted them into their workflows. In fact, the media industry has a challenge in that gap that other industries manage through regulations and other means. The pressure of the media industry has always been to get the product out the door as quickly as possible.”
 
In recent years, though, the prioritization of this issue within the industry has been accelerated by upticks in serious events on what Zorn calls “the threat landscape.” Recent piracy, hacking, and data theft incidents across virtually all industries and government sectors, coming on the heels of the infamous Sony hack of 2014, have increased awareness among studios and other content creators.

Zorn says he believes the threat landscape is usually underestimated until a large event occurs. “People who are trying to get our data are always going to be a little more motivated to find the holes than we are to plug them, at least until some loss event happens. Usually, it is a cascade of reactions to those events. People look at what happened and ask how it could have happened, and should they have done more to prevent it. It’s kind of like the stages of grief with any other loss.”
 
One evolving change, Zorn points out, is that “studios are now investing in security departments that are directly involved in a production. They are no longer an afterthought. The security team works directly with the production team to take content from its creators through to the distributors, placing watermarks, digital rights management, and all other safeguards on that content inside whatever [digital file] wrappers they are using for the final product. Security features are now embedded earlier in the process than in the past—really from the time the data leaves the camera. Security has to be part of the entire [production and distribution] process.”

This presents a dichotomy for the industry. On the one hand, content creators must take charge of managing and protecting their material on the decentralized digital landscape. On the other hand, due to the industry’s unique workflow demands, mentioned earlier, the implementation of new security protocols must not interfere with the efficient creation of media content, nor its distribution. 
 
“In that sense, people are no longer thinking of IT as being a separate department [from a media production]. It is now an aspect of everything they do. With that, I think the solutions become more turnkey or off-the-shelf or already integrated into the tools they are using just because everybody is using them. It’s more ubiquitous than before. So [content creators] are not just handing over their data to the guys in the data center to manage for them. Now, they are in charge of their data and security, because people are learning the tools better.”
 
Zorn says that one of the more exciting new methodologies for seamlessly implementing additional security into the workflow chain without slowing down productivity is so-called block-chain technology. Block-chaining was initially developed for digital currencies like Bitcoin, to allow data to be distributed but not copied; it is now increasingly seen as relevant to the media and entertainment industry.

“That’s a process that combines both encryption and check-pointing to allow secure transactions between the people who are supposed to have access to the data,” he says. “Block-chaining is a nice technique for making sure the data is maintaining integrity throughout the whole process. We see that a lot of studios want to adopt it.”
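
As a rough illustration of the integrity property Zorn describes, the toy hash chain below links each checkpoint to the one before it, so altering any step breaks every later link. It is a sketch of the underlying idea only, not any production block-chain system.

```python
# Toy hash chain: each checkpoint commits to its data and to the previous
# checkpoint, so tampering anywhere invalidates all downstream hashes.
import hashlib

def checkpoint(prev_hash: str, data: bytes) -> str:
    return hashlib.sha256(prev_hash.encode() + data).hexdigest()

def build_chain(blocks: list[bytes]) -> list[str]:
    chain, h = [], "genesis"
    for data in blocks:
        h = checkpoint(h, data)
        chain.append(h)
    return chain

steps = [b"dailies v1", b"color pass v2", b"final conform v3"]
chain = build_chain(steps)

steps[1] = b"color pass v2 (tampered)"           # alter one intermediate step
print(build_chain(steps) == chain)               # False: integrity check fails
```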
 
Zorn adds that block-chaining has relevance to the industry’s growing drive to weave additional security features into the Interoperable Master Format (IMF).

“There are many different formats that could benefit from having embedded security information, and IMF is one,” he says. “If IMF can incorporate some security information like simple file hashes, then [studios] could use some of the metadata for block-chaining directly out of an IMF file.”
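
A sketch of what such embedded security information might look like: a manifest of per-asset hashes of the kind an IMF packing list carries (commonly SHA-1), which a block-chain could then anchor. The asset names and contents below are hypothetical stand-ins, not real IMF track files.

```python
# Sketch: per-asset hashes like those an IMF packing list records,
# usable as block-chain input. Names and contents are hypothetical.
import hashlib
import json

def asset_hash(data: bytes, algo: str = "sha1") -> str:
    return hashlib.new(algo, data).hexdigest()

assets = {                          # stand-ins for real IMF assets
    "video_track.mxf": b"<video essence bytes...>",
    "audio_track.mxf": b"<audio essence bytes...>",
    "cpl.xml": b"<composition playlist...>",
}
manifest = {name: asset_hash(data) for name, data in assets.items()}
print(json.dumps(manifest, indent=2))
```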
 
These and other issues raise the question: how, and to what degree, should the industry standardize security protocols? For major studio content, of course, the Motion Picture Association of America (MPAA) set security protocols for many years. Recently, however, the MPAA announced it was partnering with the Content Delivery and Security Association (CDSA) in a security initiative designed to reduce the risk of piracy and data theft involving film and TV content across the industry.
 
Zorn urges people to examine that initiative, called the Trusted Partner Network (TPN), which was designed to strengthen security standards for the production and distribution vendors who collaborate with studios. Basically, industry vendors in various categories will undergo annual assessments of their practices and join other “trusted partners” in a proprietary directory of verified vendors that maintain high security standards, available to studios but not the public. The program is currently in beta and is expected to roll out later this year.

“It used to be that the MPAA would [recommend] best practices, and then the studios would tell their vendors to meet them, and they would audit against that,” Zorn says. “But you obviously can’t visit and audit every vendor every year, so there was always a subset of people [involved]. With this new initiative, vendors must participate in the entire security ecosystem—they get their audits and present them to the industry, and they must get it to a certain level. This initiative raises the bar from a logistical standpoint. If I were to look into my crystal ball, I would say, by NAB 2019, probably everyone will be rushing to catch up to this approach, and studios will be demanding increased security from the people who create tools.”  
 
Zorn states that as the security industry adapts to the new landscape and data becomes more massive and more bandwidth-intensive, he does not anticipate any significant issues. That’s because encrypting high dynamic range (HDR), ultra high definition (UHD) data is not that complicated, he says.

“It’s somewhat of a myth that [larger files] are harder to encrypt than lots of smaller files, in terms of overhead,” Zorn explains. “By overhead, I mean there is less overhead with a large file than there is with a small file in terms of the percentage of encryption versus the raw data. The larger the file, the less the encryption gets in the way, so as we increase the size and depth of files, encryption becomes less of a direct burden. It’s almost an inverse relationship and somewhat surprising to realize until you look at the mathematics.”
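
Zorn’s point is easy to check with back-of-the-envelope arithmetic. Assuming, for illustration, a fixed per-file overhead of a few hundred bytes (nonce, authentication tag, wrapped key), that overhead shrinks to a vanishing percentage as the payload grows; modern modes such as AES-CTR and AES-GCM do not otherwise expand the payload.

```python
# Back-of-the-envelope: a fixed per-file encryption overhead (nonce,
# auth tag, wrapped key) becomes negligible as the payload grows.
FIXED_OVERHEAD = 300  # bytes; illustrative assumption

for size in (10_000, 10_000_000, 100_000_000_000):   # 10 KB, 10 MB, 100 GB
    pct = 100 * FIXED_OVERHEAD / size
    print(f"{size:>16,} bytes -> {pct:.8f}% overhead")
```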
 
Encryption, however, is complicated by the multi-platform universe across which media content is now viewed.

“That makes [matters] more complex, without a doubt,” he says. “The more formats you have to support, the more flexible you have to be with encryption and the various other control mechanisms you put into place. For every environment used to view content, there are particular challenges, and for each one, there is one particular set of controls available, depending on whether you are watching on a Smart TV, an iPad, a PC, or other device. It is difficult to have one set of controls that work everywhere—it’s not feasible. So, you need a combination of controls for every environment. That’s why we rely on more than just encryption—on things like digital fingerprinting. There, regardless of encryption, there is still unique data embedded in the content you have, so that you can uniquely identify some of the sources, depending on how late in the process you put on that watermark.”
 
In any case, even considering all these developments, Zorn cautions that you can’t take the human element out of the security equation. As he noted, those determined to steal data will be more enthusiastic and knowledgeable than those trying to stop them. That’s why, in his view, “prevention is the most important thing.”

By that, he means the industry should spend more time promoting the fact that as good as its data protection tools and protocols have become, its methods for detecting data breaches and finding those responsible are even better. This is an issue Zorn says the industry has not traditionally liked to discuss.
 
“The way the [digital landscape] has changed processes, and the way the technology is changing, you can’t always maintain control,” he says. “But the tools for detecting that somebody has done something wrong—tools like digital fingerprinting and others—make it so that there are more ways of getting caught than before. We have good detection mechanisms that can positively identify who leaked content, for example. In that sense, the philosophy at many studios is starting to change from total control to detection. I think the industry has done a disservice by not making it better known that we have better detection tools than people realize.”
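
A toy version of the fingerprinting idea Zorn mentions, purely illustrative and far simpler than any real forensic watermark: embed a per-recipient ID in the least significant bits of sample data, then recover it from a leaked copy to identify the source.

```python
# Toy forensic watermark: hide a per-recipient ID in the least significant
# bit of the first N samples. Real systems are far more robust; this only
# shows why a leaked copy can point back to its recipient.
def embed(samples: bytes, recipient_id: int, bits: int = 32) -> bytearray:
    out = bytearray(samples)
    for i in range(bits):
        out[i] = (out[i] & 0xFE) | ((recipient_id >> i) & 1)
    return out

def extract(samples: bytes, bits: int = 32) -> int:
    return sum((samples[i] & 1) << i for i in range(bits))

master = bytes(range(64)) * 4                 # stand-in for media samples
vendor_copy = embed(master, recipient_id=7)   # unique copy per recipient
print(extract(vendor_copy))                   # 7 -> identifies the leaker
```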

 

News Briefs
AI at NAB

The NAB 2018 show emphasized Artificial Intelligence (AI) applications for media use, among other key topics, according to recent coverage of the show from the Hollywood Reporter and the Pro Video Coalition site. Putting AI and other forms of machine learning to work in the production and post-production arenas, for things like script and budget analysis, versioning, editing, and visual effects, was a key theme throughout the show, according to those reports. For example, IBM’s Watson Media demonstrated various solutions for utilizing AI tools throughout a media pipeline, along with other ways to go beyond current use of metadata, captioning, and the like. Manufacturers ranging from Mobile Viewpoint to Tedial, MRMC, EVS, Cantemo, and Veritone demonstrated new AI-based tools for smaller broadcasters, sports producers, news operations, camera robotics, and more.

Cinematographer Concerns
Meanwhile, as the Hollywood Reporter and Studio Daily reported, NAB 2018 also included panel discussions in which leading cinematographers expressed a growing concern within their community about a trend toward cinematographers losing control of the imagery they shoot during the post-production process. In a panel presented by the International Cinematographers Guild (ICG), famed cinematographer Janusz Kaminski spoke frankly about his concern that ownership of the original creative intent of an image is being taken away from directors of photography, declaring “there are too many cooks in the kitchen.” He suggested that if the director or cinematographer, or both, are not involved, there is a high likelihood images will be manipulated away from their original intent.

Measuring Modern Viewing Habits
A recent column on the Broadcasting and Cable site detailed some of the modern issues that have risen when it comes to measuring consumer television viewing habits accurately. The article points out that, historically, focus groups and surveys were the primary tools that networks and advertising agencies used to obtain such data. However, today’s digital technology allows them to directly capture data that potentially goes deeper in terms of analyzing viewer behavior. But, the article suggests, these new data collection approaches are inherently “passive” by their nature, meaning “our previous measures of success no longer apply.” Therefore, the article offers advice on how to analyze viewer behavior data, including the need to know the difference between device-level measurement and person-level measurement, meaning that measuring the activity of a device is not the same as measuring a viewer’s actual behavior. Moreover, so-called “completion rate” measurements do not tell a full story, or even if a viewer is in the room during the period the data was collected.