November 2017


Hot Button Discussion

Preservation and Archiving: Next Generation
By Michael Goldman  

Andrea Kalas likes to joke that the No. 1 instruction she gives her staff at Paramount Studios, where she serves as vice president of Archives, is “don’t lose stuff.” In actuality, Kalas’ concerns are far wider ranging. She is the outgoing president of the Association of Moving Image Archivists (AMIA)—a position that ends this month, with Dennis Doros of Milestone Films recently elected to take over. Given what she does for a living, and as someone intimately involved with the world’s largest association of professional moving image media archivists, her concerns revolve around figuring out the best methods for archivists to preserve reams of analog and digital media from industries across the spectrum in a rapidly changing, IT-centric landscape. Indeed, she emphasizes, “the things we are asking our digital archives to do today are more monumental and voluminous than ever before. In fact, I think it would be great if every film student were required to take a course about the history of film preservation and archiving because every filmmaker needs to understand and help support these efforts and make sure their material is managed properly.”

Preserving, storing, migrating, and keeping safe an endlessly growing stream of digital image data, however, is a highly specialized job. Particularly important, Kalas says, is the growing realization that the need to archive moving images is no longer an issue limited to the media world.
“Recently, we’ve seen people from law enforcement, banking, university libraries, healthcare all joining AMIA,” she says. “They all need to take care of video and file-based media.”

Therefore, on the preservation and archiving front lines, one of the most pressing concerns is training new archivists and the institutions that have media libraries they wish to preserve. One of the challenges in doing that revolves around teaching people that there has been “a big change in just understanding how things work.”

“The skills, the materials, everything changes when you go to a file-based workflow,” she explains. “That caused changes in the management of the whole industry, and archiving is no different in that respect. People who once knew how to store film in the right environments, duplicate it at photo-chemical labs, deal with obsolete videotape formats, and migrate them to more up-to-date ones have had to learn how to think about preservation in terms of the concept of bit loss, or figure out how to manage new ways of specifying file delivery.”

Kalas notes digitization as another catalyst for change. "University libraries have become a new part of our world—many have joined AMIA because they are digitizing entire moving image collections. To do that, you need to know both the history of [analog] motion-picture technology and new file-based technologies.”
Kalas says she is pleased to see university-level programs in moving-image archiving at places like UCLA and NYU and in Europe, as well as through AMIA’s annual conference and its collaboration with organizations like the Society of American Archivists, the American Library Association, and SMPTE. However, she says more must be done to link the many industries and entities that need to worry about preserving and archiving data and get them all on the same page.

Along those lines, she points to the concept of “community archiving” as a sign of how the industry is educating itself and a growing roster of new and previously uninvolved participants about the issues that challenge their work. She says that community archiving is essentially the concept of “reaching beyond our own institutional walls” to form initiatives and educational support programs for archivists of all types.
Kalas points to a range of initiatives that involve AMIA, including the International Federation of Film Archives (FIAF), the International Federation of Television Archives (FIAT/IFTA), the International Association of Sound and Audiovisual Archives (IASA), as well as various universities and the Film Foundation’s World Cinema Project that are dedicated to “broadening our efforts.” AMIA, she says, also conducts community archiving workshops, to spread the knowledge.

“[Community workshops] started as a group of people who, before our conference, each year, would find people in whatever city we were in who had [moving image] material that had value, but did not know how to care for it,” Kalas says. “We held one-day sessions and insisted they bring volunteers from their organizations, and we trained them. How do you build an inventory? How do you find vendors who can deal with your videotape material? How do you rename files, so they are consistent? This group built up an incredible set of skills and turned it into a training operation.”
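Kalas’s workshop example of renaming files so they are consistent is the kind of task a short script handles well. The sketch below uses a purely illustrative naming scheme (lowercase, underscore-separated); it is not an AMIA or Paramount convention, and a real collection would follow whatever scheme its inventory defines:

```python
import re
from pathlib import Path

def normalize_name(filename: str) -> str:
    """Normalize a filename to a hypothetical archive scheme:
    lowercase, underscores for spaces/hyphens, stray punctuation dropped."""
    stem, dot, ext = filename.rpartition(".")
    if not dot:                              # no extension at all
        stem, ext = filename, ""
    stem = stem.lower()
    stem = re.sub(r"[\s\-]+", "_", stem)     # runs of spaces/hyphens -> one underscore
    stem = re.sub(r"[^a-z0-9_]", "", stem)   # drop anything else
    stem = re.sub(r"_+", "_", stem).strip("_")
    return f"{stem}.{ext.lower()}" if ext else stem

def rename_collection(folder: Path, dry_run: bool = True):
    """Report (old, new) pairs; only touch the disk when dry_run is False."""
    changes = []
    for path in sorted(folder.iterdir()):
        new_name = normalize_name(path.name)
        if new_name != path.name:
            changes.append((path.name, new_name))
            if not dry_run:
                path.rename(path.with_name(new_name))
    return changes
```

The `dry_run` default reflects the archivist’s instinct Kalas describes: report what would change before anything is altered.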

Kalas says the industry needs to understand that “data storage, which has seen a steady decline in costs, is only one key issue when it comes to preservation. Validation, metadata and asset management, and ways of preventing data loss are perhaps even more complicated than storage.”
In particular, the issue of metadata has come under increased scrutiny, since its role is different for preservation and archiving work than it is for content creation or distribution, she adds.

“Once upon a time, you could go to a shelf and pull a film—that was the way we found things,” she says. “With a digital archive, you don’t have that luxury. The metadata’s first and foremost job is to identify a set of files—an asset in your archive. That is critically important. The secondary importance of metadata is its use in automation. As we script processes about our digital archive, identifying which assets we are talking about is also critical. That’s where digital asset management comes in—that’s the software around the assets and the metadata.

“Software and hardware will always change, but what needs to continue is the company’s asset. This should be considered when we are building out either our digital asset management system or a storage infrastructure. You need to know the requirements for your archive, and make intelligent, informed recommendations, based on those materials. But [formats and display technologies] are always going to change. Right now, it’s all about high dynamic range, but there could be other factors in the future. What, for example, is the best way to store a Dolby Atmos asset for the future? These are the kinds of things we are working on.”
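Kalas’s description of metadata as the thing that binds a set of files into one identifiable asset, and as the hook for automation, can be sketched in miniature. The record fields and the `find_asset` helper below are hypothetical illustrations, not any particular digital asset management system’s schema:

```python
from dataclasses import dataclass, field

@dataclass
class AssetRecord:
    """A minimal, illustrative digital-asset record: identifying
    metadata ties component files to one archival asset, so scripts
    can operate on the asset rather than on loose files."""
    asset_id: str                                # stable identifier, never reused
    title: str
    files: list = field(default_factory=list)    # component file names

def find_asset(catalog, asset_id):
    """Automation starts with identification: look the asset up by ID."""
    for record in catalog:
        if record.asset_id == asset_id:
            return record
    return None
```

Because every automated process begins by resolving an `asset_id`, the identifying metadata can outlive any particular storage platform, which is the continuity Kalas is after.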
Thus, she says, proper storage infrastructures typically involve a set of software and hardware, specifically engineered with digital preservation in mind. That by its very nature is one place where you do not want to lose bits, she states. “You want data to stay intact, so you write software that enables that across a storage infrastructure. That storage infrastructure can be a set of servers, a Cloud service, or a robotic tape library, but it is necessarily an online idea.”
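The software Kalas describes for keeping data intact across a storage infrastructure usually amounts to fixity checking: record a checksum for every file in an asset, then re-verify those checksums over time. A minimal sketch, assuming a one-directory asset and SHA-256 (the manifest layout here is illustrative, not a formal packaging standard such as BagIt):

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path, algo: str = "sha256") -> str:
    """Stream the file through a hash in 1 MiB chunks, so large
    media files never need to fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def make_manifest(asset_dir: Path) -> dict:
    """Record a checksum for every file that makes up the asset."""
    return {p.name: file_checksum(p)
            for p in sorted(asset_dir.iterdir()) if p.is_file()}

def verify_manifest(asset_dir: Path, manifest: dict) -> list:
    """Return names of files that are missing or whose bits changed."""
    failures = []
    for name, expected in manifest.items():
        p = asset_dir / name
        if not p.is_file() or file_checksum(p) != expected:
            failures.append(name)
    return failures
```

Running `verify_manifest` on a schedule, against every copy on every tier of storage, is the “don’t lose bits” discipline in its simplest form.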

This all raises the question: is there, or should there be, an industry-standard foundation for how a preservation infrastructure should look? Kalas points out that preservation and archiving professionals already utilize a wide range of existing industry standards for moving image data in their work, but says best practices across the board are the real issue in this regard.
“First of all, archivists love standards,” she says. “DCI [specifications] and the IMF standards, for example, are good because they provide structure and rules for assets. Where [a standard] already exists, that’s fantastic; the next step, though, is working out how to ensure you have an excellent preservation infrastructure. What are the guidelines, the best practices around them? The development, refinement, and evolution of those best practices are factors that will definitely support the preservation of moving images.”

According to Kalas, one popular model, used by academic libraries and government institutions that care for critical scientific and medical data, is the “trusted digital repository” concept, coming out of the International Organization for Standardization’s (ISO) standard ISO 16363:2012. She says the premise has broadened into concepts articulated around the industry as “data conservancy,” “data curation,” and “digital stewardship.”
“The reason for these broadening and sometimes difficult to summarize ideas is that digital preservation involves many moving parts,” she explains. “As with all file-based processes, precise workflows, organization, and management are critical to avoiding data loss. For example, if you know your movie must be preserved, it is necessary to ensure that the exact title and the collection of files that make up the picture and sound will endure through any series of hardware and software platforms. That usually involves multiple sets of data from disparate places. Your job as the archivist is to track all that information, despite the constantly changing systems.

“For archivists dealing with moving images, whether at a studio or a large library, making sure your preservation and distribution or access systems are aligned benefits both the data and your organization.” For instance, similar sets of organizing principles are required whether you are trying to make sure everyone can see your movie on their tablet or in their living room, or that students can get key research data, Kalas says.

Another debate across the archiving and preservation industry involves how best to incorporate the use of the Cloud for migrating, storing, and preserving data. The problem with that, Kalas suggests, is that “most Cloud providers have complicated ways in which they store data. As I said, archivists need to store assets, and those assets are made up of many bits in very specific ways. Thus, we are working to make sure that the integrity of your assets [and not just raw data] can exist safely in the Cloud. For example, the integrity of an asset is replicated very specifically with a checksum associated with each film. We have very specific policies around such [factors]. The Cloud was not created with a preservation infrastructure environment in mind. It can be used for storage in the best possible way to duplicate data so that you will hopefully not lose it, but it’s never thought of as preserving a set of assets.”
“The Cloud could be a great benefit if we can meet in the middle a bit regarding how we perceive storage. But is it storage for bits? We have to get together on things like that.”

On the other hand, Kalas says modern archivists are excited that a renaissance is currently under way in open-source tools that make processes such as digitizing assets, quality control (QC), and asset packaging of digital files more efficient.
“Within AMIA specifically, we see projects with brilliant people [creating] tools that can be open-sourced and shared throughout,” Kalas says. “Such examples include quality-control scripts for health checks, frame sizes, and standard deviations of color, which can be run on many different files as a validation step. So, you can check to see how things are with these [easily accessible] tools, and also understand the basic quality of the assets you have.”
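The kind of QC script Kalas describes, checking frame sizes and the statistical behavior of brightness or color across a clip, can be sketched without any video libraries by treating decoded frames as plain 2-D arrays of pixel values. In practice a tool would first decode frames with something like FFmpeg; the measures below are illustrative stand-ins, not any particular AMIA project’s checks:

```python
import statistics

def frame_stats(frames):
    """frames: list of 2-D lists of grayscale pixel values (a stand-in
    for decoded video frames). Returns simple per-clip QC measurements:
    the set of frame dimensions seen, whether they are consistent, and
    the standard deviation of mean frame brightness across the clip,
    a crude flag for flicker or dropped content."""
    sizes = {(len(f), len(f[0])) for f in frames}
    means = [sum(sum(row) for row in f) / (len(f) * len(f[0]))
             for f in frames]
    return {
        "sizes": sizes,                        # should hold exactly one entry
        "consistent_size": len(sizes) == 1,
        "brightness_stdev": statistics.pstdev(means),
    }
```

Run as a batch validation step over many files, checks like these let an archivist spot malformed or suspect assets before they are committed to long-term storage.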


News Briefs
SMPTE 2017 Annual Technical Conference & Exhibition Wraps

The SMPTE 2017 Annual Technical Conference & Exhibition in Hollywood was a big success, according to SMPTE’s recent wrap-up of the event, drawing more than 2,500 registered attendees, 105 exhibitors, 70 expert presentations, and much more. SMPTE Executive Director Barbara Lange reported that attendance at the event was “our highest in more than a decade.” Cinematographer and immersive media expert Andrew Shulkind gave the event’s keynote speech, with a focus on the state and potential of immersive content. You can view his address here. According to a recent TV Technology article, one of the event’s driving themes was the maturation of Ultra HDTV, particularly in areas like closed captioning for 4K content and storytelling in 8K. The SMPTE 2018 Annual Technical Conference & Exhibition (SMPTE 2018) will take place 22-25 October 2018 at the Westin Bonaventure Hotel & Suites in downtown Los Angeles, a new location for the event. The technical conference will be chaired by SMPTE Fellows Thomas Edwards, vice president of engineering and development at Fox, and SMPTE Education Director Sara J. Kudrle, product marketing manager for playout at Imagine Communications.

FCC Voting on ATSC 3.0
The Federal Communications Commission (FCC) recently announced that it will take action to formally approve the next-generation broadcast standard known as ATSC 3.0. As summarized by TV Technology, the FCC released a Report and Order (R&O) and Further Notice of Proposed Rulemaking in October, designed to permit a voluntary transition to ATSC 3.0 for over-the-air broadcasting. The commission was expected to vote on the R&O at its November 16 open meeting. The R&O would begin by authorizing ATSC 3.0 on a voluntary basis, but would require broadcasters adopting it to partner with another broadcaster to simulcast similar programming in ATSC 1.0, along with various other controls that would sunset after five years unless the FCC decides to extend the period.

State of the AV Industry
A recent AV Technology column offered an interesting dichotomy about the state of the professional AV industry. The column said that attendance increased at AV industry tradeshows like InfoComm this year, and that overall industry revenues are up modestly, with exciting new technologies available in almost every category. Simultaneously, numerous AV integration companies have gone out of business or been acquired by larger companies in recent years, and some manufacturers have downsized as well. Author Justin Rexing, an AV systems design engineer and consultant, suggests numerous factors account for this, including economic shifts requiring more to be done with less, more educated customers able to manage their own infrastructures in some cases, and many large companies pivoting to running their AV departments in-house. He suggests the role and knowledge base of technical managers, and the management of labor and technology, will all have to shift in coming years for the AV industry to avoid an eventual downturn.