SMPTE Newswatch

Hot Button Discussion

IMF's Growing Relevance
By Michael Goldman  
 
With the rollout of the Interoperable Master Format (IMF) ongoing, it’s instructive to examine its impact so far and the next steps for the flexible new international standard for file-based professional workflows. IMF is essentially an umbrella term for a linked family of standards that lets content publishers and distributors exchange master files and associated metadata, making it more efficient to disseminate different versions of their material to all the world’s viewing platforms and territories, whatever form those platforms may take today or in the foreseeable future. Practically speaking, since IMF officially debuted in 2012, it has achieved its initial goal: a single, interchangeable master file for any piece of content, with automated packaging and delivery of that content along with only the metadata needed for the specific platform being targeted, simplifying transcoding and minimizing storage headaches, among other things. Still, IMF is far from ubiquitous, largely because every organization is at a different stage along the digital-workflow road, according to Pierre-Anthony Lemieux, a SMPTE Fellow and the primary editor on the SMPTE IMF project known as SMPTE 35PM-50.
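To make the version-management idea concrete, the sketch below models it in Python. The class names and identifiers are illustrative assumptions for this article, not the actual SMPTE ST 2067 schemas, which define Composition Playlists, Packing Lists, and MXF track files in XML. The point is simply that the heavy essence is stored once, while each version is a lightweight playlist that references it, so only version-specific assets ever need to travel.

```python
# Illustrative sketch only: hypothetical names, not the SMPTE ST 2067 schemas.
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class Asset:
    """A track file (video, audio, or timed text) stored once in the package."""
    asset_id: str
    kind: str          # e.g. "video", "audio", "subtitles"
    locale: str = ""   # blank for language-neutral essence

@dataclass
class Composition:
    """A playlist that assembles one version of the title by reference."""
    title: str
    version: str
    assets: List[Asset] = field(default_factory=list)

# One set of master essence, shared by every version.
video    = Asset("urn:uuid:0001", "video")
audio_en = Asset("urn:uuid:0002", "audio", "en")
audio_fr = Asset("urn:uuid:0003", "audio", "fr")
subs_fr  = Asset("urn:uuid:0004", "subtitles", "fr")

us_version = Composition("Example Feature", "US theatrical", [video, audio_en])
fr_version = Composition("Example Feature", "FR broadcast", [video, audio_fr, subs_fr])

def supplemental_assets(base: Composition, new: Composition) -> List[Asset]:
    """Assets the new version needs that the base package does not already hold."""
    return [a for a in new.assets if a not in base.assets]

# Only the French audio and subtitles travel; the shared video master is reused.
print([a.asset_id for a in supplemental_assets(us_version, fr_version)])
```

This is roughly the role supplemental packages play in real IMF workflows: a new version arrives as references plus whatever assets the recipient does not already hold, rather than as another full copy of the master.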

“Since 2012, lots of important things have happened with IMF,” explains Lemieux, a partner at Sandflow Consulting, where he focuses on entertainment content distribution, and who chairs several other SMPTE committees as well. “In addition to the specifications that make up the Core of IMF, two applications have been published, including the big one, Application 2, which is designed for studio masters and uses JPEG 2000 for video coding. That means IMF exists today and is ready to be used. We also held the most recent of our series of Plugfests in March [of 2015], kindly hosted by Netflix, with about a dozen manufacturers actively exchanging content and reporting on their findings. Those events have been crucial in sharing information and identifying implementation issues. Now, the primary effort within the IMF community is twofold. One aspect is to revise the specification based on ongoing feedback from the field, which is important in keeping any standard alive: being able to actively react to feedback from implementers. That is being tackled right now as part of IMF 1.1. Additionally, one new application, Application 4, is being developed, called ‘Cinema Mezzanine,’ a joint project from collaborators in Europe aimed at preserving cinema material for historical purposes.

“But the adoption of IMF will be ongoing and different for every potential user. It is a foregone conclusion that the world is moving toward a convergence of audio-visual and IT. That is happening today, but how soon different people convert to IT-based models for audio-visual content processing and management depends on many different factors. Obviously, IMF’s sweet spot is version management of high-quality masters. IMF might not be the best fit for those who do not yet need to manage versions or to receive content from those who do. There are other options that focus on single-version distribution, like UK DPP [an MXF-based interoperable format from the UK’s Digital Production Partnership], which is being adopted in the UK, uses many of the same technologies as IMF, and removes some of the complexity by not managing versions. So the question of the rate of adoption has no simple answer. Certainly, one notion that has clearly changed in the last few years since IMF debuted is the realization that file-based workflows and version management are not just a fad; they will only grow in importance. Not everyone changes at the same pace, and so adoption will be ongoing.”
 
Lemieux elaborates that the importance and usefulness of IMF will logically only grow over time, and not only as a mastering file standard geared mainly toward traditional global content distributors like Hollywood studios. These entities were central to the original development of the specification in the mid-2000s, when it originated as a project at the University of Southern California Entertainment Technology Center (USC ETC), before being handed off to the SMPTE IMF work group (as discussed in Newswatch in 2011) around the time the original version of the IMF standard was being prepared for publication. That is because a lot has changed about the media landscape in the short period between 2011 and today, creating both new potential users and new potential IMF applications, according to Lemieux.
 
“It’s a reality that the media world is quickly changing,” he notes. “A few years ago, it was just the Hollywood studios that had the problem of managing large numbers of versions of their content for worldwide distribution. Now, more entities are facing this challenge: not only content aggregators like Amazon, Apple, Google, and Netflix, which have a worldwide footprint, but also [overseas broadcasters like] the BBC, whose content is now routinely exported worldwide. Content is being distributed worldwide, creating multiple versions, different cuts, different title cards, and so on. In terms of the actual technology, it turns out that many of the leading IMF vendors are based in Europe and Asia. And IMF uses international standards like MXF, TTML, and XML, all of which are deployed worldwide. In other words, the foundation of IMF consists of standards that have worldwide penetration, and things are continually moving fast.

“And that is why the modularity feature is so important. IMF is really a family of standards that consists largely of two layers. The core layer is about 95% of it: all the technology that is common across the different IMF applications. The idea behind the core layer was that we kept that layer the same for everybody and then allowed the differences needed for specific domains to be specified separately; these are the IMF applications. There is 95% commonality across all applications. This makes it easier for implementers to support multiple IMF applications and to develop new ones, because only a small portion of any new application needs to be developed and written [from scratch]. In other words, the infrastructure part does not change. What does change are the image characteristics tied to image capture and specific applications, but that involves only about 5% of the spec, maybe even less. So IMF is architected to evolve rapidly without tearing the floorboards out.”
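As a rough illustration of the split Lemieux describes, the sketch below keeps a shared core check untouched while each application contributes only its own essence constraints. The profile names and rules are hypothetical stand-ins, not the constraints of the published applications.

```python
# Illustrative sketch of a core/application split; names and limits are invented.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TrackFile:
    kind: str       # "video", "audio", "subtitles"
    codec: str      # e.g. "JPEG 2000", "HEVC"
    bit_depth: int

# --- Core layer: identical for every application --------------------------
def core_checks(track: TrackFile) -> List[str]:
    """Checks shared by all applications (wrapping, packaging, metadata)."""
    errors = []
    if track.kind not in {"video", "audio", "subtitles"}:
        errors.append(f"unknown track kind: {track.kind}")
    return errors

# --- Application layer: the small part that differs -----------------------
APPLICATION_CONSTRAINTS: Dict[str, dict] = {
    # A studio-master-style profile built on JPEG 2000, in the spirit of Application 2.
    "app2-like":        {"video_codecs": {"JPEG 2000"}, "max_bit_depth": 16},
    # A hypothetical broadcast-oriented profile that also admits HEVC.
    "broadcast-sketch": {"video_codecs": {"HEVC", "JPEG 2000"}, "max_bit_depth": 10},
}

def application_checks(track: TrackFile, app: str) -> List[str]:
    """Only the essence rules change between applications."""
    rules = APPLICATION_CONSTRAINTS[app]
    errors = []
    if track.kind == "video" and track.codec not in rules["video_codecs"]:
        errors.append(f"{track.codec} not allowed in {app}")
    if track.bit_depth > rules["max_bit_depth"]:
        errors.append(f"bit depth {track.bit_depth} exceeds {app} limit")
    return errors

video = TrackFile("video", "HEVC", 10)
print(core_checks(video) + application_checks(video, "app2-like"))
# -> ['HEVC not allowed in app2-like']; the same core_checks run unchanged.
```

Supporting a new application in this toy model means adding one entry to the constraint table; the core path is reused as-is, which mirrors the roughly 95/5 split Lemieux describes.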
 
This means, for example, that new IMF applications may well gain significance for broadcast-first entities, even though the specification was originally designed for higher-bit-rate masters than broadcasters have traditionally required. Lemieux points out that bit-rate reduction has not been a priority until recently, since IMF’s original focus was on studio masters and preservation work, “where quality trumps bit-rate reduction and JPEG 2000 is already deployed.”
 
“Plus, by merely removing the need to have multiple copies of exactly the same video [master], we were already reducing the bit rate significantly by focusing on studio masters, and that was a huge gain,” he adds. “But in more bit-rate-sensitive applications like broadcast, or where video codecs other than JPEG 2000 are established across the production chain, I think we might see new applications that include codecs offering lower bit rates, like HEVC, for example. After all, if you look at how IMF has been managed so far, it has really been based on user interest. So if there is a need for that, then I would expect it to eventually be included.”

Similarly, IMF’s relevance in archival applications is likely to grow over time, Lemieux suggests, specifically for near-line archives, or what he likes to refer to as “putting content on the shelf for future use, as opposed to complete disaster recovery archives, in which case, film strips into salt mines remain an option.”

“But for the more prosaic day-to-day archive, I would think IMF is a reasonable idea,” he suggests. “That’s because IMF is designed for the highest-quality master or version possible. Long-term archives, of course, require other ingredients as well, but IMF can be a logical component for sure. After all, many of the components within IMF are standard tools and formats. I mean, it’s hard to imagine people not being able to recover XML, for example, in the near future. And JPEG 2000 is a broadly deployed format, while the audio is baseband, with no encryption. So really, you are just talking about using widely deployed formats in the most efficient way possible, formats that a large number of people should be able to read, write, and understand. Having the most people possible able to understand and use the data is, after all, probably the best defense against not being able to recover it.”
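As a small illustration of that point, the structural metadata in an IMF package is plain XML, so even far in the future a stock parser can inventory what a composition references. The element names below are simplified placeholders for this sketch, not the actual ST 2067-3 schema.

```python
# Sketch: inventory a (simplified, hypothetical) playlist with the standard library.
import xml.etree.ElementTree as ET

playlist_xml = """
<CompositionPlaylist>
  <ContentTitle>Example Feature (FR broadcast)</ContentTitle>
  <ResourceList>
    <Resource id="urn:uuid:0001" kind="video" codec="JPEG 2000"/>
    <Resource id="urn:uuid:0003" kind="audio" codec="PCM"/>
    <Resource id="urn:uuid:0004" kind="subtitles" codec="TTML"/>
  </ResourceList>
</CompositionPlaylist>
"""

root = ET.fromstring(playlist_xml)
title = root.findtext("ContentTitle")
for resource in root.iter("Resource"):
    # Each line lists one referenced track file and how it is coded.
    print(title, resource.get("kind"), resource.get("codec"), resource.get("id"))
```

Because every format involved (XML, MXF, JPEG 2000, PCM audio, TTML) is widely deployed, this kind of inventory does not depend on any single vendor’s tools, which is the recoverability argument Lemieux makes.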
 
This leads to a larger point Lemieux emphasizes, IMF’s human element: the aforementioned notion of evolving IMF “based on user interest.” Along those lines, he calls SMPTE 35PM-50 “the real nexus for IMF these days, and the best place for anybody interested in IMF to get involved. The study group is where all the Plugfests and interoperability efforts originated, and that user input and communication directly impacted the standard and what we are doing with it today. These things don’t happen randomly. We are encouraging people to participate. That is the big advantage of a standard being developed by an international body: the ability for everybody to participate, to have an environment or ecosystem where people can discuss changes and new features. Part of the problem with achieving interoperability is when people do things on their own and don’t share with others. We have tried to create an environment where people can feel comfortable discussing their applications and their problems and coming up with joint solutions. When they do that, there is a greater chance of interoperability. The corollary to all that, of course, is that nothing will happen if nobody participates.”
 
For a comprehensive look at recent developments involving the IMF standard from Application 2 forward, check out a replay of the SMPTE IMF Standards Update webcast presented in March 2015 by Annie Chang, Disney’s VP of Post Production Technology.

News Briefs

Key Step for ATSC 3.0
The Advanced Television Systems Committee (ATSC) recently announced that development of the next-generation ATSC 3.0 broadcast standard has taken an important step forward, with the first of five components of the ATSC 3.0 Physical Layer transmission standard elevated to ATSC “Candidate Standard” status. The ATSC Technology Group’s announcement this month declared that the “System Discovery and Signaling” technology had attained Candidate Standard status, meaning that the process for approving the so-called “bootstrap signal” piece of the ATSC 3.0 Physical Layer is now officially under way. ATSC officials expect the core elements of the physical layer, including its modulation system, error-correction algorithms, constellations, and more, to be balloted for Candidate Standard status sometime this summer. The importance of ATSC 3.0 as a terrestrial television broadcast standard capable of advanced functionality in the digital era cannot be overstated: it will be the pathway over-the-air broadcasters use for the foreseeable future to deliver UHD content, immersive audio, and more to various types of digital viewing platforms. Following the announcement, TV Technology published a pair of articles offering further analysis.

Hollywood Cybersecurity
A major theme reverberating through the recent Hollywood IT Summit was cybersecurity for major content creators and distributors. As Variety’s recent coverage of the event indicates, this is no surprise, given the impact that last year’s cyberattack on Sony Pictures Entertainment had on the entire entertainment industry. According to the article, attendees at the event indicated that studios and production companies large and small are engaged in almost unending “soul searching” about how to avoid similar events, or at least prepare to respond to them if they occur. Sources who spoke with Variety’s Andrew Wallenstein said high-level preparations are being made at virtually every major media company: security risks are being assessed daily; threat modeling and attack-mapping are routinely conducted; new encryption techniques are being developed; and new security groups are being formed and security funding secured, among other measures. One expert quoted in the article said he is “starting to see a retrenchment” in the industry and expects to see “more logging, watermarking, and session-based visibility of documents, budgets, and schedules” than ever before. However, as the Sony affair illustrated, one of the key problems at large media corporations can involve senior management, and so the article suggests the industry is also examining “new protocols” requiring senior members of media companies to follow stricter security measures than were common before the Sony cyberattack.

OLED Rising?
Display expert Ken Werner recently examined ongoing rumors across the display industry that a revival of sorts for OLED televisions is gaining steam. Werner detailed the trend in a column on the website HDTVexpert.com, pointing to a variety of factors that suggest reports of OLED’s demise may have been premature. Among them: LG Display has recently been increasing its panel production, while LG Electronics has modestly been decreasing prices on certain OLED televisions. At the same time, Werner points to industry details that he believes confirm a stream of rumors about Samsung re-entering the large-screen OLED business using a version of the Kodak-LG color-by-white technology, rather than the RGB technology Samsung had used in the past for small and medium displays. Werner goes on to detail how recent events indicate that Samsung plans to commit to some kind of color-by-white methodology for large-screen OLED displays, and so he concludes that, indeed, “Samsung is returning” to the OLED landscape after all.