Hot Button Discussion: Private Clouds for Production
Of the many industries leaping headlong into cloud computing, few have more potential use for the concept than media companies. At its foundation, the term "cloud computing" simply refers to locating an organization's data servers remotely, in what are often called hosted data centers. The common view revolves around the public cloud, where data is stored and accessed remotely via the Internet, far from where the work is being done, without the people doing that work necessarily knowing, or caring, where it resides. But where sensitive, data-centric industries are concerned, the "public" concept breaks down. Governments, financial institutions, and giant corporations, major media companies among them, care very much about where their data resides, who is using it, how, when, and under what conditions.
Such institutions also have to grapple with enormous data volumes, streams of giant files, and ultra-sensitive security requirements. For them, the cloud concept has become a private rather than a public affair. According to Spencer Stephens, Senior VP of Technical Services at Sony Pictures, private clouds qualify for the "cloud" label whether the servers are remote or local, because their location remains irrelevant to the data's users on a daily basis. Indeed, Stephens says private clouds are little more than "enablers for other work to be done elsewhere." Stephens has been an avid participant in the ongoing conversations at Sony Pictures, and across the industry, about private cloud services in recent years. In fact, he presented a paper on cloud services for media production at SMPTE's Annual Technical Conference in October.
"If you look at the nature of production—the structure that produces a movie is not just a studio," he says. "It's also indie production companies, post houses, audio houses, all sorts of big and little companies coming together, some only for the duration of the project. That brings up the issue of integration—you need to integrate their systems and material even if the content is being stored on various islands. So (content creation companies) need cloud solutions enabled with service-oriented architecture (SOA) services."
Thus, Stephens explains, there are tiers of private cloud approaches that media companies can utilize. They can use their own servers in their own data center. They can own servers in a co-location facility. They can lease, but manage, servers; lease fractional servers; or use Internet-connected servers—what Stephens calls "pure clouds."
But, in all these cases, how media companies get there is far from standardized; numerous options exist for server hardware, production software tools, security and management software, delivery pipes, and other tools. Data formats, of course, are the exception: rapidly evolving work on the Interoperable Master Format (IMF) and similar efforts promises to offer the industry a uniform format for digital masters that can serve as the foundation of any downstream deliverables media companies may create. In many other ways, however, it is a question of making the right choices from a growing menu of options, depending on an institution's needs.
"We are in a period of transition as we move more toward web services for all this," Stephens says. "There is no real need to do data storage in a standardized way as long as the workstations or work-in-progress servers can access that storage through standard protocols. Therefore, the interfaces to the cloud need to be standardized, but that might just mean FTP, faster clients, and so on.
"But each (institution) needs to use some sort of SOA to allow for service abstraction. Ideally, we can open up web services for specialized use, but we can do it through simple interfaces, like the idea of 'watch folders.' My process would watch certain folders on cloud storage. You drop your work in there, and my process picks it up. So you don't necessarily need a massive standardization effort to make this work. There are ad hoc solutions already available until web services can be deployed."
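The watch-folder pattern Stephens describes is simple enough to sketch in a few lines. The following is a minimal, hypothetical illustration using only the Python standard library and a polling loop, assuming the "cloud storage" is visible as an ordinary mounted directory; the folder names and handler are invented for the example, and a production system would add locking, error handling, and checks that a file has finished transferring before processing it.

```python
import shutil
import time
from pathlib import Path

def poll_watch_folder(watch_dir, done_dir, handle, interval=5.0, passes=None):
    """Poll a shared 'watch folder' for dropped files, process each one,
    then move it to a 'done' folder so it is not picked up twice.

    passes=None polls forever; an integer limits the number of scans."""
    watch, done = Path(watch_dir), Path(done_dir)
    done.mkdir(parents=True, exist_ok=True)
    count = 0
    while passes is None or count < passes:
        for item in sorted(watch.glob("*")):
            if item.is_file():
                handle(item)  # e.g. kick off a transcode or checksum job
                shutil.move(str(item), str(done / item.name))
        count += 1
        time.sleep(interval)
```

As the quote suggests, the appeal is that the two parties only have to agree on a folder and a file format, not on a shared API; the storage protocol underneath (NFS, FTP-synced mirror, object-store gateway) is invisible to both processes.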
Stephens emphasizes there are also a variety of paths for connecting to clouds, including IP-network WAN services, metro wavelength-division multiplexing (WDM) optical networks, and extended Fibre Channel and InfiniBand storage area networks, among others.
Whatever the components, it is becoming increasingly clear that major media companies can benefit from the private cloud approach, provided they proceed cautiously. Stephens' employer, Sony Pictures, has done so with its Digital Backbone initiative, which launched in 2009. Essentially, Digital Backbone is a series of private cloud services, related networks, data management technologies, and various other hardware and software platforms that address two major "backbones" of the company's business: production, relating to data asset creation, and distribution. It is all part of Sony Pictures' strategy to build a seamless, comprehensive infrastructure for the production, management, and distribution of reams of media assets on a daily basis.
Frequently, though, studios are investing in and building major components of that infrastructure to very particular specifications, often borrowing ideas, technology, and expertise from the IT world. Some highly advanced technology development went into Digital Backbone, for instance. One of those technologies is Sony Electronics' Media Backbone Conductor platform—an SOA system that was adapted from the IT industry. Conductor is an open platform, designed to make workflows visible while integrating different technologies from different manufacturers so that the studio can seamlessly plug disparate tools together, as necessary. Conductor also includes Sony's Media Bus Management technology, enabling it to support and manage high-resolution AV files.
That's only one high-profile example of what is going on in the industry. The point, Stephens says, is that sophisticated private cloud development is a particularly exciting concept in the media space—uniting technologies from various industries in new, interesting, and creative ways.
"What really governs how we put together a cloud service in the production world is economies of scale," he says. "We want the ability to expand as much as possible, regardless of geography. We were handicapped by massive data rates; in an industry where an uncompressed 4K movie accounts for over a gigabyte per second, that is a lot of data to move over a distance, but (transmission technologies) have improved. Still, data rates affect where you put the cloud and how you manage it. (In the media world), we are focused entirely on cloud services for production right now. But our assets are so precious, and our risks are high if there is a security breach and our important data is not recoverable. Private clouds give us a way to be more efficient and still keep it all under control."
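Stephens' gigabyte-per-second figure is easy to sanity-check with back-of-the-envelope arithmetic. The frame size below is 4K DCI (4096 x 2160), and the 16-bit RGB depth and 24 fps rate are assumptions chosen for the illustration; lower bit depths or chroma subsampling would reduce the rate, while higher frame rates would increase it.

```python
# Back-of-the-envelope data rate for uncompressed 4K picture.
width, height = 4096, 2160      # 4K DCI-style frame (assumed)
bits_per_pixel = 3 * 16         # RGB, 16 bits per channel (assumed)
fps = 24                        # standard theatrical frame rate

bits_per_second = width * height * bits_per_pixel * fps
gigabytes_per_second = bits_per_second / 8 / 1e9
print(f"{gigabytes_per_second:.2f} GB/s")  # ~1.27 GB/s
```

At roughly 1.27 GB/s, a single uncompressed stream already exceeds a 10 Gb/s link, which is why the quote treats raw data rate as a constraint on where a production cloud can physically sit.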
To learn more about the business, trends, and issues involved with cloud computing, check out Joe Weinman's website. Weinman maintains a personal site filled with articles, white papers, and scientific and business information and analysis about the evolving cloud phenomenon.
Devoncroft Market Research, which examines trends in digital media, is running a series of articles about the most important trends affecting the commercial success of today's broadcast industry. Part one of the series supplied data concluding that the conversion to HDTV operations remains, in fact, the No. 1 trend consuming the time and resources of broadcast institutions and professionals. Part two, released this month, concludes that the second most crucial trend is the shift to file-based operations. The main point is that the industry has inexorably concluded that a file-based infrastructure and workflow approach does, in fact, increase speed and efficiency in the production of broadcast media content. A host of other factors playing into the situation are examined in the report, which also lists the top technology vendors playing central roles in helping broadcasters make the transition.
With high-profile web hacking and cyber war/cyber terrorism incidents on the rise, the U.S. Department of Homeland Security (DHS) has reportedly begun to deploy security measures to protect the so-called Border Gateway Protocol (BGP), the Internet's core routing protocol. A recent report from Network World suggests DHS is aggressively trying to make government agencies and their carriers the first adopters of these new measures, which are the result of millions of dollars and several years of research, and which include the new Resource Public Key Infrastructure (RPKI) system created by the agency. RPKI is designed to improve routing security by adding new layers of encryption to traffic between Internet registries and network operators, helping to prevent Internet routing attacks. The Network World article includes an interesting interview with Doug Maughan, director of the Cybersecurity Division at the DHS's Science and Technology Directorate, which explains the RPKI initiative in detail.
Rich Karpinski's blog at the Connected Planet website analyzes a recent report by the research firm IDC on the massive growth of mobile app downloads, and their corresponding revenue, in a year when most other sectors of the economy took a plunge. The report puts app downloads in 2010 at around 10.9 billion worldwide and, more importantly, predicts the number will skyrocket to about 76.9 billion by 2014, translating to about $35 billion in revenue over that period. Karpinski, however, argues that the analysis raises the question of what, exactly, a mobile app is to begin with. If "apps" come loaded onto tablet devices, are they "mobile apps"? Will "mobile apps" move over to televisions and other devices? Karpinski ponders these and other interesting questions as app madness proliferates across the globe.