Hot Button Discussion
By Michael Goldman
As 2012 matures, the cloud computing trend continues to impact media companies in numerous ways, as reported in SMPTE Newswatch late last year. Exactly how these companies are using the cloud, or might use it, remains an unsettled issue given the unique nature of the product they produce: high-bandwidth, high-resolution moving images and associated metadata that, taken together, constitute extremely valuable intellectual property requiring rigorous security and backup. Such a product does not fit easily into many of the existing cloud paradigms developed in the IT world for the other industries now turning to cloud service providers (CSPs). Thus, media companies continue to rely on private networks and proprietary data infrastructures of various types to produce, manage, protect, back up, and distribute their audio/video content.
That said, studios, broadcasters, and other media companies remain, at their core, businesses, and sometimes very large ones. Many aspects of their operations are no different from those of companies in other industries, which means existing CSP paradigms do, or can, work in certain sectors of a media organization. Cloud providers and their various security practices and protocols are therefore growing more important to such businesses every day, according to Al Kovalick, founder of Media Systems Consulting and a longtime technical strategist and designer in the field of hybrid AV+IT systems. Kovalick has worked for Hewlett-Packard, Pinnacle Systems, and Avid, and is a SMPTE Fellow and author who has written and lectured extensively about the merging of the IT world with the content-creation world. The state of reliable security for cloud-based systems has matured so much recently, he suggests, that media companies will find increasing use for cloud providers in the coming months and years.
"Keep in mind, we are only at about year five or six of the cloud," Kovalick says. "People haven't always known where it fits in (at a media company). You look at broadcast stations, and you say you will never playout to air from the cloud. But what about all the low-hanging fruit at the broadcast facility? What parts of that could be ported to the cloud? All businesses have those aspects that fit the cloud's usefulness. A major media facility has human resources, PR, finance, a web presence, scheduling, traffic, ad sales, billing, and so on. These facilities are multi-layered—the video side of the business, the business side, and the operations side. Media companies need to look at each layer and ask, is there anything in this layer that the cloud would be useful for? For on-air operations—probably not anytime soon. But for these other things? It's already happening."
Indeed, Kovalick suggests that service-oriented architecture (SOA) and software-as-a-service (SaaS) applications can seamlessly integrate into the operations of certain types of media companies. This includes some cloud-based remote operations (group sharing), file distribution, transcoding services, program streaming, third-party interaction, and much more, in addition to the "low-hanging fruit" aspect of the business. On the other hand, content creation, editorial, graphics, and long-term archival work are only partly viable today within a cloud model, he adds. And automating playback and general on-air operations, he suggests, likely will not port over to cloud service providers for some time.
But even for those other operations noted above, companies in the security-conscious media world are moving cautiously. Kovalick estimates there are now about 1,500 major CSP outfits, SaaS providers in particular, some of which, such as Kit Digital and Panvidea, cater primarily to media companies with online broadcast and distribution services, among other things. At the core of this growing industry, he suggests, is the urgent requirement to offer redundant and reliable security technologies, services, and techniques. Their job, he says, is to "keep the bad guys out," but that, he adds, is "a shared responsibility."
"If I'm a media company, the cloud service can clearly do certain things (on the security front) better than me," he says. "For example—denial of service attacks. I don't want to deal with a flood of malicious service requests, but the provider can. On the other hand, if I want to open certain parts of certain applications to certain users—that is something I want to do myself, rather than having the provider manage that for me. So there is an application layer of security that I would manage, while the CSP operates the infrastructure security layer. They keep most bad guys out, but after someone is in, it becomes the client's responsibility to deal with app or service access. So it's a shared responsibility in total."
The sophistication and technological capabilities of high-end CSPs, Kovalick suggests, make the irretrievable loss of precious data increasingly rare in the modern era. "They make two, three, four copies of all data and store it in different data centers in different parts of the country or the world," he says. "Because of this, they will likely always be able to replicate my data in time, even if (disaster strikes). The issue is what window of delay there will be, more than whether I'll get my data back. You pay for it, of course, but when you do, you get multiple copies of your data over a wide geographical reach."
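A toy model makes the geo-redundancy Kovalick describes concrete: every write is replicated to several simulated "regions," so an object survives the loss of any one data center. The region names and the in-memory stores are purely illustrative stand-ins for a provider's real storage layer.

```python
# Toy model of geo-redundant storage: each object is copied to every
# region on write, so a read can be served from any surviving region.
# Region names are invented for illustration.

stores = {"us-east": {}, "eu-west": {}, "ap-south": {}}

def put(key: str, data: bytes) -> None:
    # Replicate every write to all regions.
    for region in stores:
        stores[region][key] = data

def get(key: str) -> bytes:
    # Read from the first region that still holds the object.
    for store in stores.values():
        if key in store:
            return store[key]
    raise KeyError(key)

put("master.mov", b"video essence")
del stores["us-east"]["master.mov"]  # simulate losing one data center
print(get("master.mov"))             # still recoverable elsewhere
```

The "window of delay" Kovalick mentions corresponds, in a real system, to the time it takes to re-replicate or fail over to one of those surviving copies.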
Kovalick says the IT world is engaged in an ongoing process of developing and honing best practices for data-security management inside data centers, practices that, in his opinion, cloud providers "will have to be compliant with." Central to that initiative are Service Organization Control (SOC) auditing reports, which help organizations dealing with large volumes of data ascertain the risks associated with a given provider. These reports are spearheaded by the American Institute of CPAs (AICPA), which recently published a new guide, dubbed SOC 1, offering guidelines for reporting on the controls of cloud-based organizations. For example, in early March, a major cloud service provider, GoGrid, completed its SOC 1 audit, as explained in this news report at the Cloud Computing website, and the company is now touting the successful audit as evidence of its commitment to the highest security protocols and data-management policies.
Beyond that kind of work, standards for creating hash values, data encryption, and coding and decoding files are, of course, nothing new. But just as important, Kovalick says, are general best-practice concepts and awareness on the part of media companies about what a particular provider's methods are.
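The hash values mentioned above give a media client a simple way to confirm that a file retrieved from a CSP is byte-for-byte identical to the one it uploaded. A minimal sketch using Python's standard hashlib module and SHA-256 (the file contents here are placeholder bytes, not real media essence):

```python
# Compute a fingerprint (hash value) at upload time, then recompute it
# after retrieval; matching digests confirm the file was not altered
# or corrupted in transit or at rest.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

uploaded = b"program master, reel 1"
stored_digest = fingerprint(uploaded)   # kept by the client at upload

retrieved = b"program master, reel 1"   # later fetched from the CSP
assert fingerprint(retrieved) == stored_digest  # integrity confirmed
```

For multi-gigabyte media files the same idea applies, with the hash computed in chunks rather than on one in-memory buffer.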
"If you have a file in the cloud and you delete it, does the provider really delete it? The best practice is not to just delete it, but to zero out the data," he says. "Most vendors zero out the data when you say delete. That is an example of a best practice, rather than a standard, and understanding them is very important (in choosing a CSP)."
Data management companies are addressing and improving these practices daily, including many that cater to media clients. Intel, for example, published, in late 2011, an extremely comprehensive Cloud Security Planning Guide for clients to understand how to work with providers to make their cloud system as bullet-proof as possible. The document talks about setting security policies, layering technologies, sharing responsibilities, the changing nature of data threats, the latest on evolving architecture technologies, the regulatory environment, how to identify vulnerabilities, advice for IT managers, data encryption tips, and much more.
Further, an ongoing effort is under way via a not-for-profit organization called the Cloud Security Alliance (CSA) to promote existing best practices for securing data in the cloud, develop new ones, and educate various industries about the issues involved in cloud computing. The CSA will host a conference in Frankfurt, Germany, in May, designed to educate attendees on cloud security and privacy issues and developments. You can find out more about that event, Secure Cloud 2012, here.
Still, Kovalick and others are cautious when asked to prognosticate about whether cloud computing can take the next steps in coming years and evolve into a secure, long-term repository for archiving data, and eventually, even for doing editing and other forms of remote collaboration work online.
"Good cloud providers already do data migration, and I'm sure they can do it for 100 years if they are still around," he says. "But do you trust that provider? Will you trust them in 20 years? I'm not sure media companies will deposit their golden assets there longterm in that sense. But on the other hand, consumers are doing it—many of us have our photos or email stored with Google, Apple, Microsoft, or another provider. And, as the technology gets better, I'm sure the cloud could end up being a better place to keep longterm data compared to a salt mine. Will it replace a filmout stored in a vault somewhere? It could work some day. Maybe you would use different cloud providers, two for the belt and two for the suspenders, so to speak. If you had triple mirrored protection for each provider, maybe six versions of your content, I'd think you would be able to find the content even if something drastic happened to one of them. That will probably be something to take advantage of someday. But I doubt we'd see the cloud used for live production or on-air operations in the near term."
The cool thing about the cloud and SOA, Kovalick adds, is the wide range of interesting online applications being developed with relevance to media creators, including some that seem to show great promise for non-live production and collaboration. He points to Avid's Interplay Central online media editing and management initiative which, he says, although designed to execute from a data center, could potentially work seamlessly and securely with a cloud service provider for certain types of projects, such as news videos and content. "(Avid) designed (Interplay Central) to run in a data center, but the cloud is just as good as a server in the basement from the point-of-view of the people using it," he says.
An important recent digital camera acquisition announcement to hit the industry was Canon's official introduction of its new EOS 5D Mark III DSLR camera earlier this month. With DSLR camera technology pushing further into the moving-picture world than anyone anticipated a couple of years ago—including major feature film work done with the Mark III's predecessor, the Mark II—industry pundits were quite interested in the camera's long-anticipated arrival. Analyst Barry Braverman was among the first to examine and comment on the Mark III earlier this month for the Studio Daily website, and you can find his initial impressions here. Braverman suggests that, with the new camera, Canon has addressed many of the Mark II's limitations as they relate to motion-picture use, including a two-stop improvement in low-light capability, improved sampling and processing capabilities, better native resolution, better on-chip noise reduction, and much more.
Broadcast technology veteran Randy Hoffner recently penned an interesting piece for the TV Technology website pondering the future of computer processing chips if and when they need to evolve beyond the current silicon semiconductor model. His column explores the history of the transistor; the later arrival of silicon semiconductors to power all sorts of electronic devices; and then the arrival of Moore's Law, which describes the routine shrinking of electronic components on integrated circuits on a stable, ongoing basis. But Hoffner suggests the electronics world may be approaching a point where Moore's Law begins to max out, as existing chip designs simply run out of room to continue adding components in an energy-efficient way. Thus, he speculates, something else will eventually have to replace silicon-based semiconductors once they officially hit a wall. One idea he explores is the use of carbon nanotube-based transistors as, potentially, a "next step" before something even more radical, and as yet uninvented, takes hold.
Meanwhile, though, semiconductor technology isn't exactly giving up the game just yet. IBM recently revealed a prototype design for a new chipset that it claims is capable of transmitting data at speeds of up to a terabit per second. According to a recent article at the TechNewsWorld site, what IBM has dubbed the "Holey Optochip" is a technology that utilizes optical pathways to increase data transmission speed—a development the company is touting as important for the growing data farm industry in order to let those facilities improve app and video download times, among other things. The chip's design is reportedly based on traditional compact silicon chip design, but combines that with optical pathway transceiver methodologies. IBM says it has freed up more space for data to travel through holes, or "optical vias" as they are known, punched into the chip. The article claims the design could theoretically permit the download to a supercomputer of 500 HD movies in a second or two under certain circumstances. For now, such supercomputers in data centers will be the likely beneficiaries of the technology once it matures, according to the article.
Opinions expressed in SMPTE Newswatch do not necessarily reflect those of SMPTE. References to specific products do not represent an endorsement, recommendation, or promotion.