Hot Button Discussion
Broadcast Networking Infrastructures
By Michael Goldman
If you want to see a technological culture clash in action, look no further than the ongoing drive to modernize the networking infrastructures of typical broadcasting institutions. It is there, suggests broadcast media consultant and longtime SMPTE Fellow John Luff, that traditional broadcast techniques, IT technology, and financial pressures are routinely colliding these days.
Luff's point is that a dichotomy exists as broadcasters push to rewire and position themselves for the day when highly expensive broadcast-centric hardware exits the scene entirely, in favor of IT-based digital software and networking tools. The hope is that those tools will someday leave their infrastructures resolution-independent and, in theory, future-proofed, with broadband capacity being the only limitation on further evolution of their networks. Along the way, however, broadcasters have to make hard choices about which capabilities to spend their money on today, what data to store and manage, and how best to train staff and evolve their business culture for a world in which traditional content creation concepts will have to function under an information technology roof.
"The content creation side of the industry has not fundamentally changed at all," Luff says. "The process may now use different technology to facilitate what we do, but you still have to acquire images, edit, and publish them so that they can be transported downstream for viewing somewhere else. You can call that 'broadcast' or 'multimedia,' but it is not a fundamental change in terms of the creative process.
"At the same time, on the networking side, we have a flood of IT people coming in, and the IT industry has fundamentally transformed television, but they do not know a lot about content creation. At some point, you see how the industry can be split up into silos. We tend to think of this as a technology problem, but it's not--it's a business problem, a money problem. We can figure out ways to engineer solutions to any problems broadcasters have if there are good enough business reasons to do so. But it will take training and patience to make sure the next generation of industry professionals are savvy about new workflows and the content creation process."
The big technological change that Luff foresees for the broadcast industry is the eventual elimination of what he calls "purpose-built hardware for television broadcast use exclusively." He suggests, "We are currently in the last generation of purpose-built hardware, and I expect that it will go away entirely in the next decade."
In its place, Luff predicts entirely IT-based networks will be used to transport video around broadcast plants and distribute it to consumers. The reason for this shift, he adds, is not merely because of the larger IT revolution, or that it is now technologically possible. Rather, the reason is economics.
"A huge part of the industry's infrastructure has already made the shift," Luff says. "We are a small industry in terms of economic power on the hardware side of the business. The value of the industry lies in the value of the content we create. That value is potentially enormous, and a major segment of the economy. It is something consumers want to spend a lot of money on to support, but hardware manufacturers that make technology exclusive for broadcasters cannot make large profits on a small industry. The most obvious example of this is cameras. It used to cost between $70 and 100,000 for a [digital] broadcast camera head and another $50 to $100,000 for a lens package, which is more than $150,000 to deploy a professional digital broadcast camera system. Today, you wouldn't spend that much as a broadcaster if your life depended on it. Cameras now cost a fraction of that. There has been a huge shift [in] economics, so if you build cameras today, you need a market much bigger than just broadcasters."
Luff continues, "We adopted consumer electronics systems for much of what we do today in a broad sense for most production. Sure, there are still high-end facilities being built, but there is not enough business there to make an industry out of it, so that sort of thing is a specialized application. For most of us, [broadcast camera systems today] are built on the foundation of consumer technologies--that is how vendors make a viable business out of it." Therefore, he suggests, a similar evolution is taking place in broadcast facility and distribution networking technologies. The industry has no choice: it has to adopt IT tools--and embrace them.
Luff insists that even the typical areas of a broadcast networking infrastructure that have not yet fully evolved into the IT world will be doing so soon.
"Routing switchers for live production is the one place where this has not yet happened, but we are pretty close to it--no more than five years off," he says. "The reason it has not happened yet, like in these other matters, is money. Why design the next generation of HD broadcast switchers now, when you can put some 'secret sauce' on existing models and get more out of your old legacy hardware for another couple of years? That said, at NAB this year, I am told there was a demo for a video switcher from a major manufacturer with no BNC's at all--only Ethernet connections for different signals. That was an 'aha' moment for me, similar to my memory of seeing NHK's first HD camera and monitor at a SMPTE conference in the late 1970s. I felt then, that changed everything and I feel the same way now--it is only a question of time. Eventually, it will make more sense to design and integrate a software-centric infrastructure that runs through a rack of blade servers without anyone really caring what the hardware is. This represents a shift for broadcast engineers--they are starting to realize they should care primarily about what application runs on the hardware, more than the hardware itself."
Luff's argument is based on his belief that content creation and distribution will become more efficient if the transition is successful. Consumer television manufacturers, after all, are leading a push for 4K content in order to sell 4K televisions. That means "fully interchangeable future networks" will eventually be the way to go, because broadcasters can then monetize 1080p content as they are doing today, and also protect themselves for when the world of 4K content dawns, rather than having to rebuild their infrastructures from scratch.
"If you think about it, specifically designing a 4K network with a 4K routing switcher on top of the architecture of today's video switchers would be enormously expensive to maintain, and it would consume lots of power," Luff explains. "But if you switch your infrastructure to something that is easily interchangeable for future image formats, then you have crafted something that can go 4K, 8K, whatever frame rate or standard you want. In such a network, a 1080p editing platform today can easily be 4K tomorrow. That part of the system is already resolution-independent for the most part. You could build a distribution structure that does not care what resolution the content is, just where it is going, and that will be a matter only of bandwidth for each distribution circuit."
The goal is to have a resolution and frame-rate independent infrastructure. Although this would obviously mean big changes for hardware manufacturers, Luff believes the end result would be to "solidify our industry economically."
Thus, he argues, even with today's perilous economy, "now is a great time to future-proof. We have developed a stable industry that [provides] HD production, distribution, and delivery to the home already. So if you can build out your infrastructure based on software, to extend to 4K and beyond, then you will never have to worry about a new '4K infrastructure' or '8K infrastructure' in the future. If we can build an infrastructure that does not care if we move 8K images as easily as 4K or 1080p images or standard def images, then we will reach a wonderful place in our industry."
Challenges, of course, include the aforementioned issue of bandwidth, for which Luff sees numerous solutions on the horizon. A more complicated challenge, he suggests, will be data storage, archiving, and management--not because good technology is not being developed for these tasks, but because no amount of technology will be able to keep pace with the sheer volume of broadcast data that already exists and is being generated every day.
Again, he believes smart business choices will be the ultimate solution for broadcasters in dealing with this issue.
"Developments like [the Interoperable Master Format] are being discussed for [metadata control and file management], and they have important potential at the high-end content creation level," he says. "That will mainly solve problems for the motion-picture industry and the high-end episodic broadcast side; however, that is not where most content resides. Most content, in terms of the total volume created each day, comes from broadcast and cable news, entities like big networks all the way to local TV stations and news services. If you examine the total minutes of content created each day, I would suspect it would overwhelm the production industry by a huge percentage. Most cameras are out there shooting news, and news does not need IMF, to be blunt. IMF won't enhance anyone's news or content creation strategy in a significant way."
"When we talk about archiving, the good news is that we are more concerned about it today than 50 or 60 years ago. With the total volume of content we create today, archiving, managing, and storing it all is a huge issue. This probably could be solved with enough money, but we know, that is the problem. People won't invest dollars without an economic reason to preserve that content, and the Cloud does not solve this problem, it just moves it to a different location. The data, if you want to keep it, has to be stored somewhere. Therefore, broadcasters from networks to local stations will have to decide which versions of broadcasts to save and which ones to let go. They will do that based on economics. Like anything else, it comes down to who has the money. The technology will be there to help them store whatever they decide is worthwhile to store. They just won't be preserving all of it, and it doesn't make much sense for them to try."
Luff emphasizes that the industry is working hard to investigate the best ways to use IT-based networks for professional media creation and distribution, and to educate professionals in both the broadcast and the IT worlds about how the industries need to come together. The SMPTE-EBU Joint Task Force on Networked Media, for instance, ramped up in March and is now examining how the industry might better package uncompressed HD or SD data in various combinations so that it can travel more efficiently over IP networks within a broadcast infrastructure. As industry consultant Wes Simpson recently reported in TV Technology magazine, various other standards groups are also examining this issue and the larger general question of video over IP networks [see the June 2013 SMPTE Newswatch for more on Web-based broadcasting trends].
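To get a feel for what packaging uncompressed video for IP transport involves, consider the packet rate a single stream generates. The numbers below are back-of-envelope assumptions for illustration (a nominal UDP payload under the standard Ethernet limit, and an uncompressed 1080p60 rate); the actual standards being developed define their own exact payload framing.

```python
# Back-of-envelope: how many IP packets per second does one uncompressed
# HD stream generate? The payload size is an assumption for illustration;
# standard Ethernet frames carry under 1,500 bytes of UDP payload.

PAYLOAD_BYTES = 1400            # assumed media payload per packet
HD_RATE_BPS = 2_488_320_000     # uncompressed 1080p60, 10-bit 4:2:2

packets_per_second = HD_RATE_BPS / (PAYLOAD_BYTES * 8)
print(f"{packets_per_second:,.0f} packets per second")
```

At roughly 220,000 packets per second per stream, the routing fabric indeed cares only about moving packets at rate, not about what resolution or sync structure those packets happen to carry--which is the premise behind the incremental transition Luff describes next.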
"The fruits of this work might start showing up about 24 months from now," says Luff. "When standards are in place that will remove the need to rely on video routing switchers to a large extent within broadcast institutions. It will also facilitate resolution independence. After all, the routing structure only cares about the amount of data packets passing through it--not about vertical synch or resolution or such. This can permit an incremental change to happen over a period of time without being disruptive to the broadcast industry, so that is where we are going."
As important as the technology, however, is the human element--helping industry professionals comprehend the cultural divide between broadcasting and IT. Luff points to educational efforts around SMPTE and other industry organizations, including one he and Wes Simpson are spearheading--an ongoing seminar program called "Bridging the Broadcast/IT Gap: Video and IT Technology for Engineers." Among other things, the program guides participants through the design process of a model broadcast facility implementing IT technology and concepts.
"That is our next major issue--to make sure the next generation of industry professionals are savvy about not only the workflow, but also about the content creation process at a technical level, and how they inter-relate," Luff says. "People need to know what is good imaging and what is not, and how to use the tools to the best advantage. We want them to have the institutional knowledge of decades ago and be able to apply and sustain it into the future."
With bandwidth being central to the entire future of the IT-based broadcasting paradigm, the evolution of the High Efficiency Video Coding (HEVC/H.265) compression scheme is crucial (as discussed in the April 2013 SMPTE Newswatch). In a recent analysis in Broadcast Engineering Magazine, Kanaan Jemili, an executive at digital entertainment technology company Rovi Corp., postulated that HEVC is essential for both consumers and content creators as they take the next step into the IT-centric future of broadcasting. He also discussed the slow adoption rate of HEVC thus far and the business factors behind it. The primary factor is a lack of HEVC content and HEVC media playback devices on the consumer end. Jemili suggests that the key to breaking this chicken-and-egg dilemma is for software manufacturers to see the benefit in providing free or cheap tools for consumers to convert personal content to the HEVC scheme and view it that way. On the professional side, he recommends highly automated encoding methodologies to simplify the conversion of existing content to HEVC, among other proposals.
Don't Forget VP9
On the topic of new video compression schemes, Google has been in the news recently with its announcement that it has officially enabled its new VP9 video codec on its Chrome development channel. Google is promoting VP9, the successor to its earlier VP8 scheme, as a free and open-source alternative to HEVC, one that Google claims is capable of performing "slightly better" than HEVC and its predecessor, H.264. This claim has been hotly debated in recent months in various online forums. VP9's development and advancement, however, indicate the importance of compression as a solution to bandwidth logjams for broadcast content distribution in the modern era and going forward. Analyst Jan Ozer examined HEVC back in February on the Streaming Media website and, as part of that discussion, compared HEVC to VP8 and speculated about looming comparisons between HEVC and VP9--comparisons likely to become a hot topic in coming months.
Speaking of Google and the evolution of Internet companies generally, the Financial Times website recently published an in-depth look inside the new 41,000 square ft YouTube Space production studio, which Google has helped finance at its new complex in Playa Vista, near Los Angeles. Matthew Garrahan, who penned the article, was given an in-depth tour of the facility, which had been used for decades as a motion picture sound stage. He suggests the studio represents the official merging of Hollywood and Silicon Valley as content-producing entities. Under certain circumstances, the space is being made available, for free, to so-called "content creators" from YouTube's universe--anyone, potentially--to make videos, shorts, webisodes, and movies that could draw traffic and revenue to YouTube. The facility features three full stages, a green-screen studio, motion-capture studio, rehearsal studio, screening room, outdoor amphitheater, and audio and video post-production suites that include state-of-the-art digital equipment. In the article, Robert Kyncl, YouTube's chief of global content, claims the facility brings the best of the Web--openness--to Hollywood. Garrahan adds that this is not completely the case: YouTube is prioritizing time and spots for content creators who it thinks have the potential to create commercial, or at least traffic-producing, content.