SMPTE Newswatch

Hot Button Discussion

New Media Over IP Standard Foundation Set
By Michael Goldman  

Heading into the 2017 NAB Show, Al Kovalick, founder of Media Systems Consulting and a longtime technical strategist and designer in the field of hybrid AV/IT systems, predicted that tools and techniques geared toward IT-based broadcast production facilities would be “the buzz of the show.” Shortly after the show ended, Kovalick said his prediction had proved correct: “everyone was talking about IP,” and “we didn’t have nearly that much interest last year.”
 
In particular, Kovalick was taken with the IP Showcase, which, he said, included “a full video wall with 40 vendors contributing about 20 racks of equipment, filled top to bottom with IP senders, receivers, test equipment, and many multi-view monitors. The devices were all using the same [final draft media over IP] standard. It was quite an impressive interoperability demonstration.”
 
The final draft standard Kovalick referred to is SMPTE ST 2110, developed by SMPTE’s 32NF-60 Studio Video over IP Drafting Group, of which Kovalick is a member. It has not yet been published, but at press time its key components had moved out of the ballot stage. During NAB Show, manufacturers were able to demonstrate beta versions of compliant tools that are well into development. Demos such as the IP Showcase created the “buzz” Kovalick referred to, pushing the industry further along in its quest to achieve the interoperable, networked facilities he says are inevitable.

“The standard is far enough along that [companies] feel confident they can start beta testing and [make] plans for future work,” Kovalick says. “All the major manufacturers are already aware of the standard’s status. The ‘on-the-wire’ bit packing had not changed over the months leading to NAB Show, and so, the remaining areas [to be approved] involved language about how to refer to other standards—the minutiae, in other words. For that reason, companies came [to NAB Show] claiming to have products that were SMPTE ST 2110 compliant in some cases—they were that confident [about the status of the standard].”
 
Kovalick stresses the significance of this development: the emerging standard is the linchpin of the industry’s move toward IP-based infrastructure.

“Those building new facilities, at this point, are going to make them as IP-based as they possibly can,” he says. “No one is going to say a brand-new facility should be all SDI and AES3 audio—they will leverage IP as much as is practical with, for the time being, bridges to and from the SDI/AES3 worlds. So facilities will be filled with native IP technology—cameras, servers, monitors, and devices of all kinds with IP I/O. The idea is to be future-proofed for the IP-only world to come, and in the meantime, build bridges so I can work between the two worlds.”
 
Indeed, that vision will take a long time to be fully realized. Kovalick predicts a hybrid world will persist for 10 or more years, with the bridges he describes increasingly built in to permit SDI and AES3 audio connectivity for as long as necessary.
 
“The idea will be as much IP as possible, with SDI and AES3 on one side of the bridge, IP on the other,” he explains. “And in the meantime, all major companies are making bridges. They don’t have any choice. A lot of people must be able to spit out [SDI-based] media and then [put it into] the IP world. So, going forward, it’s bridge world, believe me.”

The SMPTE 32NF-60 work is based on the Video Services Forum’s (VSF) technical recommendation TR-03 (which uses UDP/IP to transport media) and will eventually comprise five primary sections: SMPTE ST 2110-10 (system timing and definitions), 2110-20 (uncompressed active video), 2110-21 (timing model for uncompressed video), 2110-30 (digital audio), and 2110-40 (metadata). At press time, -10, -20, and -30 had made it through the ballot process and were nearing publication.
 
“Proposed standards SMPTE ST 2110-20, -30, and -40 describe different kinds of essence flows,” Kovalick says. “The actual payload data [video, audio, metadata] is different, but they are all very much related. Without a doubt, this is the most important standards work at SMPTE with regard to IP.”

This suite of standards is a central piece of a larger industry-wide initiative to achieve interoperability for networked media, as illustrated by the Joint Task Force on Networked Media (JT-NM)’s Roadmap of Networked Media Open Interoperability, which projects the fundamental technologies, and a possible timeline, by which what the JT-NM calls “an interoperable multi-vendor system” can be built across the industry. These topics are also covered in a recent report by SMPTE Standards Vice President Alan Lambshead in the May/June issue of the SMPTE Motion Imaging Journal.
 
But the more basic question might be: why is all this happening? Why, exactly, is the industry forging ahead with an initiative to replace SDI, a technology that Kovalick acknowledges has historically been effective, stable, easy to implement, and affordable?
 
“When you look at SDI, it's all one link, a single wire essentially,” Kovalick explains. “Audio, video, metadata all on one wire, multiplexed together in a complicated stream. A receiver takes the stream and says, here is the video part, here is the audio part, here’s the metadata part.

“In the new world, there will be different data flows for various types of data. There will be an independent IP flow from sender to receiver just for audio, an independent flow just for video, and another just for metadata. Why are we doing that? What is wrong with having it all in one? Well, the nice thing about having independent flows is, you can duplicate or drop them separately. If you want audio but don’t need the video, you can steer them how you want, providing much more flexibility. As a receiver, you can receive audio, video, and metadata and put them back together again, so it is entirely lossless in that sense—they are separate essence streams. It’s an important concept and a completely different way of thinking about the media transport problem.
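
The flexibility Kovalick describes follows directly from how IP multicast delivery works: a receiver subscribes only to the flows it needs. The sketch below is a minimal Python illustration of that idea, assuming hypothetical multicast addresses and port numbers (they are not values taken from ST 2110); a receiver that only wants audio joins only the audio group and never sees the video packets.

```python
import socket
import struct

# Hypothetical multicast groups for two separate essence flows.
# In an ST 2110-style facility, each essence (video, audio, metadata)
# travels as its own stream; these addresses are illustration only.
AUDIO_GROUP, AUDIO_PORT = "239.1.1.2", 5004
# A video flow would use a different group (e.g., "239.1.1.1");
# a receiver that does not need video simply never joins it.

def open_essence_receiver(group: str, port: int) -> socket.socket:
    """Join one multicast group and return a socket for that flow only."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # Ask the network to deliver this group's packets to this host.
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Subscribe only to the audio essence; the video flow is simply ignored.
audio_sock = open_essence_receiver(AUDIO_GROUP, AUDIO_PORT)
packet, sender = audio_sock.recvfrom(2048)  # one UDP datagram of audio payload
print(f"received {len(packet)} bytes of audio from {sender}")
```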

“Also, when you get an IP connection, it’s bi-directional. Each end-to-end path supports the commonly used TCP and UDP protocols to transport data. I can send you video or control data [any data type], and you can send back audio or another video. IP is all about flexibility. So, you always have a bi-directional universal path between sender and receiver. You might not always use it, but it is technically there if you need it. Ethernet is, by definition, bi-directional—send and receive. SDI, on the other hand, is unidirectional—you can only send one way over the single wire, and that wire is dedicated to media.
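
The bi-directionality he mentions is visible at the socket level: one UDP endpoint on an IP network can both send and receive over the same path, something a single SDI cable cannot do. A minimal sketch, with a hypothetical peer address and port:

```python
import socket

# One UDP socket on one IP path: the same endpoint sends and receives.
PEER = ("192.0.2.10", 6000)  # hypothetical far-end address and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 6000))  # receive on the same port we send from

sock.sendto(b"video, audio, or control data going out", PEER)  # outbound
reply, addr = sock.recvfrom(2048)  # inbound traffic arrives on the same socket
print(f"got {len(reply)} bytes back from {addr}")
```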
 
“Now, admittedly, all this makes the world of IP more complex—you have routers, switchers, identification, IP addresses, etc.,” he continues. “Whereas, there is a simplicity to SDI that people like. I think that simplicity will give SDI some additional longevity. Plus, all the large companies still have large SDI businesses. And there are still [advances] with SDI, like the 12G standard, which is especially good for 4K video.
 
“But all that said, there is no turning back. IP in the facility will be the replacement for SDI. The point is that, for a long time, they will have to co-exist, as I mentioned, with these bridges between the two worlds—converter boxes and things. So, it’s not like SDI will be ‘turned off’ tomorrow. It’s just that new facilities have to be [IP-based] to be future-proofed.”

For now, therefore, this ongoing transition period has resulted in a variety of vendor groups and industry organizations developing a broad range of methods for mapping media over IP transport. Among them are the Alliance for IP Media Solutions (AIMS), which promotes open standards that help media companies move from legacy SDI systems to IP-based systems; the Adaptive Sample Picture Encapsulation (ASPEN) group, a consortium of companies that have developed their own format for packaging and transporting uncompressed video streams around IP-based facilities; the Advanced Media Workflow Association (AMWA), an open industry forum for media companies and vendors to promote networked media workflows; and the aforementioned JT-NM, a task force made up of several key standards organizations, including SMPTE, AMWA, the European Broadcasting Union (EBU), and VSF.
 
These and other organizations have launched initiatives to tackle specific challenges in transitioning media facilities to IT-based systems, such as registration and discovery—how to register networked devices, query such registries, and so on. AMWA’s Networked Media Open Specifications (NMOS) group has been working on problems such as this; for registration and discovery, the result is the AMWA IS-04 specification.
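
As a rough illustration of what discovery looks like in practice, the sketch below asks a registry which senders it knows about. The registry address is hypothetical, and the resource path follows the /x-nmos/query/&lt;version&gt;/senders pattern AMWA publishes for the IS-04 Query API; treat this as a sketch of the idea rather than a complete client.

```python
import json
import urllib.request

# Hypothetical address of an IS-04 registry's Query API.
REGISTRY = "http://registry.example.local"
QUERY_API = f"{REGISTRY}/x-nmos/query/v1.2"  # versioned path per the IS-04 pattern

def list_senders() -> list:
    """Ask the registry which senders have registered themselves."""
    with urllib.request.urlopen(f"{QUERY_API}/senders") as resp:
        return json.load(resp)

# Each sender record carries a label and points at its flow, which is how
# a receiver learns who is sending and what formats are on offer.
for sender in list_senders():
    print(sender.get("label"), "->", sender.get("flow_id"))
```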

“There are many things that people are working on,” Kovalick states. “The new SMPTE suite of standards is the basis, but you do need things like registration and discovery, for instance. You must have ways to tell who the sender is, what formats [are available]—does the transmission only involve audio or only video or both? You need to know that. And then, there are other things we are working on that will come later—things we can’t discuss in detail yet. What about compressed video or compressed audio, for instance? We’re looking at all these things.”
 
Kovalick adds that the older SMPTE ST 2022-6 standard for sending digital video over IP networks—basically, placing high bit-rate SDI signals into an IP format—will also retain relevance in the new landscape for some time to come.

“SMPTE ST 2022-6 is basically wholesale, take the entire SDI payload data, don’t split it into pieces, and then put it over IP,” Kovalick explains. “That has been around for several years, and some people use it as a kind of SDI replacement, but it is a form of IP. Sports broadcast trucks use it because they like IP and sometimes equipment weighs less and is more flexible in terms of power and so on. In that sense, SMPTE ST 2022-6 is very mature, and like SDI itself, it won’t disappear just because 2110 is on the way. But, over time, I believe the SMPTE ST 2110 suite will replace ST 2022-6 because of the flexibility and universal nature of [the new standard]—realizing that everyone will be putting it into their facilities eventually. In a year or two, I’m guessing most [new products] will be geared toward just SMPTE ST 2110. So, oddly enough, we are not just in a transition between SDI and IP, but between two different forms of IP.”
 
Beyond the current transition, which, as Kovalick said, will take many years to complete, the future for IT-based broadcast systems could eventually evolve far beyond the new standard alone. That is because no one knows precisely what role the cloud may eventually play in how broadcast facilities produce content, Kovalick says. He discussed this issue with Newswatch in 2013, and says the precise notion of how the cloud may be incorporated into content creation at broadcast facilities, as opposed to archiving, backup/proxy work, and remote communications, is still forming. He doubts, however, that SMPTE ST 2110 will have much relevance for public cloud workflows, since “cloud networking does not support lossless transport of IP over UDP.”

But generally, outside of the cloud, SMPTE ST 2110 will serve as the foundation for the industry’s transition to IP-based workflows for a long time to come, he suggests.
 
“SDI lasted almost 30 years before something else came along. Will this new standard last that long? I can’t say that, but the world of IP [for media facilities] is just starting up. I expect [SMPTE ST 2110] to be [at the center] of the industry’s transition and around for a very long time, though.”

 

News Briefs
NAB Show Trends

With the 2017 NAB Show now in the rearview mirror, as noted in the article above, IT-based systems were a major theme this year. In fact, a recent, comprehensive roundup of the show from TV Technology elaborated on the trend and highlighted many of the IP-connectable tools and infrastructure transition topics discussed formally and informally at NAB Show. Likewise, a recent NAB Show product summary from CineMontage magazine, focusing on the audio technologies on display this year, highlighted several tools at the show that featured audio-over-IP solutions. But plenty of other trends were also emphasized this year, according to both articles, ranging from a major focus on the arrival of the new ATSC 3.0 broadcast standard to various business and regulatory changes impacting the broadcast industry to a host of cloud-based creative solutions.

VR for Stage and Film
Meanwhile, Rob Legato, keynote speaker at NAB Show’s Future of Cinema Conference (produced in partnership with SMPTE), mentioned during his talk how some of the virtual production techniques his team utilized on the Oscar-winning movie The Jungle Book would soon evolve and contribute to a host of future movies and the art of movie-making generally, including a new effort he is working on with Jungle Book director Jon Favreau, reported to be a live-action version of The Lion King, according to The Hollywood Reporter. In another article, The Hollywood Reporter noted that some virtual reality technologies could soon contribute not only to the production of feature films but also to the staging of live shows. That article reported that Microsoft is currently working with Cirque du Soleil on a project to test how the Microsoft HoloLens augmented reality system can be used by directors and production designers to visualize sets, props, and staging before set construction.

MoCap for Dementia Research
A recent report in Computer Graphics World says the current trend of using new media technologies for medical research and therapies for aging patients battling dementia has now extended into the use of markerless motion-capture tools for such work. The article says researchers at the Rush Alzheimer’s Disease Center (RADC), which is currently conducting several studies on aging patients who have dementia, have found a way to make mo-cap useful in their research. In the article, researchers state that their work has determined that gait, mobility, and physical activity influence brain health, so they have launched an initiative to make a “digital record of participant’s motion as they complete different motor performances.” The goal is to quantify metrics of their physical capabilities to determine the correlation between those abilities and their cognitive problems. For the project, they are utilizing Microsoft Kinect motion sensor technology and iPi Soft mocap software to test patients in their homes as they work their way through approximately a dozen different kinds of movements.