Newswatch e-newsletter

Current Issue - September 2015


Hot Button Discussion

Precise Time and Synch on an Imprecise Landscape
By Michael Goldman  

Over the past year, one consequence of the media industry’s initial strides toward networked, IT-based broadcast facilities has been a ramped-up examination of the steps needed to ensure that broadcasters can generate and transmit synchronous video signals across networks large and small, including on and between those that are state-of-the-art and those that will, for the time being, still dwell in a barely digital nether-region. In other words, the industry has been working out the most efficient way to generate, transmit, and manage such signals through a hybrid, or patchwork, minefield of systems and technologies. That question has led SMPTE and other industry bodies to advance work on how best to apply the industry-standard specification for delivering synchronized time from control stations to all kinds of slave devices, the IEEE’s 1588 Precision Time Protocol (PTP), to professional broadcast applications specifically.
Patrick Waddell, standards and regulatory manager for Harmonic Inc., a SMPTE Fellow, and head of the SMPTE 32NF-80 Time Labeling and Synchronization working group, points out that this is an area where “lots of paradigms are being shifted,” for the simple reason that synchronized networks consisting of multiple devices and technologies need ultra-accurate timing to make sure data transmission and broadcast events all happen as broadcasters intend. On the current landscape, therefore, the IEEE standard is becoming critically important as networked media facilities, such as ESPN’s Digital Center 2, begin to enter the scene.

“PTP is actually much more than just a protocol, because there is a database of registers and defined roles that devices can take, including how you negotiate what your role is and so forth—it’s actually close to 2,000 pages,” Waddell says. “But the really smart thing that the IEEE committee did when they wrote 1588 was that they realized that each industry would have their own particular set of requirements and restraints. And so, they defined a set of profiles and levels. IEEE 1588 allows other standards development groups to define their own sets of profiles, and so, both SMPTE and the AES [Audio Engineering Society] have moved forward in defining specific constraints for applications in their particular industries.
“While there are some minor nuances that are different, those two groups are talking, and we are all ‘joined at the hip.’ Having said that, the SMPTE working group that I head will start interoperability testing in November 2015, with another session before NAB [2016], another after NAB, and repeating, in all likelihood, one or two more cycles. This will be an ongoing activity that will last for several years.”
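To make the idea of a profile concrete: a profile pins down defaults and permitted ranges for a handful of IEEE 1588 tunables (message rates, the PTP domain number, timeouts) so that gear from different vendors can interoperate out of the box. The sketch below shows the shape of such a constraint set; the names and numbers are illustrative placeholders, not the normative values from the SMPTE or AES documents.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PtpProfile:
    """A profile pins down defaults and allowed ranges for IEEE 1588 tunables."""
    name: str
    domain_default: int
    domain_range: range            # PTP domain numbers the profile permits
    log_sync_interval: int         # 2**n seconds between Sync messages
    log_announce_interval: int     # 2**n seconds between Announce messages
    announce_receipt_timeout: int  # missed Announces before a device reacts

# Illustrative stand-in only; consult ST 2059-2 and the AES profile for the
# normative defaults and ranges.
broadcast_profile_sketch = PtpProfile(
    name="broadcast-profile-sketch",
    domain_default=127,
    domain_range=range(0, 128),
    log_sync_interval=-3,          # 2**-3 s, i.e., eight Sync messages per second
    log_announce_interval=-2,
    announce_receipt_timeout=3,
)

def domain_allowed(profile: PtpProfile, domain: int) -> bool:
    """Reject configurations that fall outside the profile's constraints."""
    return domain in profile.domain_range
```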

Meanwhile, the SMPTE profile for using PTP to maximum effect in the broadcast world arrived at NAB this year, when both SMPTE ST 2059-1 (Generation and Alignment of Interface Signals to the SMPTE Epoch) and ST 2059-2 (SMPTE Profile for Use of IEEE-1588 Precision Time Protocol in Professional Broadcast Applications) were published.
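The ST 2059-1 side of the pair can be sketched in a few lines. The premise is that every periodic signal is treated as though it had been running continuously since a common epoch, so any device holding accurate PTP time can compute a signal’s correct phase on its own, with no reference signal on a wire. A minimal sketch, assuming the epoch is represented as time zero and frame rates are kept as exact rationals:

```python
from fractions import Fraction

def next_alignment_point(tai_seconds: Fraction, frame_rate: Fraction) -> Fraction:
    """When (in seconds since the epoch) the next frame boundary falls.

    ST 2059-1's premise, loosely stated: a signal is aligned if its phase
    matches a hypothetical signal running continuously since the SMPTE Epoch.
    """
    period = 1 / frame_rate
    frames_elapsed = tai_seconds // period      # whole frames since the epoch
    return (frames_elapsed + 1) * period

# Example: 29.97 fps video, kept as the exact rational 30000/1001.
rate = Fraction(30000, 1001)
now = Fraction(1_700_000_000)                   # an arbitrary time, in seconds
print(next_alignment_point(now, rate) - now)    # exact wait until alignment
```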
“Essentially, if you think about how cellphone systems work, as you move between cell towers, your call is handed to the next tower without you being aware that anything is happening,” Waddell explains. “That kind of thing requires timing not quite to the nanosecond, but close. 1588 provides that level of timing accuracy, and there was no need for SMPTE to re-invent the wheel when we needed to look toward moving to new timing and synchronization systems, since the IEEE standard had already been around for a while—that was the standard to use. Now that we have created profiles that address the broadcast industry specifically, it has started to solve a whole bunch of problems in certain corners of the industry.”
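The arithmetic behind that level of accuracy is compact. In the basic IEEE 1588 exchange, the master timestamps when it sends a Sync message (t1), the slave timestamps its arrival (t2), the slave timestamps when it sends a Delay_Req back (t3), and the master timestamps its arrival (t4). Assuming the path is symmetric, the clock offset and path delay fall out directly:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Classic IEEE 1588 two-way time transfer.

    t1: master sends Sync          t2: slave receives Sync
    t3: slave sends Delay_Req      t4: master receives Delay_Req

    Assumes a symmetric network path; any asymmetry shows up directly
    as offset error, which is why PTP-aware switches matter.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2          # slave clock minus master clock
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, mean_path_delay

# A slave running 250 ns fast of its master across a 10 us one-way path:
print(ptp_offset_and_delay(0.0, 10.25e-6, 50.0e-6, 59.75e-6))
# -> approximately (2.5e-07, 1e-05)
```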

Those problems badly needed attention because, although the industry has “marched off into the digital age, we still have an analog millstone around our necks,” Waddell says. By that, he means that all-digital facilities like ESPN’s are still the exception rather than the rule; across the country, a wide range of broadcast plants continue to use tools and techniques for synchronization and control that come from earlier eras. The new specification, therefore, had to be robust enough to handle what is coming in the foreseeable future, yet modular enough to handle the many transmission flavors currently in existence and expected to remain in use for quite some time.

“Baseband media includes video, audio, and metadata over IP, which is covered in standard [ST 2022-6] and has an optional FEC component that is documented in ST 2022, part five, but not typically used in studio applications,” Waddell says. “Those documents were written originally for long-haul applications, for folks who do major remote production. If you think of what you do for a football game or the Oscars or any other special event—you set up circuits well in advance of the event, get them provisioned and operational, and then when everyone is happy on both ends, do the event, and finally tear down the circuits, pack up, and go away. [ST 2022-6] was not set up for the kinds of things we typically do in live production, or even in a lot of episodic production anymore, for that matter. Meanwhile, a lot of users began demanding a switchable version. Then, new, big systems like ESPN’s came along, illustrating that there were newer ways to provide switching.
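The optional FEC Waddell mentions is, at its core, XOR parity computed across a matrix of packets, so a receiver can rebuild a lost packet from the survivors. The toy sketch below shows the one-dimensional idea only; the published documents define a two-dimensional matrix and a specific wire format, neither of which is reproduced here.

```python
def xor_parity(packets: list) -> bytes:
    """XOR equal-length packets together into a single parity packet."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            parity[i] ^= b
    return bytes(parity)

def recover_one(received: list, parity: bytes) -> list:
    """Rebuild exactly one missing packet (marked None) from the parity."""
    missing = [i for i, p in enumerate(received) if p is None]
    assert len(missing) == 1, "plain XOR parity repairs exactly one loss"
    survivors = [p for p in received if p is not None]
    repaired = xor_parity(survivors + [parity])
    out = list(received)
    out[missing[0]] = repaired
    return out

row = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity = xor_parity(row)
print(recover_one([b"pkt0", None, b"pkt2", b"pkt3"], parity))  # pkt1 restored
```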

“Still, that did not solve all the problems, because many facilities still have miles of cabling tucked into their floors or ceilings—not just to run all the signals around, but also to carry what are called black burst synchronization signals. While tri-level synch is now sometimes used instead for HD, much of the world still uses black burst, which is what they have always used for NTSC or PAL analog.
“One of the reasons that SMPTE wrote its own profile for PTP is to permit devices to generate synchronized video without black burst. In addition, there is a need to accommodate a daily discontinuity [at many broadcast stations] called the ‘daily jam.’ Almost everybody still does a daily jam—it’s just part of typical station operation in facilities that use ST 12-1 timecode to match up the broadcaster’s internal system time of day with official civil time. We had to make provisions for things like that, to ensure that everybody will be doing this the same way.
“Black burst is our industry’s analog millstone, and SMPTE’s work in timing and synch, among other things, has been about getting that millstone off our necks.”
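The daily jam exists because even drop-frame timecode cannot track the wall clock exactly. At 30000/1001 frames per second, dropping two frame labels at the start of every minute except each tenth minute keeps timecode close to real time, but a residual error on the order of 86 milliseconds still accumulates every 24 hours, which stations absorb by re-jamming to time of day. A sketch of the standard drop-frame counting rule:

```python
def dropframe_label(n: int) -> str:
    """Drop-frame label for the n-th real frame (0-based, 30000/1001 fps).

    Drop-frame skips labels ;00 and ;01 at every minute boundary except
    minutes divisible by ten; no actual frames of video are dropped.
    """
    ten_blocks, rem = divmod(n, 17982)     # 17982 real frames per 10 minutes
    drops = 18 * ten_blocks                # 2 labels x 9 minutes per block
    if rem >= 1800:                        # past the block's first minute
        drops += 2 * ((rem - 1800) // 1798 + 1)
    lbl = n + drops                        # label count at nominal 30 fps
    hh = lbl // 108000 % 24
    mm = lbl // 1800 % 60
    ss = lbl // 30 % 60
    ff = lbl % 30
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(dropframe_label(1800))   # -> 00:01:00;02  (labels ;00 and ;01 skipped)
```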

All of this relates to the fact that the hybrid nature of the industry, now and for the foreseeable future, continues the so-called “island trend”: as technology evolves and improves, broadcasters build new infrastructure as identifiable “islands” of technology. Waddell emphasizes that as the digital transition began, broadcasters started with “digital islands in composite plants. Then, they built HD islands in SD plants. And so, we are now going to end up making IP islands within coax plants. It’s the same paradigm, and it should work as well as it has in the past. ST 2059 [the SMPTE profile for the Precision Time Protocol] is crucial to making those islands work in this area, and allowing that transition to go forward.”
With that standard in place, perhaps the biggest part of this “paradigm shift” that is still evolving is the “control” portion of the time and synchronization topic when it comes to building broadcast plants in the IP universe, according to Waddell.
“There are three legs that make a nice little tripod of elements for synch and control over IP,” he says. “The first is the essence itself over IP [SMPTE ST 2022], the second is the synchronization of time information necessary to keep all systems working together in harmony [SMPTE ST 2059], and the third is the ability to control all of them and figure out what is out there and what is working correctly and what isn’t. This third element of media control over IP is necessary for building a real broadcast plant. Think about what you have to do to put a show made in front of a live audience onto servers, from where it will get distributed—now called ‘live to disk,’ rather than ‘live to tape,’ production. The control room has to function in real time, in front of a real audience and real people on the stage, and real camera operators and crew around. But you still need to be able to press a button and roll—not a tape deck, now it’s a server, but it’s the same thing, to bring in interstitial material and pre-edited pieces. Right now, most users are still running a whole bunch of serial data cables around to carry all those control signals that say start, stop, play, rewind, and so on.
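To picture the difference, the same transport verb that once traveled down a dedicated serial cable becomes, over IP, a short network message that any authorized controller can send. The sketch below is deliberately simplified; the one-byte verb set and port number are invented for illustration and do not correspond to any published control protocol.

```python
import socket

# Invented one-byte verb set, standing in for a serial deck-control protocol.
VERBS = {"play": b"\x01", "stop": b"\x02", "cue": b"\x03"}

def send_transport_command(host: str, port: int, verb: str) -> None:
    """Send a single transport verb to a media server over plain TCP."""
    with socket.create_connection((host, port), timeout=1.0) as sock:
        sock.sendall(VERBS[verb])

# Rolling a server playout port instead of a tape deck (hypothetical address):
# send_transport_command("10.0.4.17", 9000, "play")
```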

“But our friends at ESPN, since they are one of the leaders in the space of building new facilities not tied to serial data communications, knew that something better had to come forward. So they led the effort to put together a new drafting group within SMPTE’s 34CS Technology Committee that deals with media device control over IP, called—no surprise—Media Device Control over IP, or MDCoIP, for short. And they came up with a set of documents—the ST 2071 standard, which is a four-part set. Three of them have already been published. What they provide is the ability to discover devices and signal flows and assets in an automated way, so that the system can keep reconfiguring itself and keep going. This suite of standards heavily leverages the existing technology developed for the Internet, so once again, SMPTE is repurposing existing, outside standards. At this point, the ST 2071 documents are reasonably mature, and people are now beginning to build on them.”
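The discovery behavior Waddell describes can be caricatured in a few lines: devices periodically announce who they are and what they can do, and a controller maintains its own map of the plant, dropping anything that goes quiet, instead of relying on hand-wired configuration. The message fields and registry below are invented for illustration and are not the ST 2071 wire format; the standard itself identifies devices and services by URN and leans on established Internet technologies.

```python
import json
import time

# Hypothetical announcement; the field names are invented for illustration.
announcement = json.dumps({
    "id": "urn:example:camera-07",
    "capabilities": ["play", "stop", "status"],
    "address": "10.0.4.17",
})

class Registry:
    """Controller-side map of discovered devices, refreshed by announcements."""

    TIMEOUT = 10.0   # seconds of silence before a device is presumed gone

    def __init__(self) -> None:
        self.devices = {}

    def on_announce(self, raw: str) -> None:
        msg = json.loads(raw)
        msg["last_seen"] = time.monotonic()
        self.devices[msg["id"]] = msg            # add new or refresh existing

    def prune(self) -> None:
        now = time.monotonic()
        self.devices = {k: v for k, v in self.devices.items()
                        if now - v["last_seen"] < self.TIMEOUT}

registry = Registry()
registry.on_announce(announcement)
registry.prune()
print(list(registry.devices))   # -> ['urn:example:camera-07']
```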
All of which has led to what Waddell calls “a critical mass of technology.” By that, he means that all the standards work and industry development in this area has finally gone far enough down the road for hardware and software manufacturers to have all the tools, data, and infrastructure they need to build the technology broadcasters need to construct fully modern broadcast plants.
“The production equipment world for a number of years seemed to ignore the work SMPTE was doing in this area, but has now discovered it, which is good news,” he says. “We have a bunch of new players in the business joining my working group, and you also have a scramble of development teams and marketing teams creating new products to match the new standards. Still, while we may have most of the documentation in place right now to build IP-based systems fully timed by PTP and controlled via ST 2071, it is true that it will take a couple more years before we get a sizeable amount of gear that actually does all that. That is a reality check for users, but then again, they can now go to the manufacturers they know and love and ask them: ‘When will you be bringing this gear out?’
“So the bottom line is that the end users really need to pay attention to this subject and ask for the features. The standards are in place, so if there is enough of a demand, they should be able to get that equipment sooner rather than later, and be able to buy it from more than one company, which is typically what a user wants to do.”

News Briefs

IBC Roundup
Numerous analysts have covered this month’s IBC trade show in Amsterdam. Virtually all of them home in on the fact that major themes at the famed broadcast convention this year included higher dynamic range and various issues related to the ongoing rollout of UHD. One of the more compelling examinations was The Film Book blog coverage from industry pundit Benjamin B on the American Cinematographer website. He suggests that this year’s show is really a continuation of last year’s, in the sense that the issues of HDR and, more generally, “more color and more contrast” continue to grow in importance across the industry. Benjamin looked back at his coverage of the 2014 show, compared it to his observations this year, and discussed what has changed. His main point is that everything he discussed in 2014 is further down the road in 2015, including new resolution standards for 4K cinema and UHDTV, color-space expansion with Rec. 2020, and ongoing developments with higher frame rates, compression, and the rollout of the ACES standard, among other things. Ericsson CTO Steve Plunkett also had an interesting take on the show in a posting on the Televisual site, and the Hollywood Reporter covered the event in two parts: one and two.

4K Live from Space
A recent article from TV Technology points out that, when it comes to the ongoing “dearth of 4K programming” to match up with “the sales growth” of UHD televisions, one unexpected entity is doing its part to change that: NASA. The article says NASA, in partnership with Harmonic, has announced it will be offering free 4K content to cable systems and other providers, straight from the International Space Station, the Hubble Space Telescope, and other NASA missions. Such live imagery from space will be available on a new NASA 4K channel that Harmonic will help make possible by taking a raw 4K feed from NASA and processing and re-encoding it using AVC compression to free up enough bandwidth for the journey from space to NASA’s transponders. The article also states that NASA and Harmonic are conducting experiments with high-dynamic-range video to figure out what kind of encoding scheme they will need in order to eventually bring HDR-style 4K imagery to the channel for viewers with newer HDR-capable televisions. It also examines the ongoing cinematography training NASA astronauts are receiving to learn how to use modern, high-resolution digital camera systems, such as the Red Epic Dragon 6K camera.
LiFi Coming?
A report on the TechNewsWorld site recently examined ongoing progress in research into communications systems that transmit data using what is being called visible light communication (VLC). According to the report, researchers from Disney demonstrated something called “Linux Light Bulbs,” which essentially are bulbs designed to communicate with each other and with other VLC devices using Internet protocol. The research right now, according to the article, relates to how to make devices like toys and wearable technology send and receive signals using light, but conceptually, the work could eventually have applications for those interested in establishing networks comparable to WiFi systems, what is being called, for now, “LiFi.” The article hastens to add that this research is not brand new, that VLC has limitations (currently including speed), and that other approaches are being investigated as well. But it does give a thorough explanation of how VLC operates and of why the bulbs are called Linux Light Bulbs (basically, the systems run Linux software to enable the signals to work with Internet protocol).