Newswatch April 2012

 

SMPTE Industry News - Monthly Tech-Focused Newsletter of the Society of Motion Picture and Television Engineers

Hot Button Discussion
Monitoring and QC Trends
By Michael Goldman

Although the various digital broadcast signal monitoring technologies on display this month at NAB 2012 hardly represent the sexiest category at the show, such systems are a technology class growing in importance as the file-based, multi-signal, multi-format broadcast era matures. Unlike in the old analog world, flaws in a data stream somewhere—anywhere—between ingest and transmission to consumers can be difficult to detect until it is too late. That's the complication of the DTV era as far as signal monitoring and quality control are concerned. The good news is that high-end digital broadcast station facilities designed and operated by people who know what they are doing are growing increasingly stable and efficient, meaning there is generally less potential for flaws and artifacts to be introduced into the data stream to begin with.

At least, that's the view of Harvey Arnold, Corporate Director of Engineering for the Sinclair Broadcast Group, a SMPTE Fellow, and a member of the conference program committee and the Board of Editors. Arnold's point is that the first step modern broadcasters need to take toward ensuring a stable signal all the way down the distribution chain is to design and build an infrastructure in a way that mitigates the potential for signal problems in the first place.

 

"In new station builds, we focus more on selecting and installing equipment that is stable, rather than spending huge amounts of money on doing complex measurements," he says. "It's simply not needed as much as it was in the analog era. On a daily basis, there is generally less maintenance to worry about. Digital video and digital audio signals are generally stable, and in most cases, they pass through a system quite well. In signal measuring equipment, we are generally looking more for multi-purpose measuring technologies with universal interfaces than single, specific parameter monitoring."

"The problem is that in a broadcast and transmission environment, especially where the signal can be traveling over long distances, you may not see developing degradation until it is too late. It might look or sound OK to the naked eye or ear, and yet be on the edge of failure due to signal jitter and noise, or other distortions. Because of the nature of a digital signal, the cliff effect (sudden loss of a digital signal) can take place earlier in the signal chain, and you will not see it or know it unless you are specifically measuring for it. The signal may look great on a simple picture monitor, but by the time it travels to a transmitter (or cable/satellite head end) on the far end, there may be occasional breakups that can't be explained. In this case, built-in error correction is masking (covering up) distortions. You may not even know about it until it gets to the end viewer. So one of the things we are doing, and it might sound kind of basic, is to look for ways to inexpensively improve the quality of monitoring at our facilities, especially at ingest and in remote monitoring."

Probe the Signal

This search for good monitoring solutions, however, is unfortunately taking place in an environment in which broadcasters are being forced to cut back on manpower and expenditures. Simultaneously, the industry is evolving into a file-based way of transmitting multiple broadcast signals for multiple channels. And, Arnold adds, stations are being asked to do all this with smaller engineering teams that are generally less experienced in how to monitor these new kinds of signals than in the past. Economic realities and technological breakthroughs are therefore leading broadcasters to automate the process as much as possible, and as inexpensively as possible, while trying hard not to sacrifice quality control.

Fortunately, Arnold says, manufacturers are increasingly providing relatively inexpensive test and measurement tools that are quite powerful.

"In modern TV technical operations, we are now dealing with file-based monitoring, rather than composite or component monitoring where you would actually plug a viewing monitor in and see the video or listen to the audio," he adds. "In most cases (today), the audio and video stream is a file, and it is more difficult to monitor that file in a traditional, realtime manner like we used to do it. You have to decode it, which is a big problem (for monitoring), and there are multiple things taking place at the same time—signal delay, encoding and decoding, lip synch, and the like."

Indeed, Arnold says the multi-format world has brought with it all sorts of new potential challenges to consider in terms of how best to monitor a broadcast signal.

"Broadcasters originally believed there would be a single format to deal with," he says. "That hasn't turned out to be the case. Today, there are multiple file wrappers and picture formats, so there may be a problem when we transcode. Is transcoding causing distortion of the signal? It shouldn't, but some files do not have lossless transcoding. Additionally, since we don't have a single file format or, at least, common file wrappers, there are distortions that can go undetected if we don't measure smartly. Also, in transport stream issues, with many different parameters, some are critical and some are less important. So in setting up a quality assurance plan, you need to measure and provide alarms when specific distortions are indicated."

In the area of monitoring audio level consistency, the process has been streamlined somewhat, Arnold says, due to the arrival of the CALM Act legislation, which requires compliance by the end of 2012. Arnold explains that this includes uniform dialnorm levels for contribution and distribution, among other things.

"Until recently, broadcasters and networks had different ideas of what loudness levels should be," he says. "But now, we have a specific target of -24 LKFS. With proper setup and monitoring, we can more easily ensure compliance."

The other big step forward is the fact that tools now exist to break signals down into their component parts and monitor specific aspects of those signals in all sorts of ways. With this paradigm shift, gone are the days of relying on a single pair of eyes watching a single signal on a high-end viewing monitor with vectorscopes and various other analog devices as a one-size-fits-all process. Now, according to Arnold, the name of the game is to strategically use these various automated tools to essentially probe the signal flow, looking for impairments, taking captures, analyzing and flagging problems, and alerting operators and maintenance personnel.

"You have to look at the situation differently (than in the analog era)," he says. "It's a multichannel world, and obviously, it is more difficult to monitor video and audio when there are multiple things taking place at the same time. And there is a signal delay in encoders and decoders, which makes it more difficult to accurately and critically monitor—i.e. monitoring lip synch is a universal problem in television facilities."

"One of the most important aspects of television station operation is to ensure that your signals are stable and compliant at the ingest point of the workflow. You want to spend more time thinking about proper file management at the beginning in order to avoid having it come out wrong at the far end."

Seeking Exceptions

Then, along the path, Arnold emphasizes that broadcasters need to use an exception-based methodology for monitoring their signals, meaning that test and monitoring equipment surfaces flaws and brings them to the operator's attention without forcing the operator to look at channels that are functioning correctly.

"You can't possibly look at and display everything, so you look for exceptions to things that are working properly," Arnold explains. "The signal might look good (on a viewing monitor), but if an explicit area has a distortion that falls beneath some pre-set threshold, it would send an alarm and alert. Monitoring by exception is becoming the more common way to do it these days, and a number of manufacturers are offering monitoring systems that work this way, including Evertz, Triveni Digital, Tektronix, Miranda, and others."

Indeed, at the recent NAB show, there was significant focus from those manufacturers, and others, on tools for remote signal monitoring and analyzing, feed quality control, play-out monitoring, and more. Arnold points to Triveni's RM and MT series StreamScope technology and Miranda's series of Probe products, among others, as typical examples. Such tools employ Web browsers for remote access and system configuration, generally have a small footprint, are largely automated and reside in the background, and are constantly hunting for exceptions within the signal stream.

These developments are a major step forward, Arnold argues, compared to where the industry stood in the early days of digital broadcasting. "Back then, these were all single systems that had limited common standards, and it was hard to get equipment from different manufacturers to work together for different types of measurements, so you had to separately use different types of standalone equipment. Now, we are seeing multi-functional test and measurement equipment that can work remotely, with prices coming down, and with a greater commonality of interfaces."

Centralized Hub

All this is pushing signal monitoring technology in the direction of what Arnold calls "measurement in a box," meaning a centralized hub with universal interfaces for linking to all the other measurement tools on a network, providing "multi-measurement ability" for straightforward remote monitoring.
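The sketch below illustrates that "measurement in a box" idea in miniature: a central hub talks to every probe through one common interface and aggregates the latest readings into a single view an operator could pull up remotely. The probe interface, probe names, and readings are assumptions for illustration, not any vendor's actual API.

```python
# A minimal sketch of a centralized measurement hub with a common probe
# interface. The Probe protocol and the example readings are illustrative.

from typing import Dict, Protocol

class Probe(Protocol):
    def latest_readings(self) -> Dict[str, float]: ...

class MeasurementHub:
    """Aggregates readings from many probes behind one interface."""
    def __init__(self) -> None:
        self._probes: Dict[str, Probe] = {}

    def register(self, name: str, probe: Probe) -> None:
        self._probes[name] = probe

    def snapshot(self) -> Dict[str, Dict[str, float]]:
        """One consolidated view an operator could pull up remotely."""
        return {name: probe.latest_readings() for name, probe in self._probes.items()}

# Example with stand-in probes; a real deployment would poll networked devices.
class FakeProbe:
    def __init__(self, readings: Dict[str, float]) -> None:
        self._readings = readings
    def latest_readings(self) -> Dict[str, float]:
        return self._readings

hub = MeasurementHub()
hub.register("ingest", FakeProbe({"loudness_lkfs": -23.8}))
hub.register("transmitter", FakeProbe({"ts_errors_per_min": 0.0}))
print(hub.snapshot())
```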

"I guess you could call it 'elegant simplicity' that we are looking for—a simple way to make the broadcast and production chains more reliable, to shore up quality control through the whole signal flow. Now, we are starting to see single pieces of equipment do more things as multi-functional test equipment than in the past, and it will only improve from there."

As NAB illustrated, "exciting things are happening in this area," as Arnold puts it. But, he adds that another crucial aspect is the human component—proper training. He says broadcasters need to educate engineering personnel more efficiently on the new reality of file-based signal monitoring. That, he says, is a big issue and one perhaps not being adequately addressed in the era of budget constraints.

"Many facilities today have smaller and less qualified engineering staffs for this type of work," he says. "The dilemma is how do you train people to use this test and measurement equipment? Broadcasters need to improve in that area, but the good news is that vendors are stepping up in that regard and offering more training resources. The television engineering community needs to get its people into these programs and help them learn new measurement techniques, and to understand the limitations of the equipment."

"While at the NAB show, I was approached by a few of our partners wanting to do online training," he adds. "Our company will be taking advantage of this and we are hopeful others will also."


 



 

News Briefs
HD Down Deep

Just as filmmaker James Cameron was getting ready to make his much-publicized single-man submarine dive to the bottom of the Mariana Trench in March, National Geographic Online ran a fascinating, in-depth article on the project and Cameron's submarine, the Deepsea Challenger, detailing its technology systems and methodologies. The dive—only the second to the deepest part of the ocean floor, seven miles beneath the western Pacific—was notable for many things, but from a filmmaking point of view, the high-definition video technology Cameron took with him was pretty intense. He had a total of six HD cameras with integrated hard drives, most notably a Red Epic system to give him 5K raw imagery down there, but the Cameron/Pace Group also custom built a micro-sized 1080p HD camera system used to capture left-eye/right-eye imagery for the eventual 3D documentary that will result from the effort. At NAB 2012, as documented by The Hollywood Reporter, Cameron detailed the creation of those tiny cameras and their housings, which he operated at the bottom of the sea on the end of remote-controlled camera booms. But the National Geographic report noted above has the most detailed accounting of those camera systems, in its equipment section, and, for that matter, of the entire infrastructure of the Deepsea Challenger, including the eight-foot-high tower of LED lights Cameron used to light up the mysteries of the deep.


Blackmagic Camera

Speaking of NAB 2012, one of the show's big surprises was the debut of a digital cinema camera from a company known for hardware to enhance graphics in post-production, not image capture technology—Blackmagic Design. The company debuted the Blackmagic Cinema Camera, a 2.5K/1080p HD camera with a built-in SSD recorder, at the show. It surprised many industry pundits not only because Blackmagic normally dwells in a different market and the camera market is already crowded as it is, but also because of the camera's streamlined design approach. Studio Daily was among the first to offer an in-depth report on the camera, and it links to another report and some test images created by cinematographer John Brawley using the system on behalf of Blackmagic.


 

Social Broadcasters

Michael Grotticelli at the Broadcast Engineering website recently published an interesting column on the challenges broadcasters face in terms of how best to incorporate social media into their product in a feasible and affordable way. Grotticelli points out that while we see Facebook and Twitter graphics and promos and teasers all over the place on television these days, the technology and workflow for doing all that has placed another burden on broadcasters—one of his sources even refers to it as the new "Wild West," because it is labor intensive to find, evaluate, and flow that material onto the air, while simultaneously managing social media posts from viewers. But, he adds, the good news is that companies like Ross Video, Chyron, ConnectTV, Avid and others are now debuting integration software tools for broadcasters to help them seamlessly weave Tweets and other viewer interaction through various social media directly into broadcasts.

Opinions expressed in SMPTE Newswatch do not necessarily reflect those of SMPTE. References to specific products do not represent an endorsement, recommendation, or promotion.


 



 


SMPTE Events

13-15 May 2012
The SMPTE Forum on Emerging Media Technologies
Geneva, Switzerland

 

 

 


 
