The Big Game is only a few days away, and millions of fans are gearing up for watch parties of what is being billed as the most technologically advanced broadcast in the U.S. to date. The broadcasting rights generally rotate among CBS, NBC, and Fox, with each network showcasing innovative technology in its year; this year, the game will air on CBS.

Here’s a look at the technology it takes to broadcast the biggest football game of the year:

A Massive High-Tech Production 

A regular-season NFL game is already a major production, requiring 12 to 20 cameras and 150 to 200 employees to pull it off. Those numbers, however, pale in comparison to the massive production behind the biggest Sunday showdown of all, coming up on February 3, 2019.

Over the last couple of years, at least 70 game cameras, including multiple high-resolution 4K and pylon cameras, have powered everything from yellow-line technology (the virtual yellow first-down line you see on your TV) to instant replays. With the three major networks debuting new technology during their dedicated coverage each year, you can expect this year’s game to be no different.
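For the curious, the core idea behind the yellow line is a chroma key: the line is painted only over pixels that match the field’s grass color, so players and officials naturally appear in front of it. The sketch below is purely illustrative, not the broadcast system (which also tracks each camera’s pan, tilt, and zoom and the field’s geometry); it assumes the line’s on-screen position is already known and uses OpenCV to mask the grass.

```python
import cv2
import numpy as np

def draw_first_down_line(frame_bgr, x_pixels, line_width=6):
    """Overlay a yellow vertical line at column x_pixels, drawing it only
    on grass-colored pixels so players appear in front of the line."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Rough HSV range for green turf; a real system would calibrate this
    # per stadium and per lighting condition.
    lower = np.array([35, 40, 40], dtype=np.uint8)
    upper = np.array([85, 255, 255], dtype=np.uint8)
    grass_mask = cv2.inRange(hsv, lower, upper)

    out = frame_bgr.copy()
    strip = grass_mask[:, x_pixels:x_pixels + line_width] > 0
    out[:, x_pixels:x_pixels + line_width][strip] = (0, 255, 255)  # BGR yellow
    return out
```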

The Cameras at the 53rd Gridiron Battle

A total of 115 cameras will be used this year. Of those, 14 are dedicated to virtual graphics: four (including a SkyCam) for augmented reality graphics offering “never before seen” field-level views, and ten with “trackable first-down-line” technology. There will also be 16 4K cameras and nine Sony HDC-4800 camera systems throughout the stadium, enabling additional live game camera angles and the ability to replay key moments with minimal resolution loss.
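One reason 4K acquisition matters for replays is simple pixel arithmetic: a UHD frame (3840×2160) holds four times the pixels of the 1080p frame most viewers receive, so a replay operator can punch in up to 2× before any upscaling, and therefore any visible resolution loss, occurs. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope: zoom headroom when cropping a UHD capture
# down to a 1080p broadcast frame.
source_w, source_h = 3840, 2160  # UHD acquisition
target_w, target_h = 1920, 1080  # HD broadcast frame

max_zoom = min(source_w / target_w, source_h / target_h)
print(f"Zoom headroom before upscaling: {max_zoom:.1f}x")  # -> 2.0x
```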

For the first time in a U.S. live network broadcast, multiple 8K cameras will be used at each end zone in what the network calls a “highly-constructed engineering solution.” New broadcast booths have also been built to accommodate international networks. 

As technology continues to advance immersive viewing opportunities for fans, we’re beginning to see innovative solutions that improve the experience for blind and visually impaired fans as well (though there’s still much to be done).

The Future of Live Audio Description

When it comes to television, only a limited number of networks and shows provide audio description at all, let alone live audio description.

Last year was the first time many in the community were able to experience the Big Game in a new way, through Aira’s #AiraBowl virtual watch party, which offered live audio-described narration of the game, the commercials, and the halftime show. Aira connects people who are blind or have low vision to trained professional agents using wearable smart technology and augmented reality.

To offer a truly live experience, Aira sent its director of product management, Greg Stilson, who is blind himself, to the 52nd game wearing the smart glasses. Agents could watch the action unfold through the glasses and describe it live to Stilson, while also streaming the audio online to interested listeners, both sighted and sight-impaired. This is just one example of what more advanced technology can offer as an improved experience for all fans.

Taking it a step further, the Japan Broadcasting Corporation (NHK) has created a system that automatically generates live audio descriptions using text-to-speech. The system may not yet be ready for use in the U.S., but the underlying technology was tested at the Rio Olympic and Paralympic Games. This innovation could greatly benefit not just blind or visually impaired viewers, but sighted ones as well.
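To give a flavor of how such a system can work, here is a highly simplified sketch (not NHK’s implementation): structured play-by-play events from a hypothetical live data feed are rendered as sentences and handed to a text-to-speech engine, here the off-the-shelf pyttsx3 library standing in for a broadcast-grade one.

```python
import pyttsx3  # off-the-shelf offline TTS; stands in for a production engine

# Hypothetical play-by-play feed; a real system consumes official live data.
events = [
    {"clock": "12:45", "team": "Home", "player": "No. 12", "action": "completes a 9-yard pass"},
    {"clock": "12:02", "team": "Away", "player": "No. 26", "action": "rushes for 4 yards"},
]

def describe(event):
    """Turn one structured event into a spoken-style sentence."""
    return f"{event['clock']}: {event['team']} {event['player']} {event['action']}."

engine = pyttsx3.init()
for event in events:
    sentence = describe(event)
    print(sentence)       # on-screen caption / log
    engine.say(sentence)  # queue the line for speech
engine.runAndWait()       # synthesize and play all queued lines
```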

For more on machine learning and artificial intelligence technology, be sure to check out the January/February issue of the SMPTE Motion Imaging Journal.