
Near Live Streaming Platform for second screen

October 23, 2014

Aspera President/Co-founder Michelle Munson presented a paper on "End-to-end near live streaming platform for second screen combining multi-camera capture, high-speed transport and cloud video processing." "The system I'm going to talk about was deployed this summer, as a partnership between us, EVS and Elemental for large volumes of highlights from the FIFA event," she says. "This ended up being a cloud-oriented solution, which was essential."

This approach to the problem was grounded in the need to reach a very large audience with a near-live experience of the World Cup matches. Aspera deals in transport technologies that move digital assets at maximum speed regardless of file size, transfer distance and network conditions. "The drivers for us are volume and quality, speed and control," she said. "Traditional ways of doing transport are limited in these environments with variable bandwidths. This is all coupled with the idea of needing immediate quality with a global audience. The environment we're talking about is the maturity of cloud computing."

Why a distributed cloud architecture for near-live content? Near-live experiences have highly bursty processing and distribution requirements, and viewers expect near-zero delay in the video experience. Linear transcoding approaches simply cannot meet demand -- and are too expensive for short-term use. Furthermore, investing in on-premise bandwidth for distribution is impractical, since we're talking about millions of streams and terabits per second. "This just calls for a cloud architecture," said Munson.

EVS' challenge was to handle 12 stadiums and 64 games of multilateral production, with up to 24 different camera angles, streamed live to millions of viewers worldwide while supporting simultaneous matches on multiple devices and formats. Munson walked through the solution, from live ingest generating a mezzanine-level feed to Aspera server software, using Aspera client software embedded in the EVS software. The target for a near-live experience was six feeds at 10 Mbps, or 60 Mbps per match -- doubled for simultaneous doubleheaders and doubled again for safety, for a total of 240 Mbps. Packet loss rates varied from a few percent up to 9 percent, and transporting the mezzanine-level images the traditional way wouldn't have been possible with so much content. The WAN transfer challenge is compounded in the cloud, she pointed out; a second challenge is storage, where failure rates grow with large chunk sizes. Aspera has an offering called Aspera On-Demand, which is used for high-performance data transport. "We built a new architecture in our third generation that lets us do in-order delivery on top of our transport," said Munson. "It's an API. We can do best-effort in-order delivery. We want to go in order as much as possible, and we can deliver the first or last block in the file at any time. In the stream, these are order-based apps. The challenge becomes: how can you quantify the efficiencies contained in this?"
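The best-effort in-order delivery Munson describes -- release blocks to the consumer in order whenever possible, buffering anything that arrives early -- can be sketched with a simple reorder buffer. This is an illustration of the general technique only, not Aspera's actual API; the class and method names are hypothetical:

```python
import heapq


class InOrderDelivery:
    """Best-effort in-order block delivery over an out-of-order transport.

    Hypothetical sketch: blocks may arrive in any order; each call
    releases the longest in-order prefix now available, buffering the rest.
    """

    def __init__(self):
        self._next = 0      # index of the next block the consumer expects
        self._pending = []  # min-heap of (index, data) blocks that arrived early

    def receive(self, index, data):
        """Accept a block from the transport; return blocks now deliverable in order."""
        heapq.heappush(self._pending, (index, data))
        ready = []
        while self._pending and self._pending[0][0] == self._next:
            _, block = heapq.heappop(self._pending)
            ready.append(block)
            self._next += 1
        return ready
```

For example, if block 1 arrives before block 0, `receive(1, ...)` returns nothing and `receive(0, ...)` then releases both blocks at once -- the "go in order as much as possible" behavior in the quote.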

Aspera modeled the probability of having to wait through retransmission time-outs for the next expected packet to arrive. "What's interesting about this is it allows us to quantitatively look at the probability of glitching as we're delivering video," she said. "Assuming you have a video playing rate of X bytes/s, a packet size of Y bytes and a known packet loss ratio, the probability of waiting for retransmission can be computed from a geometric model. What this leads to is a very small glitch probability for very little buffering." To ensure at most one glitch per hour, even under worst-case Internet conditions, the required buffer is just five seconds.
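One plausible reading of that geometric model can be sketched as follows. The simplifying assumptions (mine, not from the talk) are that packet losses are independent, each retransmission attempt costs roughly one round-trip time, and a glitch occurs only if a packet is still missing when the playout buffer drains -- so the glitch probability is the loss ratio raised to the number of retries the buffer allows. All parameter names are illustrative:

```python
import math


def glitch_probability(loss_ratio, buffer_s, rtt_s):
    """Probability that a given packet causes a visible glitch.

    A packet glitches only if the original send and every retry that
    fits inside the buffer window are all lost; with independent losses
    (geometric model) that is loss_ratio ** (retries_allowed + 1).
    """
    retries_allowed = math.floor(buffer_s / rtt_s)
    return loss_ratio ** (retries_allowed + 1)


def expected_glitches_per_hour(rate_bytes_s, packet_bytes, loss_ratio,
                               buffer_s, rtt_s):
    """Expected glitches per hour at playout rate X bytes/s, packet size Y bytes."""
    packets_per_hour = rate_bytes_s / packet_bytes * 3600
    return packets_per_hour * glitch_probability(loss_ratio, buffer_s, rtt_s)
```

Under these assumptions, even the article's worst case -- 9 percent loss on a 10 Mbps feed -- yields far fewer than one expected glitch per hour with a five-second buffer, consistent with the figure Munson quotes.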


Debra Kaufman
