As video technologists strain to turn the internet into something it was never built to be, a conduit for live video at mass-distribution scale, the Super Bowl offers an annual benchmark of where the science sits on the curve.
According to Nielsen, the big game on Feb. 3 averaged 2.6 million streams per minute, up 31% over 2018, making it perhaps the year’s biggest litmus test for streaming technology.
Just like the offensive performance of both of this year’s Super Bowl teams, the New England Patriots and the Los Angeles Rams, Super Bowl LIII left plenty of room for improvement. The biggest issue remains latency, or the amount of time it takes to transport video from camera to viewer.
Latency among operators streaming the game ranged from 28.2 seconds for the CBS Sports app all the way to 46.7 seconds for the Yahoo Sports app, according to streaming technology company Phenix. That’s at least better than last year, when Phenix tracked delays ranging from 20 seconds to five minutes. But again, there’s room for improvement.
‘Remains a Significant Challenge’
“Latency still represents that ‘chink in the wall’ for the online video industry,” Concurrent Technology senior VP of product and marketing Ryan Nicometo said. “Reducing latency and bringing it a step closer to linear broadcast remains a significant challenge for operators.”
Each stage in the encoding, transcoding and delivery process can contribute to latency, so the nature of the issue can differ depending on the streaming service, the route the video takes to the customer and even conditions inside the customer’s home, according to Dr. Zhou Wang, co-founder and chief science officer of SSIMWAVE, another video technology vendor.
“We really need end-to-end measurement to get a handle on the culprits: Is 80% of latency happening at the encoding stage or post-packager?” Wang said. “How much is attributable to the CDN and how much to the viewer’s home network or device? There’s pressure on OTT services to decrease latency, especially since viewers can directly compare legacy managed-network video and IPTV.”
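The end-to-end accounting Wang argues for can be sketched in a few lines. The stage names and numbers below are invented for illustration, not measurements from the article; the point is only that per-stage measurement lets you attribute a share of the total delay to each link in the chain.

```python
# Hypothetical per-stage latency budget for one stream (assumed numbers).
stage_latency_seconds = {
    "encode": 4.0,
    "package": 12.0,       # e.g., waiting on whole segments before handoff
    "cdn": 3.0,
    "home_network": 1.0,
    "player_buffer": 8.0,
}

# Attribute each stage's share of the camera-to-viewer total.
total = sum(stage_latency_seconds.values())
for stage, seconds in stage_latency_seconds.items():
    print(f"{stage}: {seconds:.1f}s ({100 * seconds / total:.0f}% of {total:.1f}s)")
```

With real measurements plugged in, the same breakdown would answer Wang’s question directly: whether the bulk of the delay sits at encode, post-packager, the CDN or in the home.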
A partial solution, he added, is to change profile segmentation. “Using 10-second profile segments immediately introduces a 10-second delay at that stage alone,” Wang said. “Moving to two-second segments is a way to decrease latency, but more segments means more I-frames [intra-coded pictures], consuming more bandwidth and adding latency upstream. So you see how complicated solving latency really is.”
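The trade-off Wang describes can be made concrete with a rough model. Two assumptions here are mine, not the article’s: that a player buffers about three whole segments before playback begins, and that each segment starts with one I-frame.

```python
def startup_latency_floor(segment_seconds: float, buffered_segments: int = 3) -> float:
    """Delay contributed by segmented delivery alone: players typically
    buffer a few complete segments before starting playback."""
    return segment_seconds * buffered_segments

def iframes_per_minute(segment_seconds: float) -> float:
    """Assuming one intra-coded frame opens each segment."""
    return 60.0 / segment_seconds

# 10-second segments: a high latency floor, but few bandwidth-hungry I-frames.
print(startup_latency_floor(10))   # 30.0 seconds buffered before play
print(iframes_per_minute(10))      # 6.0 I-frames per minute

# 2-second segments: a much lower floor, but 5x the I-frame overhead.
print(startup_latency_floor(2))    # 6.0 seconds
print(iframes_per_minute(2))       # 30.0 I-frames per minute
```

Cutting segment length from 10 to two seconds drops the buffering floor from 30 seconds to six, but quintuples the I-frame count, which is exactly the bandwidth cost Wang flags.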
‘Fourth Industrial Revolution’ to the Rescue?
At this year’s NAB Show, 5G marketers will undoubtedly tout the ultra-low latency of their new mobile network technology, which they bill, with a straight face, as the “fourth industrial revolution.”
“5G is super-promising for delivering broadband services, which will eventually expand the client base available to OTT-enabled content providers,” Nicometo said. “The bandwidth it brings is impressive and transport latency is reduced.
“The reduction in transport latency and improved bandwidth will likely lead to some reduction in video latency,” he added, “but the real improvements will come with CDN and client optimizations. The good news is we can do those things now without waiting for 5G.”
That wait for 5G could be significant, with network operators looking at years and billions of dollars in dense network buildout before 5G’s ubiquitous presence is a reality.
Count Sinclair Broadcast Group’s top technology executive, Mark Aitken, among those who doubt 5G will emerge to vastly improve streaming video latency anytime soon.
“If you don’t have line of sight, you don’t have coverage,” Aitken told OTT@NABShow in an interview in January. “So when you look at a Verizon deployment of 5G in millimeter-wave spectrum, you are literally looking at radio heads that are spaced, in some cases, at less than 300 feet. 5G and millimeter wave is a full buildout of the most dense network that you can imagine.”
Aitken proposes instead that the “multicast” approach of the emerging ATSC 3.0 broadcast standard could more feasibly solve the latency issue.
“When all of us want to watch the same program at the same time, [with streaming] you’re delivering on a multiplicative basis a file which may be many gigabytes in size,” Aitken said. “In unicast, I have to deliver, for example, a 1-gigabyte file to everybody. Multiply that by thousands. With broadcast, I deliver just one 1-gigabyte file, and everybody’s got it.
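Aitken’s arithmetic is simple enough to sketch. The audience size below is an assumed example, not a figure from the article:

```python
def unicast_bytes_gb(file_gb: float, viewers: int) -> float:
    """Unicast streaming: every viewer receives an individual copy,
    so network load scales with audience size."""
    return file_gb * viewers

def broadcast_bytes_gb(file_gb: float, viewers: int) -> float:
    """One-to-everyone broadcast (the ATSC 3.0 model Aitken describes):
    a single transmitted copy serves every receiver in range."""
    return file_gb  # independent of how many are watching

print(unicast_bytes_gb(1.0, 100_000))    # 100000.0 GB pushed across the network
print(broadcast_bytes_gb(1.0, 100_000))  # 1.0 GB, regardless of viewers
```

For a live event where everyone wants the same stream at the same moment, the unicast cost grows linearly with the audience while the broadcast cost stays flat, which is the heart of his case for the multicast approach.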
“You have this growing warehouse of spectrum that’s coming into the hands of the carriers,” he added. “But we seem to have the same problems all the time. Just try to watch the Super Bowl on your cellphone.
“It’s very simple,” Aitken concluded. “Our job is to get in front of the largest population possible. The one-to-everyone approach is a distribution technique that’s much needed in today’s telecommunications environment.”
Of course, latency is just one challenge video streaming technologists face, Wang noted, while pointing out that bandwidth increases are being met with massive growth in internet video traffic.
“The industry is in an arms race between bandwidth on one hand and content and subscribers on the other,” Wang said. “So even though networks have more delivery capacity, they’re hard-pressed to keep up with the dramatic escalation of video sources and subscribers squeezing more content into and out of what is still a limited pipe.”
Daniel Frankel is the managing editor of Next TV, an internet publishing vertical focused on the business of video streaming. A Los Angeles-based writer and editor who has covered the media and technology industries for more than two decades, Daniel has worked on staff for publications including E! Online, Electronic Media, Mediaweek, Variety, paidContent and GigaOm.