Shira Kadmon, program manager at Qwilt, looks at how the reliability of live streaming can be improved


It’s that time of the year when many sports fans are glued to their TVs, tablets and phones as multiple football, basketball, rugby, and cricket leagues reach their exciting climax. Two decades ago, satellite networks tended to carry many crucial games exclusively, but today live streaming over IP is becoming the dominant force, with high-profile streamers like DAZN and JioCinema welcoming millions of fans onto services that are only available via OTT.

Apart from freak atmospheric conditions, satellite-based sports coverage is pretty reliable. Yet, a “bad day” on the internet means streaming services can be disrupted, leading to poor quality of experience (QoE) for viewers. If that happens during the final or a crucial penalty shootout, OTT broadcasters can expect subscriber churn – even if the cause was outside their immediate control.

Addressing the key issue

As the clamour for more sports content grows, along with the proliferation of 4K and other dynamic viewing experiences, a bad internet day is becoming harder for streamers to predict and counteract.

The main issue is that end-to-end content delivery over the internet has historically been left to the somewhat uncoordinated actions of multiple actors: from the broadcaster, through content delivery networks, across peering and exchange sites, and finally down the last mile across ISP networks to viewers’ homes. This last-mile leg is where issues can arise, as ISP networks must build excess capacity to deal with the “peak of peaks” in demand. With more sports content consumed via OTT, it is a certainty that a championship game at the end of the season will set a new peak traffic record.

Focusing on the solution

Building more capacity across the ISP core and network aggregation points is most often a “brute force” approach that is expensive and inefficient. Instead, the industry needs to get more efficient at end-to-end content distribution. So the answer is partly technical, but it also relies on the different parts of the media delivery chain that carry sports content from the stadium to the sitting room collaborating more closely.

The technical side is adopting an open edge architecture based on open caching. It works by moving content efficiently across a service provider’s network and caching it closer to the end user. Caching content at the edge of the service provider network reduces buffering, improves video quality, and delivers an overall enhanced user experience. These gains are possible because the edge location offers lower latency, higher throughput, and a shorter time to first frame.

Unlike traditional commercial CDN nodes located centrally in the mid-mile, open caching nodes are deployed as close as possible to users – often tens of miles or even just a few blocks away. This proximity allows applications and content to bypass peering and exchange points, traditionally the biggest chokepoint and roadblock to QoE.
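
To make the idea concrete, the sketch below shows, in simplified Python, how an open caching style request router might steer a viewer’s segment request to a cache node inside their ISP’s network when one covers their address, and fall back to a traditional mid-mile CDN otherwise. The node names, network prefixes and URLs are hypothetical; this illustrates the routing concept only, not any vendor’s actual implementation.

    # Minimal sketch of edge request routing for open caching.
    # All node names and network prefixes below are hypothetical.
    import ipaddress

    # Hypothetical open caching nodes deployed inside the ISP network,
    # keyed by the subscriber prefixes they sit closest to.
    EDGE_NODES = {
        ipaddress.ip_network("203.0.113.0/24"): "edge-cache-1.isp.example",
        ipaddress.ip_network("198.51.100.0/24"): "edge-cache-2.isp.example",
    }

    # Fallback: the broadcaster's traditional commercial CDN in the mid-mile.
    ORIGIN_CDN = "cdn.broadcaster.example"

    def route_request(client_ip: str, segment_path: str) -> str:
        """Return the URL a viewer should fetch a stream segment from."""
        addr = ipaddress.ip_address(client_ip)
        for prefix, node in EDGE_NODES.items():
            if addr in prefix:
                # Served from inside the ISP network: the request never
                # crosses the peering and exchange points.
                return f"https://{node}{segment_path}"
        # No nearby cache covers this viewer: fall back to the mid-mile CDN.
        return f"https://{ORIGIN_CDN}{segment_path}"

    # A subscriber behind a covered prefix is steered to the edge node...
    print(route_request("203.0.113.42", "/live/final/segment_1001.m4s"))
    # ...while everyone else is served from the mid-mile CDN.
    print(route_request("192.0.2.10", "/live/final/segment_1001.m4s"))

In practice the routing decision is made by the delivery platform rather than a simple prefix lookup, but the principle is the same: keep the traffic on the ISP side of the peering point whenever an edge cache can serve it.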

What’s next?

But the technology alone is not enough. There needs to be a platform agreement to ensure that sports broadcasters and ISPs can understand what content needs to be cached and served from the edge – along with a mechanism to equitably compensate ISPs for this improved level of service. More than 175 content and service providers are now on board with Qwilt, including the likes of Airtel and Telefonica, together serving over one billion unique subscribers globally. This year, our federated CDN is expected to reach 200 Tbps of capacity.

Sports is just one example of where open caching is making a real difference, helping to improve the streaming experience – for fans, broadcasters, and ISPs. Unlike kicking, shooting, throwing, or hitting a ball – this is a game where everybody wins.


Shira Kadmon is program manager at Qwilt