Video Intelligence Framework enables sports organisations to embed AI models directly into live streaming workflows to generate real-time metadata, clips and alerts

Wowza

Wowza has launched Wowza Video Intelligence Framework, which enables sports organisations to apply AI inside live streaming workflows. It generates real-time metadata, clips, alerts, and machine-readable event signals while streams are still live. 

Wowza Video Intelligence Framework runs alongside Wowza Streaming Engine. It extracts frames from live streams, routes them to AI models for inference, and converts the results into structured outputs that downstream systems can immediately use.

A single moment in a stream can simultaneously generate metadata for ad targeting, trigger a clip, fire a webhook, and emit machine-readable event signals to other systems.
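That fan-out pattern, where one inference result produces several structured outputs at once, can be sketched roughly as follows. This is an illustrative example only; the event fields, action shapes, and thresholds here are assumptions, not Wowza's actual API.

```python
# Hypothetical fan-out: one AI-detected moment in a live stream becomes
# multiple downstream actions (metadata, a clip request, a webhook).
from dataclasses import dataclass


@dataclass
class StreamEvent:
    stream_id: str
    timestamp: float   # seconds into the live stream
    label: str         # e.g. "goal", inferred by an AI model
    confidence: float


def fan_out(event: StreamEvent) -> list[dict]:
    """Convert one inference result into several structured outputs."""
    actions = []
    # Metadata that ad targeting or search indexing can consume.
    actions.append({"type": "metadata", "stream": event.stream_id,
                    "tag": event.label, "at": event.timestamp})
    # A clip request covering a window around the moment.
    actions.append({"type": "clip", "stream": event.stream_id,
                    "start": max(0.0, event.timestamp - 10.0),
                    "end": event.timestamp + 5.0})
    # A machine-readable signal, only for high-confidence detections.
    if event.confidence >= 0.8:
        actions.append({"type": "webhook", "payload": {
            "event": event.label, "stream": event.stream_id}})
    return actions
```

The point of the pattern is that the inference step runs once per moment, while each downstream system gets its own purpose-shaped output.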

Wowza Video Intelligence Framework can also detect degraded image quality, obstructed lenses, and misaligned feeds in real time, routing alerts into monitoring dashboards and operations workflows before a broadcast is disrupted or a subscriber notices.
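A stream-health check of the kind described might look like the sketch below: flag frames whose pixel variance collapses (a flat, black, or obstructed image) and emit an alert payload for a monitoring dashboard. The variance heuristic, threshold, and field names are assumptions for illustration, not Wowza's implementation.

```python
# Illustrative degraded-image detector: very low pixel variance often
# indicates a covered lens or a dead feed.
from typing import Optional


def frame_variance(pixels: list[int]) -> float:
    """Variance of a frame's (grayscale) pixel values."""
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)


def check_health(stream_id: str, pixels: list[int],
                 min_variance: float = 50.0) -> Optional[dict]:
    """Return an alert payload when the frame looks degraded, else None."""
    var = frame_variance(pixels)
    if var < min_variance:
        return {"stream": stream_id, "alert": "degraded_image",
                "variance": round(var, 1)}
    return None
```

In practice such a check would run continuously against sampled frames, with the returned payload routed to a webhook or dashboard rather than inspected by hand.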

Users can bring their own AI models and tailor detection logic to their specific use cases, evolving their workflows over time without rebuilding the streaming infrastructure underneath.
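One common way to support bring-your-own-model extensibility is a small plug-in interface: the pipeline routes frames to whatever models are registered, so detection logic can change without touching the streaming layer. The interface and class names below are hypothetical, not Wowza's extension API.

```python
# Hypothetical plug-in pattern: user-supplied models implement one method,
# and the pipeline stays unchanged as models are swapped in and out.
from typing import Protocol


class FrameModel(Protocol):
    def infer(self, frame: bytes) -> list[str]:
        """Return labels detected in a single frame."""
        ...


class Pipeline:
    def __init__(self) -> None:
        self.models: list[FrameModel] = []

    def register(self, model: FrameModel) -> None:
        self.models.append(model)

    def process(self, frame: bytes) -> list[str]:
        # Route the frame to every registered model and merge the labels.
        labels: list[str] = []
        for model in self.models:
            labels.extend(model.infer(frame))
        return labels


class JerseyNumberModel:
    """Toy stand-in for a user-supplied detection model."""
    def infer(self, frame: bytes) -> list[str]:
        return ["jersey:10"] if frame else []
```

Swapping in a new use case then means registering a different `FrameModel`, while the streaming infrastructure underneath stays as-is.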

“Live video is the most valuable, most perishable asset in media and sports, and most of it still goes unused,” said Krish Kumar, CEO of Wowza. “The moment passes before anyone can act on it. Video Intelligence Framework puts AI inside the live workflow, where the value actually is, so teams can detect what’s happening and do something about it while it still matters.”