Will Strauss looks at a new research project that could result in technology capable of producing 'intelligent' interactive programming.

After being ticked off by a reader last week for being too generic with my recession blog, I thought I'd better discuss something very specific this time out. The Olympics - a breeding ground for new toys and technology - starts on Friday, so let's talk innovation.

Amongst all the technological hullabaloo of Beijing 2008, I think I've found a hidden gem: something that is worth investigating further (even if there is always a chance it might come to nothing).

LIVE is an "integrated" research project that is partially funded by the European Union. It is attempting to produce "intelligent" interactive programming, and the first trial is taking place during the Olympics.

From what I can work out, it's potentially a new way of streaming multiple video signals to the home via IPTV that allows viewers to control what they watch - and allows broadcasters to tailor their service as they go along.

Suited to big events where there is a lot going on (like the Olympics or a music festival), it's “non-linear, multi-stream and real-time” (it says here) and it adapts to the interests of the consumer. Well, IT doesn't, whatever IT is. The broadcaster adapts the service to the interests of the viewer.

It's a combination of production tools and content formats.

You can watch a video about it at http://www.ist-live.org/.

How does it work?
Ordinary AV media is enriched with sophisticated metadata; archive material and live streams are linked together in real time; and feedback from TV viewers is taken into account along the way so that timely programme adjustments can be made.
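The project doesn't publish its data model here, so purely as an illustration of the idea, here is a minimal sketch (in Python, with made-up clip IDs and tags) of what "enriching media with metadata and linking live streams to archive material" could look like: each clip carries descriptive tags, and related archive clips are ranked by tag overlap with the incoming live segment.

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    """An AV clip (live or archive) enriched with descriptive metadata tags."""
    clip_id: str
    tags: set = field(default_factory=set)

def link_to_archive(live, archive, min_overlap=1):
    """Rank archive clips by how many metadata tags they share with the
    incoming live segment, so related material can be surfaced instantly."""
    scored = [(len(live.tags & a.tags), a) for a in archive]
    ranked = sorted(scored, key=lambda pair: -pair[0])
    return [a.clip_id for score, a in ranked if score >= min_overlap]

archive = [
    Segment("arc-100m-2004", {"athletics", "100m", "final"}),
    Segment("arc-opening-2004", {"ceremony", "opening"}),
]
live = Segment("live-100m-heat", {"athletics", "100m", "heat"})
print(link_to_archive(live, archive))  # most-related archive clips first
```

In a real system the "tags" would presumably be richer, time-coded metadata generated by automatic analysis rather than hand-written sets, but the matching principle is the same.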

Who is doing the tests?
The LIVE production system will be tested by ORF (the Austrian Broadcasting Corporation) during the Olympic Games. A total of 500 Austrian households will view and interact with the "LIVE Olympic Show". Over a two-week period a total of four interlinked channels will be produced.

Who is behind it?
The LIVE consortium consists of the coordinator Fraunhofer IAIS and nine research partners from five different countries including the University of Bradford.

Here's some blurb from the 'brochure':

  • In the time-critical production process of live broadcasting there is little time to search databases for new material, so archival content is usually pre-selected. This places a constraint on the ability of the production team to respond to unforeseen events or even satisfy creative impulses during a live broadcast. The innovation behind LIVE, therefore, is the ability to analyse, link and recommend content from multiple content sources in the spontaneous and fast-moving environment of the live broadcast.

  • During the live broadcast, the LIVE system automatically analyses and aligns content coming in from the multiple incoming streams and the available archive material. Additionally, feedback coming in from the TV viewers (switching behaviour and on-screen polls) is analysed. The meaningful connections between viewer preferences and the analysed video material are then processed in real time and fed into the control room to guide the production process.
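The second bullet describes a feedback loop: switching behaviour and poll votes are aggregated into a picture of what viewers want, which the control room can act on. The blurb doesn't say how that aggregation works, so this is a deliberately naive sketch (channel names and weighting are my own invention): both signal types are tallied into one popularity count and the top channels are reported.

```python
from collections import Counter

def preferred_channels(switch_events, poll_votes, top_n=2):
    """Combine channel-switching behaviour and on-screen poll votes into a
    simple popularity ranking that could guide the production process."""
    tally = Counter(switch_events)  # each switch-to event counts once
    tally.update(poll_votes)        # poll votes add to the same tally
    return [channel for channel, _ in tally.most_common(top_n)]

switches = ["swimming", "athletics", "swimming", "gymnastics"]
votes = ["athletics", "athletics", "athletics"]
print(preferred_channels(switches, votes))
```

A production version would presumably weight the two signals differently and work over a sliding time window, but even this crude tally shows how passive behaviour and explicit polling can be merged into one steer for the gallery.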

On the surface it is fairly difficult to see the difference between this and, say, the BBC's streaming service.

However, it does appear to offer some real control from a production point of view, and the ability to tailor streams to viewers' requirements. I have something similar when I am editing a website: I can tell what you are or aren't clicking on, and can therefore adapt the running order, add extra content or remove stories accordingly.

If this project ends up creating a product that gives television production galleries or control rooms that same kind of instant control, they could be on to something.

I'd love to know what you think of it, good or bad. Watch the video and then why not have your say below.

[Thanks to William Cooper at informitv.com, who tipped me off - unknowingly - about LIVE.]