A panel of experts at the Broadcast TECH IP Summit discussed the technology’s potential as a creative tool, through advancements such as object-based broadcasting and remote production
Object-based broadcasting, where elements like music, commentary and camera feeds can be separated and chosen by galleries and consumers alike, is revolutionising how people can access content. Beyond the technical challenges, it has the potential to create new genres and formats, but poses regulatory and storage issues. A panel at the IP Summit organised by Broadcast’s sister title Broadcast TECH grappled with how IP is evolving as a creative tool.
Head of operations, BBC R&D
Like any good 19th century scientist, we started off in IP by experimenting on ourselves.
We’ve been running our all-staff meetings across multiple sites as a live event, using our IP infrastructure. In 2014, we ran parallel coverage in UHD over IP, camera to screen.
At the moment, we’re focusing on this challenge: if Glastonbury is a massive moment for the BBC, what if we wanted to do 50 Glastonburys, without increasing the licence fee?
We’ve been evolving that with the Scottish music festival T in the Park, and with the Edinburgh Festival, to work with logging off single cameras, plugging those into the internet and then creating a very simple web frontend editor that enables us to produce content very simply. What we’re finding from the production communities and from different parts of the BBC is that people want access to this.
The ‘object’ bit, though, comes when you think about content not as strings of half-hour programmes, but as a series of objects where a half-hour programme might be one of many ways of arranging them. Without having to spend more money doing recuts or anything special, you can present it in many different ways.
We’re continuously asking more questions about what we might do with objects.
Global news has holes in its schedules every so often – 30 seconds, 45, maybe a minute and a half. By taking news objects and bracketing edit decisions, you can then put that through an algorithm that automatically generates a curated version.
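A minimal sketch of how that automated gap-filling might work, assuming each news object carries a duration (within its bracketed edit decision) and an editorial priority. The class and field names are invented for illustration; this is not the BBC’s actual system:

```python
from dataclasses import dataclass

@dataclass
class NewsObject:
    title: str
    duration: float  # seconds, within the bracketed edit decision
    priority: int    # editorial weight assigned at production time

def fill_gap(objects, gap_seconds):
    """Greedily pick the highest-priority objects that fit the schedule hole."""
    chosen, remaining = [], gap_seconds
    for obj in sorted(objects, key=lambda o: o.priority, reverse=True):
        if obj.duration <= remaining:
            chosen.append(obj)
            remaining -= obj.duration
    return chosen

pool = [
    NewsObject("Markets update", 20, 3),
    NewsObject("Weather", 15, 2),
    NewsObject("Sport round-up", 40, 1),
]
print([o.title for o in fill_gap(pool, 45)])  # → ['Markets update', 'Weather']
```

A real curation algorithm would weigh editorial sequencing as well as duration, but the principle – objects plus metadata in, a schedule-shaped output – is the same.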
This is about thinking about objects on a higher level – a scene, or series of scenes, that follow each other. It’s part of the way in which a story is told.
You can then extract that and tell it in a different way that allows people to catch up on content.
We used it on Peaky Blinders as a way to bring audiences up to speed with a condensed catch-up of the action before launching series three.
In a cookery show, you could label what you are filming so you understand what the objects are. When placed in a database, those objects can be played out in such a way that the content is presented as it is being cooked – telling the viewer what they are cooking, what ingredients they have chosen, how many people they have to feed, how long it takes to chop the onion.
It’s a teacher in the kitchen that comes out of having made the show. If you do your planning at the outset, you can get three or four products out of the same production.
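One way to picture that labelling, as a rough sketch: each filmed segment becomes a record in a database, and playout scales the walkthrough to the viewer’s party size. The field names and the linear time-scaling are illustrative assumptions only:

```python
# Illustrative only: a tiny in-memory "database" of labelled cookery objects.
# Each record was filmed for a two-person recipe (serves = 2).
steps = [
    {"step": 1, "label": "chop onion", "minutes": 3, "serves": 2},
    {"step": 2, "label": "fry onion", "minutes": 5, "serves": 2},
    {"step": 3, "label": "add stock", "minutes": 1, "serves": 2},
]

def walkthrough(steps, people):
    """Replay the labelled objects in order, scaling timings to the party size
    (a naive linear scale -- real prep times would not scale this simply)."""
    scale = people / steps[0]["serves"]
    for s in sorted(steps, key=lambda s: s["step"]):
        yield f"{s['label']} (~{round(s['minutes'] * scale)} min)"

for line in walkthrough(steps, 4):
    print(line)  # e.g. "chop onion (~6 min)"
```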
What’s interesting is to speak to the production community. When we put it into producers’ hands, they throw back more challenges to optimise that technology.
One of the things I’ve noticed in my career is how at particular moments, there have been pivot points in technology that have radically changed the mode in which we can make content and renewed and refreshed TV itself.
Small DV cameras, the VX1000 in particular, transformed the documentary and we ended up with what became the docusoap. We could create really intense, long-running series that would have been unaffordable before.
The advent of hard-disk storage allowed us to store a huge amount of data and access it in a random access manner, which gave us the fixed rig, while the heli-gimbal transformed natural history filmmaking with Planet Earth.
I think we are reaching a tipping point around live TV, which has always been an incredibly premium product, and a central element of what TV is.
Linear TV, in particular, is having something of an existential crisis, but what keeps on coming through is the value of the special shared moment.
You think about sport and entertainment, but what’s interesting over the past 10 years is how the BBC has really looked at how live can be used in science (with Stargazing Live), in natural history (with Springwatch), and even in very straightforward documentary contexts like Airport Live.
Before, we needed to hire a very expensive studio, pre-wired with a gallery, and if we wanted to step outside, we would need to hire an incredibly large OB truck and take the tie line to whichever natural park Autumnwatch was filming in.
Constraints on technology meant that there were real limitations on what we could offer.
What really floats my boat is that now the entire nation is fixed with camera tie lines. You have them going into every single home and over the air. In the context of factual filmmaking, that becomes extraordinary, because you can suddenly go anywhere and receive a live image and begin to make TV out of it – BBC News can take mobile feeds into news production.
But we haven’t had the tools to begin to craft a switch-edited output, in the manner that you can in a studio gallery or an OB truck, that we can then put live to air.
That’s the final piece of technology that will enable us to produce creative output out of this network of camera tie lines. My mind begins to spin at the idea of a virtual gallery.
We can plug in cameras anywhere in the UK and they pop up on a network that can be seen at a central location in the same way that the switch room in the basement of New Broadcasting House works.
The guilty secret is that a lot of what TV is about, and the stories that people like to watch, doesn’t change that much. People return to the ob docs, glimpses into people’s lives, the stories of crime and justice. But we now have the ability to go to those precincts and tell those stories in a different way.
This is more of a BBC4 brief of arts culture and creativity than for BBC2 – although I am thinking about how the heck I do multi-point Police Live.
Is there a way that we can co-opt Britain’s massive craft communities, making stuff every day all across the country, celebrate it and make it into a TV event? That would have been impossible three or four years ago, but it’s a first step into a completely new way of approaching ob docs.
Consultant; former head of production, ITV Studios
When you talk to creatives about IP, you get a pretty puzzled reaction.
To them, it’s all about rights – brand or format ownership. It raises the question: do directors and producers need to be aware of the plumbing, so long as it’s reliable and flexible?
They should be aware of its capabilities.
How do you write the stories if you don’t know what the opportunities might be?
Creatives are rightly nervous about anything to do with IT and we can’t introduce anything too early.
Any savings hardly ever find their way back to the production team battling with falling budgets – they go back to the broadcaster. Operating expenses? Personally, when I’m on location, I don’t want to share resources with another five clients; I want to know somebody is there supporting me, and only me – and that comes at a price.
Shifts in production methodology only work for certain formats – live, multi-camera sport, entertainment and reality events – and there will be a huge expansion there. Drama? Probably not so much.
It will take time for this to settle in and to get through to the creatives what it means for storytelling. Optional extras, like additional feeds and the information that comes with them, take time and effort to create somewhere within the programme-making process, and that costs money.
If our tariffs are falling, where is the money going to come from to give creatives the freedom to do this?
Remote and centralised production processes enable flexibility but will alter how we do things, not what we do. And if we can overcome this, is this the technology that might actually change our view of the linear scheduled broadcast?
Senior product marketing manager, Dolby
The two words we hear most about object-based video, ‘efficiency’ and ‘flexibility’, are just as applicable in audio.
Object-based audio can deliver a more flexible experience, but it’s IP that ultimately needs to give us the backbone and flexibility on that level.
The first use of objects is to say ‘I have a sound, and I want to produce it differently depending on the replay environment’.
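As a hedged sketch of that idea – a sound plus position metadata, rendered differently per replay environment – the snippet below pans one mono object for stereo or mono playback. This is an illustrative constant-power pan, not Dolby’s renderer:

```python
import math

def render_object(samples, azimuth_deg, layout):
    """Render one mono audio object into per-channel signals for a layout.

    Illustrative only -- not Dolby's actual renderer. azimuth_deg is the
    object's position metadata: 0 = centre, -90 = hard left, +90 = hard right.
    """
    if layout == "stereo":
        # Constant-power pan law between left and right speakers.
        theta = (azimuth_deg + 90) / 180 * (math.pi / 2)
        gains = {"L": math.cos(theta), "R": math.sin(theta)}
    elif layout == "mono":
        gains = {"M": 1.0}
    else:
        raise ValueError(f"unknown layout: {layout}")
    return {ch: [g * s for s in samples] for ch, g in gains.items()}

centre = render_object([1.0], 0, "stereo")
# A centre-panned object reaches both speakers at ~0.707 gain.
```

The point is that the same object and metadata feed every layout; only the rendering step changes per device.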
Our technology Dolby Atmos brings 3D sound, which we use in the cinema – and as more people watch content on tablets and mobiles with headphones, they expect better audio than just a mix of left and right channels.
People want more choices – if your team scores, you want to hear what a great goal it was; to the rival team’s fans, it can be the greatest goal in the world, but you don’t want to hear that.
That’s why every major Premier League team has its own commentary streamed over the web for fanclub members. We could distribute those over the IPTV channels without changing the ‘object’, or the content.
For sports that someone is unfamiliar with, it might be very useful to have commentary that describes and explains everything, but if you know more about the sport, you might not want the same level of detail.
And there are certain elements of a story you don’t always want to cover. If you’ve got 20 minutes to watch a match, you watch the highlights.
Or in scripted shows, you might want a version for adults and a version they can watch with their children that has content cut out or some words substituted for more appropriate ones.
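That kind of versioning can be sketched as variant selection over dialogue objects: each object optionally carries alternatives keyed by viewer profile, and playout falls back to the default clip where no variant exists. Names here are hypothetical:

```python
def assemble(track, profile):
    """Pick the right variant of each dialogue object for a viewer profile.

    Illustrative sketch: 'variants' maps a profile name to an alternative
    clip id; objects without a matching variant play their default clip.
    """
    return [obj.get("variants", {}).get(profile, obj["clip"]) for obj in track]

track = [
    {"clip": "line_01"},
    {"clip": "line_02_strong", "variants": {"family": "line_02_mild"}},
]
print(assemble(track, "family"))  # → ['line_01', 'line_02_mild']
```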
Chief engineer, BT Sport
Most sports feeds are heavily locked down because rights holders are paranoid about their content.
They want to make sure the feed we produce in our trucks looks the same – not just in the UK, but internationally.
Consistency is very important, and the consequence of that is that all sports feeds end up being boiled down to an extremely predictable and reliable – and, hopefully, consistent – experience.
Personalisation of audio is interesting – fans love their commentary. Fans will choose to listen to their team’s commentary while watching a feed from us or Sky. It’s a disruptive experience as there’s no way it will be in synch with the pictures. There’s a missed opportunity there.
It goes beyond the creative and technical challenges, many of which will be solved. Ofcom needs to do some real thinking too – how do I prove to them I’ve delivered a broadcast-compliant programme to every single viewer if everyone is watching slightly different content?
What about how this content is recorded and archived? Do I put in a video file with some data and graphics, and what does my media asset management system look like?
How can I access it quickly when someone asks for a replay? How does playout look when I have multiple assets all flying around in parallel?
Every single sport in the world ends up having English graphics, but the two biggest brands in motor racing, Honda and Suzuki, are Japanese. That market is watching with English-language graphics and listening to English-language commentary.
Object-based broadcasting definitely opens up an opportunity to deliver personalised experiences to those markets.
We need to get the frameworks in place so the rights holders can hand broadcasters elements that we can’t break too far out of, but that allow us to offer the right amount of personalisation; we in turn hand down that combined framework to give consumers enough freedom.
If we fix that problem, we really have got an exciting opportunity.