Developments such as 5G, AI and remote working are set to revolutionise the way producers capture, edit and deliver programming. Adrian Pennington takes a peek into the future
5G AND HIGH-SPEED DELIVERY
Though mainstream adoption of 5G is still years away, the potential benefits are tantalising – and the TV and film industries will be among the biggest beneficiaries of the increased speeds.
With speeds of up to 10 gigabits per second, there will be virtually no delay in relaying signals from A to B – and 5G technology will help producers replace costly on-site broadcast facilities with a lighter footprint in terms of equipment and personnel.
Sport is one obvious genre that will benefit, with live pictures from pitchside cameras reaching studio hubs more cheaply using 5G signals than satellite. But it’s not the only one.
“If I want to broadcast a live interview with the prime minister from an election bus, I can with 5G,” says Ian Wagdin, senior technology transfer manager at the BBC. “Trying to achieve that currently is a logistical nightmare.”
Similarly, technicians could set up remote-controlled cameras at council chambers on election night and bring all the feeds back for mixing live. “It allows us to create more content with fewer staff at less cost,” says Wagdin.
Cameras linked to 4G cellular networks are widespread today, but they are unreliable and compete with public traffic. 5G can guarantee connectivity through network slicing – the assignment of a dedicated slice of network capacity to a single customer.
Provisioning the network for genres such as breaking news programming should, in theory, be as quick and easy as going to a web page, entering a postcode and specifying an amount of bandwidth. Producers will only pay for what they use.
“5G will enable a broadcast-grade network,” says BT Sport director of mobile strategy Matt Stagg. “The real gamechanger is when you combine remote production with producers who understand how to make the most of the creative flexibility of the technology.”
Wireless camera technology will be increasingly used on reality and live entertainment shows, reducing – and eventually eliminating – cabling and fixed lines. Meanwhile, aerial filming by drone could be controlled from a central location over longer distances.
The 5G network is not purely cellular but a combination of mobile and other infrastructure including fibre and satellite – and this could mean live broadcasts from aircraft, for example.
Last September, Sky Sports did just that when it aired a live interview with the Alfa Romeo Formula 1 team from onboard a Singapore Airlines plane at 35,000ft, with signals relayed over Inmarsat’s Ka-band satellites. The in-flight entertainment industry wants network slicing too. However, there is a fly in the ointment.
Broadcasters and production teams currently use radio spectrum on a daily basis to produce both recorded and live content. That spectrum will be affected by 5G rollout throughout the production chain, from radio mics to satellite contribution.
“Unless you think broadcast production in future will look like a Skype or FaceTime call, we need to maintain access to this spectrum to meet the needs of quality broadcast,” says Wagdin, who chairs the EBU’s 5G in content production group.
“Perhaps we will see new business models emerge where companies lease and manage spectrum and then offer it back as network slices for broadcasters and producers when they need it.”
WORKING IN THE CLOUD
Several sources expect high-speed internet and access to the cloud will drive innovation around working practices way beyond live streaming. “We will no longer need expensive GPU [graphics processing unit]-accelerated machines,” says Faraz Osman, founder and managing director of indie Gold Wala.
“Instead, it will be possible to turn around edits with browser-based access to software. Rather than rendering material locally, you will be able to lease processing horsepower from a server farm.”
With less need to invest in physical equipment such as workstations, indies and post houses will be freed from dependence on Soho.
“Small indies like mine will be able to afford to create massive edits that are today only open to the biggest production companies and film studios,” says Osman. “It will level the playing field and allow creativity to rise to the top.”
In practice, productions could scale equipment, freelance talent and software as required, keeping costs lower, while production teams could easily be distributed across geographies and time zones.
“If a production shoots in Ireland and the unedited original takes are uploaded to a cloud-based platform, a post house could edit the takes in New York within hours, which saves valuable time in releasing a project,” says Ray Panahon, head of technology at IT services provider InterVision.
Rather than having to travel to work in the same suite, an edit producer in Leeds could instruct an editor who might be in London, or even another country, in real time. Osman says such connectivity could make offline assembly “significantly cheaper” by making it possible to outsource work.
With cameras connected to the cloud, footage can be automatically ingested, logged and accessed far quicker than previously – and potentially by anyone working on a production. “New working models will emerge,” says Osman. “Editors will instantaneously receive shots and send back rough cuts before shoots even wrap.”
That goes for location shoots too. A production in rural Devon might pay to ring an area with a temporary private 5G network so that rushes can be transferred to the cloud and feedback received from executives.
“Producers have a massive fear of things not being recorded and use dailies to ensure they don’t miss a shot,” says Wagdin. “The ability to instantly send, store and review material will close that gap down.”
“There’s a risk of a murky waterfall of notes and change requests as clients’ ability to remotely quarterback productions becomes easier, but the opportunity 5G can offer is thrilling,” says Osman.
The timelines between production and post are already being compressed on film sets – for example, allowing directors to make better creative decisions by letting them see CG animated characters or backgrounds blended with live action.
As functions like colour grading also get closer to the camera, post houses themselves might morph into new creative agencies.
“Collaborative working processes will revolutionise the way we make programmes, but it’s not clear at this stage what a collaborative edit timeline looks like,” says Osman.
AI AND AUTOMATION
Artificial intelligence (AI) can be used to expand coverage at a live event, while keeping budgets low. After setting up cameras at a venue, it will be possible to switch between camera feeds and publish the stream to different outlets at the touch of a button, says lead BBC R&D engineer Matthew Brooks.
BBC R&D staff are training algorithms on archive material to understand underlying cinematography decisions and how directors, camera ops and vision mixers use their skills to create live programming. Brooks emphasises that the focus for the BBC is on audience reach.
“We don’t expect the kind of quality you would get from a highly trained human crew,” he says.
Such a system works best on formulaic presentations such as match highlights or panel shows. “The AI can be trained to look for faces, identify who is talking and to use the rules of how panel shows are shot to cut sequences with the right pacing,” says Brooks.
“Any production that requires interaction between the subjects and the producers is beyond the scope of AI development at this point. Nobody has come up with a good way for computers to tell compelling stories.”
Nonetheless, assisted automation using algorithms for speech-to-text and facial recognition will be increasingly used to drive human edit decisions. Footage will be easier to find once the system has classified it, informing the editor what kind of shot it is and who’s in the frame.
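At its simplest, the classification described above – shot type, who is in frame, what was said – amounts to building a searchable index over the rushes. A minimal sketch of that idea (the `Shot` fields, sample log entries and names are invented for illustration, not any broadcaster's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Shot:
    """One logged shot in the rushes, tagged by automated analysis."""
    clip_id: str
    start: float          # seconds into the clip
    end: float
    shot_type: str        # e.g. "close-up", "wide"
    faces: list = field(default_factory=list)  # names from facial recognition
    transcript: str = ""  # from speech-to-text

def find_shots(shots, person=None, keyword=None):
    """Return shots featuring a given person and/or a spoken keyword."""
    results = []
    for s in shots:
        if person and person not in s.faces:
            continue
        if keyword and keyword.lower() not in s.transcript.lower():
            continue
        results.append(s)
    return results

# Hypothetical auto-logged rushes
log = [
    Shot("A001", 0.0, 12.5, "close-up", ["Presenter"], "welcome back to the show"),
    Shot("A001", 12.5, 30.0, "wide", ["Presenter", "Guest"], "tell us about the project"),
]
hits = find_shots(log, person="Guest")
```

An editor querying such an index gets straight to "every wide shot where the guest speaks" rather than scrubbing through hours of footage.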
The industry has begun a trend towards personalising content with recommendations and targeted ads. The endgame is for every individual to receive streams of content finely tuned to their tastes, devices and viewing environment – this is formally known as ‘object-based broadcasting’ (OBB).
For example, content could be produced at variable lengths to suit user requirements for programming that more exactly fits commute times. “We’re moving away from 30-minute and one-hour media chunks towards a temporal version of content,” says the BBC’s Wagdin.
In July, the BBC gave the 1,000th episode of technology magazine show Click an interactive twist using object-based media tech, which allowed online viewers to choose the stories they wanted to watch and how much detail they wanted to receive.
BBC R&D experiments include production tool Squeezebox, which aims to assist production teams in the rapid re-edit of content to run to a shorter or longer duration, or to target multiple durations from a single edit.
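Conceptually, targeting multiple durations from a single edit means marking segments with priorities and dropping the least important until the cut fits the slot. A toy sketch of that idea (the greedy drop rule, segment names and priorities are illustrative assumptions, not how Squeezebox actually works):

```python
def fit_to_duration(segments, target_s):
    """Pick segments to fit a target running time.

    Each segment is (name, duration_s, priority); lower-priority
    segments are dropped first. Essential material (titles, the
    story spine) uses priority float('inf') and is never dropped.
    """
    chosen = list(segments)
    total = sum(dur for _, dur, _ in chosen)
    while total > target_s:
        # drop the lowest-priority segment still in the cut
        candidate = min(chosen, key=lambda s: s[2])
        if candidate[2] == float('inf'):
            break  # only essential material remains
        chosen.remove(candidate)
        total -= candidate[1]
    return chosen  # segments keep their story order

inf = float('inf')
rushes = [("titles", 30, inf), ("item_a", 120, 2),
          ("item_b", 90, 1), ("goodbye", 20, inf)]
short_cut = fit_to_duration(rushes, 180)  # drops item_b to fit
```

A real tool would also have to smooth the joins between surviving segments – which is exactly where Wagdin sees a role for AI, as the next quote suggests.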
“When you produce content that is not one rendition but many millions of versions all subtly different, then AI has a role to play in ensuring the transition between segments always works to give the user a good experience,” Wagdin says.
The BBC’s OBB toolkit is now at a point where indies can use it. “It’s a cloud-based suite that we are looking to open source,” says Brooks. This includes tools for creating interactive shows, which allow viewers to make narrative choices.
However, production teams will need to change their existing working practices to gain the benefits. “We wonder how conquerable that is,” Brooks says. “Everyone has their own workflows but if we’re going to make new interactive experiences, we need to embrace new production methods.”
Netflix created bespoke software to manage narrative branching for its interactive drama Black Mirror: Bandersnatch. It helped editor Tony Kearns simulate the viewing experience by clicking on links for options embedded in the script.
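The structure Kearns was navigating is, at bottom, a directed graph of scenes joined by viewer choices; simulating a viewing is a walk through that graph. A minimal sketch (scene and choice names are invented; this is the underlying idea, not Netflix's bespoke software):

```python
# A branching script as a directed graph: each node is a scene,
# each edge a viewer choice.
story = {
    "opening": {"choices": {"option_a": "scene_a", "option_b": "scene_b"}},
    "scene_a": {"choices": {"continue": "finale"}},
    "scene_b": {"choices": {"continue": "finale"}},
    "finale":  {"choices": {}},
}

def simulate_viewing(story, picks, start="opening"):
    """Follow a list of choices through the graph, as an editor might
    when checking that every path through the story cuts together."""
    node, path = start, [start]
    for pick in picks:
        node = story[node]["choices"][pick]
        path.append(node)
    return path
```

Clicking an option in the script, as Kearns did, corresponds to following one edge; checking the whole show means walking every path from the opening to each ending.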
“Anyone embarking on interactive show production should assign someone with a knowledge of coding to the edit suite,” Kearns says.
The more interactive and personalised content becomes, the more questions around rights ownership arise, however.
“Is it good or bad for piracy if everything is an individual version of content?” asks Wagdin. “What is the commercial value of a product created with a user’s personal data? Who owns the copyright to content if an individual’s own data has had a creative input into it?”
VIRTUAL SETS
The expense and variable graphic quality of virtual sets have limited their use so far, but that could change with the integration of sophisticated games engines.
“In the past, broadcasters would spend huge amounts of money on virtual set infrastructure, which would be permanently installed and tied to delivery of one show,” says Dock 10 head of studios Andy Waters.
“The speed at which you can now build virtual set environments allows more productions to create fantastic spaces at a fraction of the time and cost.”
BBC1’s Match Of The Day uses Dock 10’s virtual studio on the weekend and shares it with Blue Peter during the week. The MediaCityUK facility plans to roll out the tech across all its studios.
“We can pop up a full virtual studio in any of our spaces in a day and put those tools into the hands of the entertainment community,” says Waters.
It may be challenging for a live audience to sit in front of an entirely greenscreen studio, but the virtual set can feature physical and augmented-reality elements – a technique Fremantle and The Future Group (TFG) pioneered for mixed-reality gameshow Lost In Time.
TFG recently went one better and delivered the first live broadcast containing real-time ray tracing – an image rendering technique that produces highly realistic CG lighting effects but consumes huge quantities of rendering power – and real-time facial animation.
Using souped-up graphics cards, and by blending CG with standard video frame rates, TFG animated and live streamed an augmented-reality character who was interviewed by a human presenter during a regional esports finals in Shanghai.
8K AND ULTRA HD
With high-end drama and blue-chip documentaries now likely to be commissioned with a 4K high dynamic range finish as standard, the next frontier is 8K UHD. Japanese broadcasters have pioneered such transmissions, mostly in the sport and factual space.
“We’re shooting with multiple Red cameras in 8K, but what surprised us is that the amount of data we have to manage is only 50% more than 4K,” says Tom Cooper, technical operations manager at Bristol indie Icon Films.
The company is producing feature-length doc Okavango with Botswana’s Natural History Film Unit for Japanese public broadcaster NHK.
“It’s being recorded in a compressed format out of the camera, which reduces the bitrate and means we don’t have to buy special equipment or change our workflow,” says Cooper.
Japanese pay-TV service Samurai Drama Channel opened the door to 8K drama with its production of The Return, which screened at Mipcom. As with the leap from SD to HD, production elements such as costume and make-up will need an upgrade to cope with the higher resolution.
“That’s not the case with wildlife filming, where the more realistic the better,” says Cooper.
The demand for 8K distribution should be put into perspective. In Japan, only 0.1% (62,000 people) of the population will have an 8K TV set to watch the Tokyo Olympics next summer, according to IHS Screen, and even by 2023, the bulk of the country will rely on HD and SD transmissions.
Strategy Analytics predicts that by 2023, 1.7% of homes in western Europe will own 8K displays.
The 8K format is, however, making headway in production. Information about pictures acquired in the highest possible resolution will trickle down to lower-resolution outputs. “An 8K UHD image down-converted to HD will benefit from the better-quality source,” says Cooper.
In live production, 8K cameras will initially be deployed for the economies they can deliver. A single 8K camera, for example, has enough resolution for a production to extract multiple lower-resolution images, saving on cameras and camera operators.
Similarly, BT Sport has demonstrated how it might capture a football match in 8K and produce 4K and HD streams from that source to avoid having separate, costly outside broadcast chains.
The union of 8K capture and 5G delivery will also give live 360-degree applications their best shot at taking off. Key proofs of concept here include this year’s French Open tennis, where France Télévisions and Gallic pay-TV service Orange streamed live 8K content to mobile devices.
The largest such production yet will take place next summer, when Intel and NTT DoCoMo help distribute 8K VR coverage of multiple Olympic events from Tokyo.