Broadcast Tech spoke to senior figures at the conference about what’s next for the industry

SIGGRAPH 2022

SIGGRAPH returned this month, with the conference taking place in Vancouver, Canada, from 7-11 August.

Broadcast Tech was in attendance to find out the latest in VFX and virtual production, with the latter a big talking point at this year’s show. A round-up of thoughts from senior figures across the industry can be found below.

Amazon Web Services

The AWS stand at SIGGRAPH showed almost an entire post-production workflow, from start to finish, completed in the cloud. King, AWS’s worldwide strategy and business development leader for content production in media and entertainment, explained: “We have a strategy to shift the narrative from the traditional thinking around cloud for post-production, where it’s largely been centred around a couple of individual piecemeal workloads… to looking at holistic post-production. What would it take to post an entire show in the cloud?”

Predictably, colour has been one of the issues, and King added: “One of the first gaps we recognise is that the finishing phase is the most difficult one to do in the cloud… so we started a project that we call Colour In The Cloud, we announced it at NAB, and in tandem with FilmLight, we’ve created a pipeline to enable colour grading and finishing in the cloud.”

Overall, fully cloud-based post-production will be an ongoing strategy for the company: “We’re definitely seeing trends, customers starting to move more workloads to the cloud, and customers showing interest in seeing more end-to-end post-production.”


Autodesk

Standardisation is a big push for Autodesk, which did not have a stand at SIGGRAPH this year, but did have a presence. Maurice Patel, VP of media and entertainment strategy at the company, revealed: “The kinds of problems we’re trying to solve on the production management side of things are how do we tie this together? So that at least if you’re building on things, such as Unreal Engine, you’re building on something that’s common that can be shared.”

With this in mind, Autodesk has open-sourced its review tool, RV. It is one of a number of tools being considered by the Academy Software Foundation to become the standard review tool across the industry, alongside other options from Sony and DNEG – the latter covered below.

In addition, Autodesk is “re-architecting” the entire Autodesk software portfolio to be more hybrid and keep up with the move to cloud, as well as continuing to expand creative tools such as Maya.

Moving on, Patel identified: “The biggest problem now is that even though we saw increases in production budgets, the amount of content created for that budget is increasing faster than the actual budget.

“One of our goals is trying to make that easier so you don’t have to always throw more people at it. Can it be automated? Can AI help? Can we remove the redundancies? At the same time, can we make it more predictable so that at least you can make better estimates of resource demands? And give more transparency on what the real production cost is?”

DNEG


DNEG is making its own push towards a standardised review and playback tool with xStudio. Chas Jarrett, creative director and senior VFX supervisor, explained the creation process: “We decided to write a new, more modern playback system using our experience over the years growing other ones. It tries to consolidate all the features that we use or want from all those other tools and consolidate it all down into one.”

The company plans to open source the tool in December, “no matter where we’re at with it,” Jarrett confirmed.

On the impact of virtual production on VFX, and vice versa, Jarrett believed it could be a big opportunity for both sectors: “Real-time tools have always been this tantalising thing in the future… now there’s this great group of people in virtual production, who are making beautiful images that need to go into our movies and TV shows, but they’re focused on doing it in real time, or in an on-set environment. Suddenly, they’re putting together teams and bringing together a lot of expertise and kind of figuring out how to do that.”

DNEG has virtual production stages in London and Los Angeles, as well as a large virtual art department to take advantage of this.

Framestore

Michael Stein, CTO of Framestore, placed a similar emphasis on the ties between virtual production and VFX. He said of the challenges: “When you go on set, there’s 5, 10, 15 companies that own bits of it, and so we’re all trying to figure out how we work in this really flexible environment… I think the next trick is a little bit on the tech side and the tools, which we’re doing a lot on, especially in real time, but also how do we go on set and interact with this variety that you’re going to find?”


In addition to its headline production work, Framestore has also been “doing a lot of work” on virtual cinematography and scouting, which allows users to “bring in geographical data or stuff from Google Maps, etc. and let directors or productions scout and actually, when you’ve got those sets digitally, start to plan out their shots and do that work.”

Framestore will tie this tightly to its VFX work: “We’re not going to build out a separate part of the company with thousands of artists that are doing this while we also have our visual effects artists; this just becomes the way we work. Just like artists sometimes have to know Maya, Houdini, Nuke and other tools, Unreal just becomes another tool in that toolbox for us.”

Going forward, Stein, who has a background in gaming, pointed to that sector for the future of VFX: “Instead of making a linear movie, or an immersive experience or an AR experience, in the future, you build a world and you tell a story in that world. That story might be told by a Hollywood director who’s crafting the world, but you could also take that world and turn it over to fans and users and say, ‘tell us what stories you would tell, here are the assets’, or advertising, or other stories, or episodic, or immersive marketing experiences, or even games out of that world. It’ll be interesting to see whether visual effects companies almost have to become game companies, because in some ways, game companies are there already. They may have to migrate more the other way, in a sense. I think we’ll probably end up meeting in the middle.”

Epic Games

Epic Games made a presentation at SIGGRAPH on the latest updates to Unreal Engine, with MetaHumans one of the major additions. Further broadcast-specific updates are expected in the next 12 months, with business development manager Miles Perkins telling Broadcast Tech that the company sees the industry as a “massive part of what we do.”


Perkins pointed to both AR and in-camera effects that have been used by the likes of ESPN, the Weather Channel, and Nickelodeon in the US, as well as predicting that mixed reality could be the next step: “I think we’re going to see that a lot more in the future,” he said. “I think we’ve got to take baby steps in getting there because there are some other hurdles that you have to get through to be able to do these things simultaneously.”

However, he did have some words of caution, pointing out that younger audiences have a “different way of consuming media. I think that if you think about that in the context of having a broadcast with this mixed reality thing, the only thing that I’m going to ask is, what audience is that for? What are we solving in terms of what they want?”

Foundry

Foundry head of research Dan Ring was fresh from testing the company’s in-camera VFX and post-production processes on naval feature film Comandante. The production had to use a low-resolution LED volume to deal with the amount of water on set, leaving Foundry with the job of replacing and improving the low-res backgrounds.


While this tested Foundry’s virtual production workflows, Ring said of the split between virtual production and traditional VFX: “It’s definitely not one or the other. The way that we’re addressing this is by building extra tools and components to build out the continuum of it.”

He added: “The line between production and post-production has always been challenging, and what we’re building are tools to help bridge those worlds together. The key way that we’re doing this is through the use of metadata… We’ve built tools for capturing metadata and storing it to a single source of truth, and then delivering it to the artist, into Nuke in our case, but it can be delivered to whoever needs it - capturing camera information, lens metadata, mocap information, decisions on lights etc.”
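The pattern Ring describes is straightforward to illustrate. Below is a minimal sketch in Python of one hypothetical shape such a per-shot record could take, gathering camera, lens, mocap and lighting metadata into a single JSON file that any downstream tool can read; the schema, field names and values are invented for illustration and are not Foundry’s actual tooling.

```python
import json
from dataclasses import dataclass, asdict, field

# Hypothetical per-shot metadata record, illustrating the kind of
# "single source of truth" Ring describes. All field names and values
# here are invented for illustration, not Foundry's actual schema.
@dataclass
class ShotMetadata:
    shot_id: str
    camera: dict                 # e.g. body, frame rate, sensor size
    lens: dict                   # e.g. focal length, focus distance
    lighting: dict               # decisions on lights made on set
    mocap_takes: list = field(default_factory=list)

    def to_json(self) -> str:
        # Serialise the record so it can be handed to whoever needs it,
        # such as a script running inside Nuke on the artist's side.
        return json.dumps(asdict(self), indent=2)

record = ShotMetadata(
    shot_id="ep101_sh0040",
    camera={"body": "Sony Venice", "fps": 24, "sensor_mm": [36.0, 24.0]},
    lens={"focal_mm": 32.0, "focus_m": 2.4},
    lighting={"key_light": {"intensity": 0.8, "colour_temp_k": 5600}},
)

# One file per shot acts as the single source of truth downstream.
with open("ep101_sh0040_metadata.json", "w") as f:
    f.write(record.to_json())
```

In a real pipeline the same record would be read back on the artist’s side, so that on-set decisions travel with the shot rather than being re-keyed by hand.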

Ring also backed machine learning to have a big impact on the industry: “It’s mind-blowing, these are things that an artist could not do in the past, and so we’re certainly seeing machine learning in these cases. It’s assisting artists, not replacing them. You still need that creative vision.”

He added: “The tension between artists and AI is a big challenge. We believe that the future is assisting the artist, but what ethical safeguards do we need to put in place to make sure that we don’t accidentally invent that ‘make the movie’ button?”