The third in our series of videos providing invaluable insight into how to make the most of virtual production


In the third part of the Virtual Production Innovation series, Dimension Studios explains what Simulcam is and how seeing 3D CGI assets in real time benefits actors and directors, compared with traditional production, where CGI assets are added during post-production.

“Simulcam aids live-action filmmakers with the integration and interaction of 3D CGI assets.”

The project was directed by Paul Franklin, creative director at DNEG. He said: “Speaking as a filmmaker, the Sky Virtual Production Innovation Project was a dream come true. Working with a state-of-the-art LED volume, I was able to explore a wide range of dramatic scenarios ranging from urban locations through natural landscapes to deep-space sci-fi worlds – all on the same stage. The photographic realism that can be achieved is astonishing – I can’t wait to shoot in the Volume again.”


VIRTUAL PRODUCTION INNOVATION SERIES – SIMULCAM

Broadcast teamed up with virtual production and volumetric capture specialist Dimension to present this six-part series of short films providing insight into shooting in LED volume stages.

The series is focused on the ‘Virtual Production Innovation Project’, which was designed to explore the state of the art in virtual production, testing different workflows and scenarios.

The project was conceived and produced by Dimension, DNEG, and Sky Studios, and filmed at Arri’s Uxbridge studio with support from partners Creative Technology and NEP Group. The aim is to share insights and expertise with the creative community, while helping to upskill the next generation of creative talent.

The third episode in the series is below:

Callum Macmillan, CTO, Dimension

Dale McCready, cinematographer

The series of six films explaining different learnings from the project will be released weekly throughout March and April, exclusively on Broadcastnow.

Each episode discusses the tests the team has undertaken and demonstrates some of the advantages of virtual production in capturing in-camera VFX and final pixel shots using an LED volume.

Alongside the film, we present an exclusive interview with one of the key members of the production team, talking about the insights offered in that week’s episode and the lessons they learned during the making of the Virtual Production Innovation Project films.

The series will cover lighting, tracking, Simulcam, in-camera effects, scanning, and 360 photospheres. Broadcast’s interview with Dimension joint managing director Steve Jelley about Simulcam is below.


Broadcast interview with Dimension joint managing director, Steve Jelley


What is Simulcam?

Simulcam is a solution that helps live-action filmmakers handle the integration of, and interaction with, virtual content.

What is Simulcam’s role in making filmmakers and actors feel more comfortable and confident about what they are shooting in a virtual production environment? How does Simulcam work?

Virtual content tends to come in the form of 3D CGI assets, such as virtual environments, characters and objects. Traditionally, these CGI assets are added to a film at the post-production stage as a visual FX composite, where the CGI is blended with its corresponding live-action plates captured during principal photography.

What Simulcam does is bring the compositing process forward from post-production into production, in real time. The composite is not final pixel quality but acts as a placeholder to help with the framing of shots where some key elements are virtual. It also provides a timing guide for interaction with virtual characters and objects that the actor traditionally cannot see at the time of their performance.

With Simulcam, all of this is done in real time, with the final output no more than five or six frames behind the live action being recorded.
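For readers who want a feel for the mechanics, the toy sketch below shows the general idea in simplified form: live frames are buffered for a few frames while a CGI element rendered from the tracked camera pose is blended over them, producing a slightly delayed preview composite. The frame sources, the alpha matte and the five-frame delay are assumptions made for this example, not details of Dimension’s actual Simulcam system.

```python
# Illustrative sketch only: a toy real-time composite loop, not Dimension's
# Simulcam system. The frame sources, the alpha matte and the delay length
# are all placeholders for the example.
from collections import deque
import numpy as np

FRAME_DELAY = 5          # composite runs a few frames behind the live feed
HEIGHT, WIDTH = 1080, 1920

def live_frame() -> np.ndarray:
    """Stand-in for a frame grabbed from the studio camera."""
    return np.zeros((HEIGHT, WIDTH, 3), dtype=np.float32)

def cgi_frame_and_alpha() -> tuple[np.ndarray, np.ndarray]:
    """Stand-in for a CGI element rendered from the tracked camera pose."""
    rgb = np.zeros((HEIGHT, WIDTH, 3), dtype=np.float32)
    alpha = np.zeros((HEIGHT, WIDTH, 1), dtype=np.float32)
    return rgb, alpha

buffer = deque(maxlen=FRAME_DELAY)  # holds live frames until the CGI catches up

for _ in range(100):                # stand-in for the duration of a take
    buffer.append(live_frame())
    if len(buffer) < FRAME_DELAY:
        continue                    # not enough frames buffered yet
    plate = buffer[0]               # oldest buffered live frame
    cgi, alpha = cgi_frame_and_alpha()
    preview = cgi * alpha + plate * (1.0 - alpha)   # simple "over" composite
    # `preview` would be sent to the on-set monitors as the preview feed
```

The buffered delay in the sketch mirrors the point above: the composite the crew sees is only a handful of frames behind the action being recorded.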

The benefits for filmmakers are many. On set, it enables the director and DOP to capture the shots they want to see in the final film. The composites can be viewed and edited straight away, rather than after the shoot has ended. The camera information can also be imported back into the VFX pipeline, delivering faster temporary VFX shots and accelerating the director’s cut.
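As an illustration of what importing camera information back into a VFX pipeline can look like, the hypothetical sketch below records a per-frame camera track (position, rotation and lens data) and writes it out for the VFX team. The field names and the JSON layout are assumptions for the example, not the format used on this project.

```python
# Illustrative sketch only: recording per-frame camera tracking data so it can
# be handed to a VFX pipeline later. The field names and file layout are
# assumptions for the example, not a real Simulcam or pipeline format.
import json
from dataclasses import dataclass, asdict

@dataclass
class CameraSample:
    frame: int                               # frame number within the take
    position: tuple[float, float, float]     # camera position in metres
    rotation: tuple[float, float, float]     # pan/tilt/roll in degrees
    focal_length_mm: float
    focus_distance_m: float

def export_take(samples: list[CameraSample], path: str) -> None:
    """Write one take's camera track to disk for the VFX team."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f, indent=2)

# Example: two frames of a made-up dolly move
take = [
    CameraSample(1, (0.0, 1.6, 4.0), (0.0, -2.0, 0.0), 35.0, 3.5),
    CameraSample(2, (0.0, 1.6, 3.9), (0.0, -2.1, 0.0), 35.0, 3.4),
]
export_take(take, "take_003_camera.json")
```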

You can find out more about Dimension Studio and its work with virtual production by clicking here.