An explanation of how 360 photospheres work, their benefits and drawbacks, and their uses in virtual production


In the sixth and final part of the Virtual Production Innovation series, Dimension Studios explains how 360 photospheres work, focusing on their key benefit (reality can be projected onto the background without having to build it in a 3D engine), their main drawback (a photosphere is essentially a 2D image, so a perspective shift won’t be reflected in the background) and an interesting virtual production use case: creating city and mountain landscapes.

“Using photography means you are capturing images of the real world with all the complexity and frame of real lighting. So you get reality without having to build everything from scratch inside the 3D engine.”

The Virtual Production Innovation Series was led by Paul Franklin at DNEG. He said: “Photospheres and photographic panoramas are great because you get the detail and lighting taken directly from the real world but without the need for a complex – and computationally heavy – build. In a photosphere, you can capture a complex environment such as a cityscape in a quick and efficient manner without having to go to the trouble of building everything in the computer. 

“They are relatively lightweight to manipulate, which is great too. The downside of the approach is that, in the final analysis, they’re just 2D images, albeit mapped onto the inside of a 3D sphere, so there’s no parallax to be had within the image. As a consequence, you have to be careful with how you move things around within the realtime engine – you can’t really do much more than pan and tilt around the central nodal point from which the image was originally shot. If you move off that nodal point you’ll quickly start seeing unwanted distortions and warping within the image.

“My feeling is that photospheres are best suited to occasions where you want to recreate the real world, which I guess seems obvious given that the photography has come from the real world. You also have the option of doing additional paintwork/compositing with the image before it’s used on set, but you’ll need to do extensive testing to ensure that it’s doing what you expected.”
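The nodal-point limitation Franklin describes can be sketched in code. In an equirectangular photosphere, a pixel is looked up purely by view angle, so panning and tilting index different parts of the image, while translating the camera changes nothing at all, which is why there is no parallax. The function and resolutions below are illustrative, not any engine's actual API:

```python
import math

def view_to_equirect(yaw_deg, pitch_deg, width, height):
    """Map a camera view direction (pan/tilt about the nodal point)
    to a pixel in an equirectangular photosphere image.

    Note the lookup depends only on angles: a translation of the
    camera does not appear anywhere in this maths, so moving off
    the nodal point cannot reveal new perspective -- only warping.
    """
    yaw = math.radians(yaw_deg)      # pan: -180..180 degrees
    pitch = math.radians(pitch_deg)  # tilt: -90..90 degrees
    u = (yaw + math.pi) / (2 * math.pi)  # 0..1 left to right
    v = (math.pi / 2 - pitch) / math.pi  # 0..1 top to bottom
    return int(u * (width - 1)), int(v * (height - 1))

# Panning the virtual camera simply indexes a different column
# of the same flat image (hypothetical 8K-by-4K photosphere):
print(view_to_equirect(0, 0, 8192, 4096))   # looking straight ahead
print(view_to_equirect(90, 0, 8192, 4096))  # panned 90 degrees right
```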



Broadcast teamed up with virtual production and volumetric capture specialist Dimension to present this six-part series of short films providing insight into shooting in LED Volume stages.

The series is focused on the ‘Virtual Production Innovation Project’, which was designed to explore the state-of-the-art in virtual production, testing different workflows and scenarios.

The project was conceived and produced by Dimension, DNEG, and Sky Studios, and filmed at Arri’s Uxbridge studio with support from partners Creative Technology and NEP Group. The aim is to share insights and expertise to the creative community, while helping upskill the next generation of creative talent.

The sixth episode in the series is below:

Featured in the video:

Paul Franklin, director, DNEG


The series of six films, each exploring different lessons from the project, will be released weekly throughout March and April, exclusively on Broadcastnow.

Each episode discusses the tests the team has undertaken and demonstrates some of the advantages of virtual production in capturing in-camera VFX and final pixel shots using an LED volume.

Alongside the film, we present an exclusive interview with one of the key members of the production team, talking about the insights offered in that week’s episode and the lessons they learned during the making of the Virtual Production Innovation Project films.

The series will cover lighting, tracking, simulcam, in-camera effects, scanning, and 360 photospheres. Broadcast’s interview with Steve Griffith, exec producer at DNEG, about 360-degree photospheres is below.

Broadcast interview with DNEG exec producer, Steve Griffith


What is a 360-degree photosphere?

It is a full panoramic photographic capture of an environment: a photographic map of a particular position or location at a given time of day. Like any photograph, photospheres vary in resolution, quality, dynamic range and so on.

How do you capture them, and make them work in a virtual production stage?

For virtual production stages, we need them at high resolution and dynamic range. On an LED volume stage, we are mapping the image to the LED wall, much like a screensaver or desktop background image. The wall has a predefined image resolution, and we want our source content to match that as closely as possible.
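A rough back-of-envelope check makes the resolution-matching point concrete: if the wall spans only part of the 360-degree image, the full equirectangular source needs proportionally more pixels to match the wall's density. The figures below are hypothetical, not a studio specification:

```python
def source_width_needed(wall_px_width, wall_fov_deg):
    """Minimum equirectangular source width so the portion shown on
    the wall is not upscaled. Illustrative sketch: a 360-degree image
    must carry the wall's pixel density across its whole width."""
    return int(round(wall_px_width * 360.0 / wall_fov_deg))

# e.g. a hypothetical 7,680-pixel-wide wall wrapping 160 degrees
# around the stage needs a source over 17,000 pixels wide:
print(source_width_needed(7680, 160))
```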

Some cameras have a wide enough lens to capture full panoramic content, but there are often fidelity issues. We will often custom stitch images together to create our photosphere content.
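The core bookkeeping behind custom stitching can be sketched simply: for each pixel of the target photosphere, work out which way it looks and which source photo covers that direction. The rig layout and names below are invented for illustration; real stitchers also blend overlaps and correct lens distortion:

```python
def equirect_to_yaw(u):
    """Horizontal view angle (degrees, -180..180) for a normalised
    equirectangular x-coordinate u in 0..1."""
    return u * 360.0 - 180.0

def covering_camera(u, cameras):
    """Return the first source camera whose horizontal field of view
    contains this column's direction. `cameras` is a list of
    (name, centre_yaw_deg, hfov_deg) tuples -- all values hypothetical."""
    yaw = equirect_to_yaw(u)
    for name, centre, hfov in cameras:
        delta = (yaw - centre + 180.0) % 360.0 - 180.0  # wrap to -180..180
        if abs(delta) <= hfov / 2:
            return name
    return None

# A hypothetical three-camera rig, each lens covering 120 degrees:
rig = [("cam_front", 0, 120), ("cam_left", -120, 120), ("cam_right", 120, 120)]
print(covering_camera(0.5, rig))  # centre column comes from the front camera
```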

You can find out more about Dimension Studio and its work with virtual production by clicking here.