The company achieved near-latency-free, real-time creature animation


Virtual production specialist Final Pixel has published the results of a research project to incorporate live-action body and facial motion capture for creature animation into virtual production workflows.

The snappily titled research – At the edge of the Metaverse: Live Body and Facial Motion Capture for LED Wall Virtual Production, with Rendering of High-Quality Digital Characters in Real-time – is available for free download from here.

The ‘industry-first’ research enabled Final Pixel to understand the limits of the software and workflow, so that future clients can incorporate detailed motion-capture digital characters into their virtual productions.


Final Pixel successfully streamed live facial and body motion capture to Unreal Engine and played it through Disguise, using cluster rendering to render a bespoke, high-quality 3D character built in a traditional CG pipeline with what it said was an extremely high level of detail.
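As a rough illustration of the kind of data flow involved (not Final Pixel's actual setup, which the paper describes around Unreal Engine's Live Link plugin and Disguise), the sketch below shows a hypothetical receiver that takes streamed motion-capture bone transforms over UDP, applies them to an in-memory rig and reports an end-to-end latency estimate. The port number and frame format are invented for illustration only.

```python
# Hypothetical sketch: receive streamed mocap frames over UDP and apply them
# to a simple in-memory rig. A real LED-wall pipeline would use Unreal Engine's
# Live Link plugin and Disguise rather than hand-rolled networking; the frame
# format and port below are assumptions for illustration.
import json
import socket
import time

MOCAP_PORT = 54321          # assumed port for the mocap stream (illustrative)
rig = {}                    # bone name -> (position, rotation quaternion)

def apply_frame(frame: dict) -> float:
    """Update the rig from one mocap frame and return estimated latency in seconds."""
    for bone in frame["bones"]:
        rig[bone["name"]] = (tuple(bone["pos"]), tuple(bone["rot"]))
    # Latency estimate: capture timestamp in the frame vs. time it is applied here.
    return time.time() - frame["timestamp"]

def listen() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", MOCAP_PORT))
    while True:
        packet, _ = sock.recvfrom(65535)
        latency = apply_frame(json.loads(packet))
        print(f"applied frame for {len(rig)} bones, latency {latency * 1000:.1f} ms")

if __name__ == "__main__":
    listen()
```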

The team was able to create real-time interactions between the characters in-camera with relatively little latency.


The project was shot at Digital Catapult’s Virtual Production Test Stage, a joint venture with Target3D.

Michael McKenna, CEO of Final Pixel, said: “As a company specialising in virtual production for film, TV and advertising, we are excited by the opportunities working in real-time game engines can provide for the creative process when everything can be captured in-camera while shooting live-action. The next evolution of this technology is to look at the elements which are still considered too heavy or complex to move out of the post-production workflow. Digital characters and creature work is a big area for this and also extremely important for storytelling narrative.”
