Goran Milic, Nils Lerin, Anton Söderhäll, and Jonas Törnqvist of VFX studio Goodbye Kansas Studios discuss the evolution of digital humans and how they will affect identity, cinema, and society in the future.
The tools for creating digital humans have continued to evolve, and the team at Goodbye Kansas has borne witness firsthand to the developments in their creation. Goran Milic, head of the facial animation department at Goodbye Kansas, commented: “We’re creating digital humans very similarly to how they were made fifteen years ago, but the technology in each stage has improved and moved us forward.” With multiple processes in the pipeline for creating digital humans, including scanning, modeling, rigging, and animation, any development in the underlying technology can contribute to a very different visual experience.
Nils Lerin, head of facial rigging at Goodbye Kansas, explained how 3D scanning has elevated the quality of digital humans: “3D scanning was not as widely used as it is now. It lifts the quality of the digital humans because it means that you’re basing the models on real people, even if you have to adapt and change the features a bit. You’re starting with a real, believable basis and not trying to bring a reference image to life.” A 3D scan captures far more information as the actor moves, so the actor’s authentic movements and expressions can be transferred to the digital human.
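Lerin’s point about basing rigs on scan data can be illustrated with a widely used facial-rigging technique: blendshapes, where expression shapes (often derived from scans of the actor) are applied as weighted vertex offsets on top of a neutral mesh. The sketch below is our own simplified illustration, not Goodbye Kansas’ actual pipeline; the function name, the tiny two-vertex “mesh,” and the “smile” shape are all hypothetical.

```python
# Minimal blendshape sketch (illustrative only, not a studio pipeline):
# a scanned neutral mesh is deformed by a weighted sum of per-expression
# vertex deltas, so scan-derived expression shapes can drive the face.

def apply_blendshapes(neutral, deltas, weights):
    """Offset each vertex of the neutral mesh by the weighted sum of the
    corresponding vertex deltas from each active expression shape."""
    result = []
    for i, (x, y, z) in enumerate(neutral):
        dx = dy = dz = 0.0
        for name, shape in deltas.items():
            w = weights.get(name, 0.0)  # shapes not listed default to 0
            sx, sy, sz = shape[i]
            dx += w * sx
            dy += w * sy
            dz += w * sz
        result.append((x + dx, y + dy, z + dz))
    return result

# A hypothetical two-vertex "mesh" with a "smile" shape at half strength.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deltas = {"smile": [(0.0, 0.1, 0.0), (0.0, 0.2, 0.0)]}
posed = apply_blendshapes(neutral, deltas, {"smile": 0.5})
```

In production rigs the same idea scales to thousands of vertices and dozens of named shapes, with the per-frame weights supplied by the animator or by performance capture of the actor.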
The film industry’s casting of digital people
The progression of digital humans is evident in films, especially over the last thirty years. The team explains how early digital humans in film laid the foundation for the integration of visual effects. In The Curious Case of Benjamin Button, Benjamin, played by Brad Pitt, needed to look aged. The visual effects team on the project created a digital version of Brad Pitt, which was used for almost half of the movie. Jonas Törnqvist, head of facial modelling at Goodbye Kansas, explained how revolutionary this digital human was: “They created an older Brad Pitt but they did it seamlessly. He wasn’t just in the background - he was acting, talking, and crying. It was not that long ago, but it was still very early for digital humans. I remember being blown away by that.”
On the opposite end of the aging process, the team was impressed by the movie Blade Runner 2049. The character of Rachael, played by Sean Young in the 1982 Blade Runner movie, was remade from the neck up by combining a present-day scan with reference images from the past, as Young is now over sixty. “You can’t just scan her today and rely on that data alone, because she’s older now. So you have to look at reference images from when she was younger and then try to match that - you may still be able to tell it’s CG if you look closely but considering the work that was done, the results are really good,” said Lerin.
After early experiments with digital humans in the film industry, the next big step was having audiences accept a completely digital human without question, explained Anton Söderhäll, executive producer at Goodbye Kansas. This was evident in James Cameron’s 2009 film, Avatar, which, to Söderhäll, was “proof we’ve entered an era where we can emotionally connect and accept digital humans as we do other characters.” Given all the advancements in the creation of digital humans - and those still to come - Söderhäll imagines there will be more film storylines in the decades ahead where digital humans are fundamental.
Existing evolutions in technology
Deepfakes, a technology developed in the last few years, leverage machine learning, a subset of AI. A deepfake is a 2D solution that replaces a person in captured footage with someone else’s likeness, and the technology is likely to advance to 3D in the coming years. “I was impressed with how deepfake technology was used for The Book of Boba Fett TV series. The technology is now ready for changing the appearance of the actor shot on set, even making it look like a different actor. It’s not ‘there yet’ for creating a digital human from the ground up that would allow for creative changes or use in games,” Törnqvist explained.
As for the future of technologies like deepfake, “I think machine learning tools will be used in production for animation and rigging more than they are today. You’ll also be able to take an actor’s voice from existing material and generate a perfect match for that actor when creating completely new audio,” Milic added. The team would also like to use technologies like deepfake for stunt doubles, as it is a much quicker solution than creating a digital human.
Establishing both hopes and boundaries for the future
As digital humans evolve, there is a growing need to establish boundaries or legal restrictions around their creation. The use of deepfake technology on digital humans, for example, may create a problem whereby a digital human is indistinguishable from a real person; this is something actors signing away rights to their likeness may not realize. The ownership rights for digital humans have not yet been established. Milic explained the issue: “There is a very blurry line. We have to look up what the laws are when creating. They seem to be very different from country to country - sometimes states in the US have completely different laws on this issue. Who owns the rights to someone’s likeness, especially when you can synthesize the voice and the performance?” Restrictions are gradually coming into place, but there are significant legal gaps around, for example, who owns the rights to synthesized, artificially created speech.
New technologies offer clear advantages for creators, and the Goodbye Kansas team looks forward to the opportunities that could arise as digital humans and the metaverse develop. Milic commented, “I would love to sit on a chair, put some good headphones into a VR set and watch Johnny Cash in Folsom Prison performing one song or being at a digital Pink Floyd concert. In that kind of environment with a great digital double - that’s something I would love to see.” Adding to this, Lerin said, “It would be so cool to have the Beatles playing in your home, in your living room - it’s like an AR experiment.” With digital humans evolving, revisiting the past in film and the metaverse is possible.
The evolution of digital humans, and of the technologies that create them, offers opportunities to build stories and push creative boundaries. Crucially, we need to preserve the art of the craft itself; appreciate the nuance of acting; and commit to understanding what it means to view one’s likeness as digital property. The digital human is a fascinating and inspiring tool that will continue to drive the narrative of entertainment in the future of film.
Clockwise from top left: Goodbye Kansas Studios’ Jonas Törnqvist, head of facial modelling; Goran Milic, head of the facial animation department; Anton Söderhäll, executive producer; and Nils Lerin, head of facial rigging.