Advances in rendering technology pioneered by photorealistic video games could add a new dimension to TV VFX and animation.

Computer games are becoming increasingly sophisticated as the technology that runs them becomes exponentially more powerful with every new generation of console or PC.

The result is hyper-real vehicles and environments, convincing physical effects and eerily lifelike humanoid characters - all rendered in real time as you play the game. Such advances in computer game effects inevitably raise the bar for VFX on TV, so, faced with time constraints and ever-shrinking budgets, some TV producers are looking closely at the techniques and technology used in game production.

The demands of the platform and the interactive nature of games mean that TV and games have traditionally adopted very different approaches to developing CG animation. The biggest difference is that games are built around dedicated software “engines”, which provide the control and interactivity at the heart of the game and render graphics in real time. TV VFX, by contrast, usually consists of pre-rendered animation composited into live-action footage.

Patrick Jocelyn, EMEA director for Media & Entertainment at Autodesk, says TV companies have already been investing in technology taken directly from games, but with mixed results. “TV uses real-time graphics in the form of virtual sets and some low-cost children's programmes use game engines to drive the graphics,” he says. “They use real-time graphic systems in order to quickly put a story together ready for playout. However, the quality is very poor compared with the properly post-produced and rendered output that you see in TV visual effects or commercials.”

Jocelyn does foresee a time when real-time game engine technology will evolve into something more complex. For example, he describes how hitting the red button during a programme could, instead of bringing up interactive features, drop viewers into a fully rendered game based on the show they are watching.

Embracing technology
Established animation companies are looking to raise the stakes in this area by integrating game engine technology into the production process. “One of our goals has always been the production of real-time animation at HD resolutions,” says Neil Marsden, technical director at Hibbert Ralph Animation & TV. “Our current game engine of choice is Epic's Unreal Engine 3, which combines stunning visual quality with a comprehensive toolset. We're also using high-end Nvidia gaming cards such as the 8800GTX and 9800GTX, which we've recently integrated along with a PlayStation 3 into our HD video production systems. This allows us to watch full 1080p output from the PS3 or our PC gaming workstations on our HD production monitors and play HD footage into our editing systems.”

Marsden says creating animation from existing game content should be relatively straightforward, as the production work has already been done for the game. He cites Digital Domain's 60-second cinema spot for the Epic Games title Gears of War, which combined in-game footage with the Tears for Fears song Mad World, as a good example. Since then, several games ads have used elements from the games themselves, such as Axis Animation's spot for Codemasters' Race Driver GRID.

However, Marsden says creating totally new content for a TV series or film production using game engine technology is much more challenging. “Our experience to date has shown that producing animation for TV using game engine technology requires a very different production pipeline from that used in traditional animated TV series and film production,” he explains. “In some respects it is more akin to working in live action. Game engines also provide an opportunity to consider radically different production methods.”

Marsden highlights applications such as digital puppetry and the use of Unreal Engine 3 in the production of LazyTown [a CGI-heavy Icelandic series shown on CBeebies] as examples of the increasing importance of game-based technology to different areas of TV production.

CGI challenges
Hayden Jones, VFX supervisor at Rushes, says the technical challenge of creating CGI can be approached from different angles. “The film industry wants to create the most realistic visuals and it spends a great deal of time, money and processing power on achieving this,” he says. “The games industry has the limitation of needing to ensure it renders images at 30fps [while TV VFX uses pre-rendered and composited animation]. Virtually all the techniques devised by the games industry are short cuts or clever approximations of more complex algorithms. These are designed to keep the visual quality high while allowing the effects to be rendered in real time.”

Such techniques include ambient occlusion, which gives characters and virtual environments photo-realistic lighting by approximating the results of much more complex and time-consuming rendering, and normal mapping, which applies detailed pre-rendered textures to simpler in-game models to give the appearance of high detail without the corresponding demand on processing power. Techniques like these have been developed by research scientists at companies such as Autodesk and in the R&D departments of big VFX players like Industrial Light & Magic, but they have come to the fore through their extensive use in games.
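By way of illustration, here is a minimal Python sketch of the normal-mapping idea: detail from a high-resolution surface is baked into a texture of surface normals, which is then used to light a flat, low-polygon surface per pixel. The texture values, light direction and function names are invented for this example; a real engine does this work on the GPU in shader code rather than on the CPU.

    # A minimal sketch of normal mapping (illustrative, not production code):
    # detail baked into a normal map lights a flat surface per pixel.
    import numpy as np

    def decode_normal_map(normal_map_rgb):
        """Convert an 8-bit RGB normal map (0-255) into unit normal vectors."""
        normals = normal_map_rgb.astype(np.float32) / 255.0 * 2.0 - 1.0
        lengths = np.linalg.norm(normals, axis=-1, keepdims=True)
        return normals / np.maximum(lengths, 1e-6)

    def shade(albedo, normal_map_rgb, light_dir, ambient=0.15):
        """Per-pixel diffuse lighting using normals from the map, not the flat surface."""
        normals = decode_normal_map(normal_map_rgb)
        light_dir = np.asarray(light_dir, dtype=np.float32)
        light_dir = light_dir / np.linalg.norm(light_dir)
        diffuse = np.clip(normals @ light_dir, 0.0, 1.0)  # Lambertian term per pixel
        return albedo * (ambient + (1.0 - ambient) * diffuse[..., None])

    # Illustrative 2x2 texture: a flat surface whose normal map fakes bumpy detail.
    albedo = np.full((2, 2, 3), 0.8, dtype=np.float32)
    bumpy_normals = np.array([[[128, 128, 255], [200, 128, 180]],
                              [[128, 200, 180], [128, 128, 255]]], dtype=np.uint8)
    print(shade(albedo, bumpy_normals, light_dir=[0.3, 0.3, 1.0]))

The pay-off is the one Jones describes: the expensive geometric detail is paid for once, when the texture is baked, while the per-frame cost is reduced to a cheap per-pixel calculation that can run in real time.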

Animators play the game
“I think animation studios are learning a lot from games studios in terms of efficiency, especially how you can achieve certain effects without excessive render times,” says Will Adams, of Glasgow-based animators Once Were Farmers.

“At the moment we're working on a series with three animated characters set against live-action backgrounds. The turnaround from the shoot to the edit is very short, so we had to start animating before anything was filmed or the set was even built! This makes no sense in VFX terms, so we had to approach it a bit more like a computer game, where we build a library of interlinking animations for each character so that longer animations can be created very quickly from the constituent parts.”
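A minimal sketch of the approach Adams describes might look like the following, where each pre-built clip records the pose it starts and ends on, so longer performances can be assembled by chaining compatible clips. The clip names, poses and the build_sequence helper are invented for illustration and are not Once Were Farmers' actual pipeline.

    # Illustrative sketch: assemble longer animations from a library of
    # interlinking clips, chained wherever end and start poses match.
    from dataclasses import dataclass

    @dataclass
    class Clip:
        name: str
        start_pose: str   # pose the character must be in when the clip begins
        end_pose: str     # pose the character is left in afterwards
        frames: int

    LIBRARY = [
        Clip("idle_to_walk", "idle", "walk", 12),
        Clip("walk_loop", "walk", "walk", 24),
        Clip("walk_to_wave", "walk", "wave", 18),
        Clip("wave_to_idle", "wave", "idle", 16),
    ]

    def build_sequence(pose, wanted):
        """Chain clips so each one starts where the previous one ended."""
        sequence = []
        for clip_name in wanted:
            clip = next(c for c in LIBRARY if c.name == clip_name)
            if clip.start_pose != pose:
                raise ValueError(f"{clip.name} cannot follow a '{pose}' pose")
            sequence.append(clip)
            pose = clip.end_pose
        return sequence

    shots = build_sequence("idle", ["idle_to_walk", "walk_loop", "walk_to_wave", "wave_to_idle"])
    print(sum(c.frames for c in shots), "frames assembled from", len(shots), "clips")

The design choice mirrors game development: the cost of animating is front-loaded into the library, after which new shots can be assembled quickly even when the edit changes late in the schedule.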

Hayden Jones believes that the high-quality effects now commonplace in games raise the standard for VFX houses. “People are now more visually savvy and will instantly notice low-quality TV work,” he says. “The answer is to use visual effects in a controlled manner - to create the best work, you need time and money. Film and game companies have the luxury of long production schedules, allowing them to experiment with techniques and fine-tune their pipeline. The turnaround for visual effects in broadcast TV is considerably shorter, giving the artists very little time to create excellent work.”

So which technologies found in gaming and elsewhere are finding their way into TV VFX?

Motion-capture has been used extensively both in games and in the character effects seen in recent movies such as Pirates of the Caribbean and King Kong, and it is increasingly being used in TV VFX. According to Nick Bolton, chief executive of motion-capture specialist Oxford Metrics Group, there are numerous examples of this. “In 1999 a children's programme called Starship Troopers was created completely with motion capture,” he says. “There are several children's TV series in production right now using motion capture - there is an emphasis on realism and fast production times, so motion capture is brought in. The explosion in games has brought more experienced motion capture personnel into the general VFX market, so once they have finished a production or game they can move on to a TV series, game or movie with the generic assets of motion capture available to them.”

Photorealism has made big advances in gaming - and that, in turn, raises the bar for TV VFX. Such progress in rendering technology is also plain to see in car advertising. Most of the car adverts on hoardings and in magazines do not use photographs but 3D models rendered at ultra-high quality by specialist companies such as Burrows, which uses banks of computers powerful enough to render still images and 3D animation in convincing detail.

As computers become more powerful, Autodesk's Patrick Jocelyn is confident we will see such detail used in moving images on TV, especially as Burrows uses the same Autodesk 3D software (such as Maya and 3ds Max) available to games companies and TV post-production alike. “We've got the technology, we're just waiting for the hardware performance to increase so we can use it,” he says.

Key drivers
It's a view shared by Courtney Vanderslice, head of production at VFX studio Cinesite. “In the past, the cinema has certainly been a key driver for TV effects, but, more recently, multimedia platforms, including the internet and gaming, have become very strong influences,” she says. “Another big driver is the decreasing cost of the technology used in effects, allowing much wider access.”

Cinesite's next big TV project, for which it will create all the 3D and 2D effects, is the forthcoming HBO mini-series Generation Kill. Expected to be shown on the BBC this year, it follows a US Marine battalion during the invasion of Iraq in 2003. The visual effects are intended to recreate the epic scale of battle with the utmost realism.

“On Generation Kill, we are creating photo-real, close-up tanks, helicopters and military vehicles,” explains Vanderslice. “These are effects that are common in computer games and for which the technology has evolved greatly over recent years. They would certainly have been more time-consuming, and therefore more costly, to create five years ago.”

There is certainly an increasing overlap between games production and TV and film production, which, according to Marsden, provides an opportunity to stimulate development rather than stifle it.

“This relationship between games companies and the TV/film and post sector will become increasingly important if the games companies want to exploit their intellectual property in other market sectors,” he adds.