The VFX and grading system now comes with a machine learning-powered feature


Autodesk is set to showcase the latest update to Flame, which adds a new machine learning-powered feature, at NAB this week.

The Flame 2020 release adds machine learning analysis algorithms to isolate and modify common objects in moving footage, helping to speed up VFX and compositing workflows.

“Machine learning has enormous potential for content creators, particularly in the areas of compositing and image manipulation where AI can be used to track and isolate objects in a scene to pull rough mattes quickly,” said Steve McNeill, director of Flame Family Products at Autodesk Media and Entertainment.


Flame 2020 highlights include:

· Z Depth Map Generator: uses machine learning analysis to extract a Z-depth map from live-action footage, reclaiming scene depth. This allows artists doing colour grading or look development to quickly analyse a shot and apply effects accurately based on distance from the camera (a depth-based grading sketch follows this list).

· Human Face Normal Map Generator: since all human faces share common, recognisable features (such as the relative distance between the eyes and the positions of the nose and mouth), machine learning algorithms can be trained to find these patterns. This tool can be used to simplify accurate colour adjustment, relighting and digital cosmetic/beauty retouching (a normal-map relighting sketch follows this list).

· Refraction: with this feature, a 3D object can refract light, distorting objects behind it according to its surface material characteristics. To achieve convincing transparency through glass, ice, windshields and more, the index of refraction can be set to an accurate approximation of how the real-world material bends light (a Snell's-law refraction sketch follows this list).
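To make the depth-driven idea concrete, here is a minimal sketch of how a Z-depth map can drive a grade, written in NumPy rather than Flame. It is not Autodesk's implementation: the depth_weighted_desaturate function, the near/far range and the desaturation look are hypothetical, and the sketch assumes a depth map has already been extracted for the frame.

```python
import numpy as np

def depth_weighted_desaturate(image, depth, near=1.0, far=20.0):
    """Desaturate an image progressively with distance from camera.

    image: float32 RGB array in [0, 1], shape (H, W, 3)
    depth: float32 Z-depth in scene units, shape (H, W)
    near/far: depths mapped to 0% and 100% effect strength (illustrative values)
    """
    # Normalise depth into a 0..1 weight: 0 near the camera, 1 at the far plane.
    weight = np.clip((depth - near) / (far - near), 0.0, 1.0)[..., None]

    # Luminance (Rec. 709 weights) used as the fully desaturated target.
    luma = image @ np.array([0.2126, 0.7152, 0.0722], dtype=image.dtype)
    grey = np.repeat(luma[..., None], 3, axis=-1)

    # Blend towards grey as depth increases.
    return image * (1.0 - weight) + grey * weight

# Example: a synthetic 4x4 frame with a depth ramp from 1 to 20 scene units.
rng = np.random.default_rng(0)
frame = rng.random((4, 4, 3), dtype=np.float32)
depth = np.linspace(1.0, 20.0, 16, dtype=np.float32).reshape(4, 4)
graded = depth_weighted_desaturate(frame, depth)
```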
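In the same spirit, a per-pixel normal map can drive simple relighting. The sketch below is not Flame's code: it assumes a face normal map encoded in the common 0-1 RGB convention and applies basic Lambertian shading, with the relight_lambert function and the light direction chosen purely for illustration.

```python
import numpy as np

def relight_lambert(albedo, normal_map, light_dir):
    """Simple Lambertian relighting driven by a per-pixel normal map.

    albedo: float32 RGB array in [0, 1], shape (H, W, 3)
    normal_map: float32 RGB array in [0, 1] encoding XYZ normals as 0.5 * (n + 1)
    light_dir: 3-vector pointing from the surface towards the light
    """
    # Decode normals from the usual [0, 1] image range back to [-1, 1] and normalise.
    n = normal_map * 2.0 - 1.0
    n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8

    l = np.asarray(light_dir, dtype=np.float32)
    l /= np.linalg.norm(l)

    # Lambert's cosine law: brightness proportional to max(0, N . L).
    shade = np.clip(n @ l, 0.0, None)[..., None]
    return albedo * shade

# Example: relight a flat grey patch whose normals all face the camera (+Z).
albedo = np.full((2, 2, 3), 0.5, dtype=np.float32)
normals = np.full((2, 2, 3), [0.5, 0.5, 1.0], dtype=np.float32)
lit = relight_lambert(albedo, normals, light_dir=(0.0, 0.0, 1.0))
```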
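The refraction behaviour described above follows Snell's law, which the following sketch applies to a single ray. The refract helper, the use of NumPy and the example index of refraction of 1.5 for glass are illustrative assumptions, not Flame's internal code.

```python
import numpy as np

def refract(incident, normal, eta):
    """Refract a unit incident ray about a unit surface normal (Snell's law).

    incident: unit direction of the incoming ray
    normal:   unit surface normal, pointing against the incident ray
    eta:      ratio of indices of refraction n1 / n2 (e.g. air to glass ~ 1.0 / 1.5)

    Returns the refracted direction, or None on total internal reflection.
    """
    i = np.asarray(incident, dtype=np.float64)
    n = np.asarray(normal, dtype=np.float64)
    cos_i = -np.dot(n, i)
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection: no transmitted ray
    return eta * i + (eta * cos_i - np.sqrt(k)) * n

# Example: a ray hitting a glass surface (index of refraction ~1.5) at 45 degrees.
incident = np.array([np.sqrt(0.5), -np.sqrt(0.5), 0.0])
normal = np.array([0.0, 1.0, 0.0])
bent = refract(incident, normal, 1.0 / 1.5)  # bends towards the normal, ~28 degrees
```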