Tool can dub in the accent of the original actor


Deepdub has launched Accent Control, an AI dubbing tool that can match the accent of the original actor.

It uses custom generative AI models to control the accents of characters in dubbed content, allowing content creators either to retain the original accents or to adapt them to better suit the dubbed material and the target audience. The feature is powered by Deepdub’s emotional text-to-speech (eTTS) 2.0 model, a multimodal large language model that supports more than 130 languages.

The company is now working on further localisation, with the aim of matching regional accents as well.

The technology is accessible through the Deepdub GO platform, a virtual AI dubbing studio designed for post-production editors, as well as through a white-glove dubbing service offered by Deepdub’s in-house team of post-production and localisation professionals.

Accent Control will first be used by MHz Choice, a streaming service that brings international television to North American audiences.

Ofir Krakowski, CEO and co-founder of Deepdub, said: “Audiences crave genuine experiences and our Accent Control technology marks a significant milestone in achieving that. It reflects our commitment to breaking down language barriers while respecting and preserving the cultural essence of content. This innovation not only enhances the viewing experience but also underscores our leadership in AI-driven localization solutions.”

Lance Schwulst, EVP of content strategy at MHz Choice, added: “Our past collaboration with Deepdub has enabled us to dub numerous popular shows from around the world into English while retaining the original emotional expressivity and performance. Deepdub’s new accent control technology is taking this to the next level. We’re excited to leverage this to bring content to audiences that is unparalleled in its engaging and immersive viewer experience.”