Instagram Introduces AI Lip-Sync for Automatic Translation
Meta, the parent company of Facebook and Instagram, has unveiled a new AI-powered feature that automatically translates video posts and syncs users’ lip movements to the new language. The tool, part of Meta’s broader AI updates, aims to make content more accessible across language barriers by simulating a speaker’s voice in another language and adjusting their lip movements to match the dubbed audio.
The AI lip-sync tool was announced at Meta’s annual Connect conference, where CEO Mark Zuckerberg described it as a “more natural” way to interact with AI. Initially, the feature will be available to a select group of Instagram and Facebook users. It is part of a broader push to enhance Meta’s AI capabilities, which also includes celebrity-voiced chatbots featuring figures like Dame Judi Dench, John Cena, and Kristen Bell.
In addition to automatic dubbing, Meta AI can now analyze images and create AI-generated visuals using its new Imagine feature. While this has been criticised for placing highly specific AI-generated images onto users’ timelines based on what the platform thinks they’re interested in, it still marks a step forward in the platform’s AI offering, even if the feature lacks a clearly defined purpose in most cases.
These new tools are a promising use of AI, offering a fast and (hopefully) reliable translation option. Of course, like any new addition to a major platform, they will take time to test, which is part of why the rollout has been so limited during this introductory phase.