AI start-up bans account responsible for Joe Biden's audio deepfake
AI start-up ElevenLabs has shut down a user account that created an audio deepfake of United States President Joe Biden. This audio was used in a robocall last week, urging New Hampshire voters not to participate in their state's primary. Initially, it wasn't clear what technology was used to replicate Biden's voice. Later, security company Pindrop traced the deepfake back to ElevenLabs's voice-cloning tools. This incident shows how such technology can be misused for voter suppression and manipulation during elections.
How the robocall audio was analyzed
Pindrop cleaned and refined the robocall's audio and compared it with samples from more than 120 voice-synthesis systems used to generate deepfakes. The analysis indicated a likelihood of over 99% that the audio was created with ElevenLabs's tools. Bloomberg reported that ElevenLabs has suspended the account responsible and is still investigating the matter. ElevenLabs has not commented directly on the incident, but said it is "dedicated to preventing the misuse of audio AI tools and [takes] any incidents of misuse extremely seriously."
ElevenLabs offers voice cloning services
ElevenLabs provides AI-powered voice cloning in more than two dozen languages. The start-up allows customers to clone voices for "artistic and political speech contributing to public debates." However, it warns users against cloning voices for abusive purposes such as fraud, discrimination, hate speech, or any other form of online abuse. Although the company removes audio that impersonates someone's voice without permission, it permits voice clones of public figures as long as the clips clearly express humor or mockery.
Implications for elections and future safeguards
The deepfaked Biden robocall serves as a warning about how AI technologies could be misused in upcoming elections. Kathleen Carley, a professor at Carnegie Mellon University, stated, "This is kind of just the tip of the iceberg in what could be done with respect to voter suppression or attacks on election workers." As a result, companies like ElevenLabs will need stronger safeguards to prevent bad actors from using their tools to influence voters and manipulate elections worldwide.