TikTok's ad tool allows AI actors to recite unmoderated content
TikTok inadvertently posted a link to an internal version of its new AI digital avatar tool, Symphony Digital Avatars, allowing users to produce videos in which AI avatars say virtually anything. CNN first spotted the mistake, which let the outlet generate videos with inappropriate content, including quotes from Hitler and messages encouraging people to drink bleach. TikTok has since removed this version of the tool, while the official version remains available for use.
Symphony Digital Avatars tool: Purpose and misuse
The Symphony Digital Avatars tool launched earlier this week to let businesses create ads using AI-powered dubbing technology: advertisers input a script, and AI avatars recite it, provided the content stays within TikTok's guidelines. The tool was meant only for users with a TikTok Ads Manager account, but a technical error made the internal version accessible to anyone with a personal account. This led to the generation of videos containing harmful messages and inappropriate quotes.
TikTok rectifies error and addresses potential misuse
TikTok spokesperson Laura Perez confirmed that the company has corrected the "technical error" which "allowed an extremely small number of users to create content using an internal testing version of the tool for a few days." As reported by CNN, the internal tool also let the outlet generate videos reciting Osama bin Laden's "Letter to America," a white supremacist slogan, and a message telling people to vote on the wrong day.
Concerns raised over misuse of digital avatar tools
Despite TikTok's swift removal of the internal version of its tool, the incident has raised questions about the potential misuse of digital avatar creators. Videos generated with the internal tool carried no watermark indicating they were AI-generated. While CNN did not post the videos it created, Perez stated that if such videos had been posted, they "would have been rejected for violating our policies."