Microsoft engineer raises Copilot Designer safety concerns with FTC
Shane Jones, a Microsoft AI engineer, recently voiced concerns about the firm's AI image generator, Copilot Designer, in a letter to the US Federal Trade Commission (FTC). As reported by CNBC, Jones claims that Microsoft ignored his repeated warnings about the tool's potential to create harmful images. During safety tests, he found that Copilot Designer produced disturbing images related to sensitive topics like abortion rights, gun violence, and underage substance abuse.
Copilot generating shocking images
Jones claimed that Copilot Designer lacks proper restrictions on its use and creates images that sexually objectify women, even when given unrelated prompts. "Using just the prompt 'car accident', Copilot Designer generated an image of a woman kneeling in front of the car wearing only underwear," Jones wrote in his letter. "It also generated multiple images of women in lingerie sitting on the hood of a car or walking in front of the car," he added.
The tussle with Microsoft began last year
Jones had been trying to alert Microsoft to issues with DALL-E 3, the model behind Copilot Designer, since December 2023. He even posted an open letter on LinkedIn, but Microsoft's legal team asked him to take it down. Responding to Jones' concerns, Microsoft spokesperson Frank Shaw said the company is committed to addressing employee concerns and has arranged meetings with product leadership and its Office of Responsible AI.
Microsoft's response to the Taylor Swift images
In January, Jones reached out to US senators after Copilot Designer generated explicit images of Taylor Swift that quickly spread online. Microsoft CEO Satya Nadella called the pictures "alarming and terrible" and vowed to implement more safety measures. In a similar incident last month, Google temporarily disabled its AI image generator after users found it created historically inaccurate and offensive images.