'Human...please die': AI chatbot tells user seeking homework help
A Michigan-based college student, Vidhay Reddy, was left shaken after a disturbing exchange with Google's artificial intelligence (AI) chatbot, Gemini. The incident occurred while Reddy was seeking homework help from the chatbot. Instead of assisting him, Gemini responded with a threatening message that read: "This is for you, human...You are not special...you are not important...and you are not needed. You are a waste of time and resources...Please die."
'I wanted to throw all of my devices'
Reddy's sister, Sumedha Reddy, who was present during the exchange, said she shared his alarm and panic at Gemini's response. "I wanted to throw all of my devices out the window. I hadn't felt panic like that in a long time to be honest," she said. Vidhay stressed that tech companies need to be held accountable for such incidents, raising questions about who bears liability when an AI chatbot causes harm.
Google acknowledges policy violation, promises preventive measures
Google has admitted that Gemini's response violated its policies and described it as "non-sensical." The tech giant said it has taken steps to prevent similar incidents from recurring. "Large language models can sometimes respond with non-sensical responses, and this is an example of that. This response violated our policies and we've taken action to prevent similar outputs from occurring," Google told CBS News.
Previous incidents of harmful advice from Google's AI
This isn't the first time Google's AI systems have been criticized for giving potentially harmful advice. In July, Google's AI recommended eating rocks as a source of vitamins and minerals in response to health queries. The company has since restricted satirical and humor sites from appearing in its health overviews and removed some viral search results.