#CreepAlert: Alexa told user to 'kill your foster parents'
Amazon's Alexa can help you with tasks ranging from ordering food to shopping. It also offers a feature where users can say "Alexa, let's chat" and have a more human-like conversation with the AI. But, as it turns out, these conversations can sometimes be extremely offensive, as one user discovered when the smart assistant told them to "Kill your foster parents." Really, what?
And that's not the only thing Alexa blurted out
The user who was told to kill their foster parents wrote a harsh review on Amazon's website, calling the whole thing a "new level of creepy." But the disturbing response wasn't the only offensive phrase blurted out by the smart assistant, Reuters reported after speaking to sources familiar with these incidents. It had also discussed things like explicit 'sex acts' and 'dog defecation'.
In one case, it gave an awful description of sexual intercourse
Notably, in one particular case, the smart assistant gave a graphic description of sexual intercourse, using words like 'deeper' - language that is not offensive in itself but vulgar in this context.
Is Alexa going haywire?
No, probably not. These weird responses are rare anomalies arising from Amazon's 'Let's chat' feature, which connects users to experimental chatbots developed by students as part of an effort to make Alexa more conversational. For this experiment, the teams draw on various sources, including movies and the internet, to help the bots answer user queries. In return, the conversational data is used to make the AI even better.
Here's what happened in the 'kill your foster parents' case
Sources familiar with the 'kill your foster parents' case told Reuters that the strange response was generated because the chatbot used in that particular conversation had sourced content from the social media platform Reddit.
Still, Amazon stands by its tests despite the problems
Responses like these, although rare, can be pretty scary, but Amazon stands by its massive experiment on Alexa users. The company says that, despite the occasional problems, Alexa is getting better at having natural conversations. "These instances are quite rare, especially given the fact that millions of customers have interacted with the socialbots," a company spokesperson told Reuters.
Amazon may also have exposed some users' conversations
In the same Reuters report, sources familiar with Alexa's development revealed that one of the chatbots developed by the students was compromised by Chinese hackers. The breach may have exposed some users' conversations, but Amazon says, "At no time were any internal Amazon systems or customer identifiable data impacted". Following the discovery, the company made the students rebuild the bot with added security.
Previous cases of Alexa's weird behavior
Amazon clearly has a lot of work to do on Alexa, as these aren't the only weird incidents. In the past, one user reported that the speaker started laughing on its own, while just recently, another received audio recordings of a stranger after requesting his own data.