Inside British Parliament, Siri 'interrupts' MP, announces results on Syria
The task of digital assistants like Siri, Alexa, and Google Assistant is to make our lives easier. But in this case, a British minister was left embarrassed when Apple's Siri rudely interrupted his speech in Parliament. Defence Secretary Gavin Williamson was updating his colleagues on the battle against the Islamic State (ISIS) when Siri ran an online search on Syria and announced the results.
The video of Williamson's speech is now viral
Siri may have picked up Syria from minister's speech
Just as Williamson was speaking about ousting the terror group ISIS from Iraq and Syria, the phone in his pocket announced, "I found something on the web for 'Syrian democratic forces supported by premonition'." Siri may have been triggered after the minister said "Syrian democratic forces, supported by coalition air power" in his address. He was visibly embarrassed.
Have been heckled by my phone, Williamson lightens mood later
As he switched his phone off and put it away, Williamson said, "It is very rare that you're heckled by your own mobile phone. On this occasion, it is a new parliamentary convention, without a doubt." He then sought permission from the Speaker to continue his speech. While he thanked the forces for their fight against terror, Williamson cautioned that the fight against Daesh is not over.
In speech, Williamson outlines terrorist threats to the UK
"We are entering a new phase, as the terrorists change their approach, disperse and prepare for a potential insurgency. Daesh remains the most significant terrorist threat to the United Kingdom due to their ability to inspire, direct and enable attacks on our interests," claimed Williamson.
Earlier, Alexa freaked out a user in San Francisco
The Williamson incident is not an isolated one. Earlier, a San Francisco user, Shawn Kinnear, reported that the Alexa device in his living room said, "Every time I close my eyes, all I see is people dying." This understandably horrified him, but security expert Chris Boyd said Alexa may have recorded the words earlier and played them back at the "worst possible time".