Smart speakers listen in on users up to 19 times per day after mishearing random words from TVs

A new study found that smart speakers turn on up to 19 times per day, for as long as 43 seconds each time, triggered by words misheard from people speaking in the same room or from televisions. 

That means speakers using popular virtual assistants like Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and the unnamed digital helper on Google Home and Nest devices are listening in on unknowing users. 

The Northeastern University study discovered that smart speakers were triggered by misheard words, and delivered often bizarre responses, anywhere from about two to 19 times per day. 

Apple’s HomePod and the Google Home Mini smart speakers were included in a Northeastern University study that found such devices randomly switch on and record people when they mishear words, sometimes words uttered in television programs playing in the room

Amazon’s Echo devices also were included in the Northeastern University study, which found smart speakers randomly start recording people up to 19 times per day

A Northeastern University study of smart speakers, including Microsoft’s Harman Kardon Invoke speaker, found such devices could randomly switch on and record people for up to 43 seconds at a time after misunderstanding wake commands

The most easily confused, according to the February 2020 study, was Siri, which responds on Apple’s HomePod.

While the device is designed to respond to ‘Hey Siri,’ the report says that on the Apple HomePod ‘activations occurred with words that rhymed or sounded similar’ to the term.

Examples include ‘He clearly’, ‘They very’, ‘Hey sorry’, ‘Okay, Yeah’, ‘And seriously’, ‘Hi Mrs’, ‘Faith’s funeral’, ‘Historians’, ‘I see’, ‘I’m sorry’ and ‘They say’.  

A more alarming finding in the report is that the speakers turned on randomly and stayed on for 20 to 43 seconds.

That means a smart speaker could be listening in on users for the better part of a minute. 

‘Voice assistants such as Amazon’s Alexa, OK Google, Apple’s Siri, and Microsoft’s Cortana are becoming increasingly pervasive in our homes, offices, and public spaces,’ the report says. 


Apple’s Siri (left) and Amazon’s Alexa were among the virtual assistants included in a Northeastern University study that found them listening in on unknowing users


Google Home and Microsoft’s Cortana also were included in a Northeastern University study that found digital assistants eavesdropping after they misheard words they thought were wake commands

‘While convenient, these systems also raise important privacy concerns—namely, what exactly are these systems recording from their surroundings, and does that include sensitive and personal conversations that were never meant to be shared with companies or their contractors?’

When Apple’s HomePod hears these words, it sometimes thinks it’s hearing ‘Hey Siri’

‘He clearly’ 

‘They very’ 

‘Hey sorry’ 

‘Okay, Yeah’

‘And seriously’ 

‘Hi Mrs’

‘Faith’s funeral’

‘Historians’

‘I see’

‘I’m sorry’

‘They say’

The report notes that these ‘aren’t just hypothetical concerns from paranoid users.’

‘There have been a slew of recent reports about devices constantly recording audio and cloud providers outsourcing to contractors transcription of audio recordings of private and intimate interactions,’ it explains.

The study highlights how Google admitted in 2017 that its Google Home Mini speaker, newly unveiled at the time, was guilty of eavesdropping on users.

Recordings found by Dutch broadcaster VRT revealed two years later that the same speakers were again set off after misunderstanding certain words. 

To make matters even worse, the devices were discovered listening in on private, sometimes intimate conversations. 

Recordings of pillow talk, couples fighting, confidential business calls and even discussions with children were being transcribed by Google contractors in what the tech giant said was an effort to understand different spoken languages. 

A team of researchers at Northeastern wanted to go beyond anecdotes, so they binged on 125 hours of Netflix, watching the streaming service’s original ‘Narcos’, NBC’s ‘The Office’, the WB’s ‘Gilmore Girls’ and other programs that contained ‘reasonably large amounts of dialogue.’

The researchers also used video feeds of the devices to see when they lit up, indicating they had been triggered and were recording in response to a word they heard. 

Researchers at Northeastern University set up several smart speakers in a controlled environment (pictured) to watch for when they would misunderstand a wake command, in a study that found the devices were eavesdropping on unknowing users

Specifically tested were the following:

– Google Home Mini, 1st generation, which uses the wake-up words ‘OK, Hey, or Hi, Google’

– Apple HomePod, 1st generation, which relies on the wake-up words ‘Hey, Siri’

– Harman Kardon Invoke by Microsoft, which uses the wake-up word ‘Cortana’

– Two Amazon Echo Dots, 2nd generation, which use the wake-up words ‘Alexa, Amazon, Echo and Computer’

– Two Amazon Echo Dots, 3rd generation, which use the wake-up words ‘Alexa, Amazon, Echo and Computer’

None of the devices was found to have been constantly recording conversations. ‘The devices do wake up frequently, but often for short intervals (with some exceptions),’ the report says. 

‘The Office’ and ‘Gilmore Girls’ were found to be responsible for the majority of activations. 

‘These two shows have more dialogue with respect to the others, meaning that the number of activations is at least in part related to the amount of dialogue,’ the report says. 

NBC’s ‘The Office’ was among the shows that set off smart speakers most often in a Northeastern University study that discovered the devices eavesdropping on unknowing users

Shows like the WB’s ‘Gilmore Girls’, which have more dialogue than others, were found in a Northeastern University study to set off smart speakers, which recorded unknowing users
