Amazon staff review thousands of audio recordings made by Alexa each day — including snippets of couples arguing and having sex — an investigation claims.
The clips were accidentally captured by the popular digital assistant — which mistook the noises for the commands it should be listening for — and sent off for analysis.
Staff at the tech firm review one in every 500 recordings made by Alexa, whether of deliberate commands to the assistant or accidental recordings.
According to a privacy expert, the revelation is a reminder of the extent of the personal information that the tech firm has on its users.
Amazon has an English-speaking team based in Bucharest, Romania, monitoring thousands of Alexa recordings daily, the Sun claims, along with similar setups in Boston, Costa Rica and India.
Members of these teams are allegedly privy to manifold personal moments accidentally captured by the digital assistant, including arguments, money talk, frank discussions of medical issues and the sounds of Alexa users having sex.
‘It’s been said that couples having sex and even what sounded like a sex attack have been heard by staff,’ a former Amazon analyst told the Sun.
‘There were times when I heard couples arguing at home and another when kids were trying to teach Alexa to swear.’
‘We were told to focus on Alexa commands but it was impossible not to hear other things going on.’
‘Amazon told us every one we were listening to had consented so I never felt like I was spying.’
It is estimated that Amazon’s Echo and Echo Dot smart speakers — on which Alexa operates — can be found in around 6.5 million homes in the UK alone.
‘All Amazon Alexa owners will likely be stunned by the news that Amazon employees can listen to intimate moments,’ Kaspersky principal security researcher David Emm told the MailOnline.
‘While digital assistants such as the Amazon Echo are supposed to be activated by a specific ‘wake-up’ word, time and time again recently these devices have hit the headlines for intruding on people’s privacy, so the question is, where does this end?’
‘These devices are meant to enhance our lives, not spy on us!’
On Twitter, various Alexa users report their smart speaker devices being accidentally activated by random sounds — or even for no apparent cause.
With the number of smart speakers in homes and businesses continuing to grow, Mr Emm added, consumers need to be made aware ‘that what goes on at home doesn’t necessarily stay at home.’
‘It is an alarming wake-up call to owners of all digital assistants that the vendor has access to a huge volume of your personal information,’ he said.
This information is of potential value not only to cyber-criminals who could conceivably gain access to the data, but advertisers as well.
‘Such information could be used nefariously, even by trusted parties,’ said Mr Emm.
‘Amazon, Google and any other manufacturer must act to prevent continued intrusions of privacy.’
‘They should make clear what data is collected and how it is stored and offer people the ability to opt out of such storage.’
‘We take the security and privacy of our customers’ information seriously,’ an Amazon spokesperson told MailOnline.
‘We label a fraction of one per cent (0.2 per cent) of customer interactions in order to improve the customer experience.’
This information, they explained, helps Amazon to train its speech recognition and natural language interpreting systems, with the aim of making Alexa better understand user requests.
‘We have strict technical and operational safeguards in place to protect customer privacy, and have a zero tolerance policy for the abuse of our system,’ the spokesperson continued.
‘Data associates do not receive information that can identify customers, access to internal tools is highly controlled, and customers can delete voice recordings associated with their account at any time.’
In addition, Amazon agents listening to recordings are bound by strict confidentiality rules, the Sun reported.
WHY ARE PEOPLE CONCERNED OVER PRIVACY WITH AMAZON’S ALEXA DEVICES?
Amazon devices have previously been activated when they’re not wanted – meaning the devices could be listening.
Millions are reluctant to invite the devices and their powerful microphones into their homes out of concern that their conversations are being heard.
Amazon devices rely on microphones listening out for a key word, which can be triggered by accident and without their owner’s realisation.
The camera on the £119.99 ($129) Echo Spot, which doubles up as a ‘smart alarm’, will also probably be facing directly at the user’s bed.
The device has such sophisticated microphones it can hear people talking from across the room – even if music is playing.
Last month a hack by British security researcher Mark Barnes saw 2015 and 2016 versions of the Echo turned into a live microphone.
Fraudsters could then use this live audio feed to collect sensitive information from the device.
The Sun’s investigation comes after similar privacy issues were revealed with Apple’s equivalent digital assistant, Siri — which is so sensitive that even the sound of a zipper being pulled can set it off.
‘There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, criminal dealings, sexual encounters and so on,’ an anonymous source reportedly told the Sun.
In particular, it has been suggested that Siri is especially disposed towards unintended activation on Apple Watch devices.
‘The regularity of accidental triggers on the watch is incredibly high,’ the whistleblower reportedly told the Sun.
‘These recordings are accompanied by user data showing location and contact details.’
‘You’ll hear a doctor and patient, talking, or people engaging in sex.’
A spokesperson for Apple told the Sun that, in the UK, less than one per cent of voice-activated recordings are listened to by staff.
‘User requests are not associated with the [user’s] Apple ID,’ they added.