Alexa eavesdropped on my family’s gossip, conversations about new jobs and insurance details

A MailOnline investigation into how much personal information Alexa is recording and storing on its users has revealed the smart assistant eavesdrops on housemates’ gossip, private conversations about insurance policies – and even the family dog. 

Amazon insists Alexa can only be activated when the allocated ‘wake word’ is uttered – which can be Alexa, Computer or Echo. 

The tech giant – along with Apple’s Siri and, until recently, Google’s Assistant – says it saves every single interaction a person has with the device to improve the service – with some ‘unintentional’ snippets also being recorded if it mistakes another noise for a ‘wake word’.

However, evidence seen by MailOnline shows this cannot be the case, or the process is fundamentally flawed, as a host of sounds and conversations were recorded without a clear or legitimate wake word being uttered – some when there was not even a human nearby. 

Smart assistants are now commonplace in many homes but users remain unaware of the treasure trove of private data they store – and that they can access it themselves to hear what has been recorded from their own everyday lives.   

A MailOnline investigation into these ‘secret’ archives has revealed eerie snippets of users’ friends, families and children being recorded while they were completely unaware – and without a clear or legitimate wake word being uttered.

One user found his Alexa repeatedly activated to record the same guest in their house gossiping about work colleagues, while another was recorded in a private discussion about their insurance policy – and another about their dream job. 

Bizarrely, in one household, Alexa seems to have developed an obsession with the family dog – waking up 13 times to record it barking. 

In a worrying twist, this was often when there was no one in the house who could possibly have ordered it to activate. 

You can find out what your device knows about you and what it has been listening to by following the steps below – and read on to see what we unearthed. 

SCROLL DOWN TO SEE HOW YOU CAN LISTEN TO YOUR OWN ‘SECRET’ ALEXA ARCHIVE 

Amazon’s device – along with Apple’s Siri and, until recently, Google’s Assistant – saves every single interaction a person has with the device, with some unintentional snippets also being recorded. The Echo and the Echo Dot now dominate the market

HOW TO FIND OUT WHAT YOUR ALEXA HAS RECORDED ABOUT YOU  

Open the Alexa app which the devices are synced to, or go to https://www.amazon.com/alexaprivacy. 

Select the icon in the top left corner – often dubbed the ‘hamburger’

Press ‘Settings’ at the bottom of the menu 

Select ‘Alexa Account’ located at the top of the menu 

Press ‘Alexa Privacy’ at the bottom of the menu 

In this section a range of options will appear in a different-looking menu – select ‘Review Voice History’

Here, the entries from all Alexa-enabled devices attached to an account will be listed in reverse chronological order, with the most recent at the top. 

To view all entries, select the ‘All History’ option from the drop-down menu and scroll through the pages. 

Entries that Amazon claims were recorded but not meant for Alexa are not transcribed; instead they read ‘Text not available – audio was not intended for Alexa’.

These can still be listened to by selecting the drop-down arrow on the right-hand side and pressing the play button, located on the left. 

For users who want to remove all trace of these recordings – pressing the ‘Delete All recordings for All History’ button will do so. 

There is currently no way of saving the data yourself and taking it off Amazon’s servers.  

Joe Pinkstone, 23, London – Owner of Echo Dot 3rd Generation and an Amazon Fire 7 tablet 

An early Christmas present, my Echo Dot sits on the coffee table in the flat I share with two friends. 

We use it mainly to tell us when our food is cooked, what the weather is expected to be like for the upcoming weekend and to play the radio while we wait. 

When trawling back through my own archives I was stunned to hear just how many snippets from my day-to-day life the device is recording – even when I can’t possibly have triggered it to ‘wake up’ by saying the word Alexa first.

It has encroached an unnerving amount into the lives of the flat’s inhabitants and our visitors, and we didn’t even know it was happening.

A friend was picked up on two separate occasions gossiping about colleagues, and private conversations of my flatmate on the phone to family members were recorded with fantastic clarity – in some cases with recordings several seconds long. 

While the recordings won’t have meant much to anyone unconnected listening in, for some reason Alexa captured a lengthy rant by my Liverpool FC-supporting friend as he went about his weekly ritual of watching his beloved team. 

While Alexa struggles to pick out individual voices from crowds of people talking at once, it often wakes up and captures the babble of large groups. It seems she does have a sense of humour, at least.

The device captured the moment a friend said: ‘You’re only little though, Joe’ as part of a rather cruel tête-à-tête during his visit.  

Sifting through the archive available in the Alexa app, many of the entries were labelled as ‘Text not available – audio was not intended for Alexa’. But it still recorded anyway. And saved it, permanently. 

Fortunately, there was nothing incriminating or sensitive about the information captured by my Alexa-enabled devices.  

But it did give me a strange chill listening to the snippets of conversation that it knew, by its own admission, it was not supposed to be listening to. 

Sally and Paul Pinkstone, both 53, Northamptonshire, England – Owners of an Echo Dot 2nd Generation 

The Echo sits on the counter in the kitchen of our home, an anniversary gift from last year which serves to play music and give daily news briefings.  

Going through the data was eye-opening – we had no idea it was recording such a host of noises and conversations, or that it was picking up and storing so many random moments. 

Since being set up in July it has clocked 13 separate occasions on which the family dog, Cocoa, barked – be it at the postman, the window cleaner or in an irrational response to passing birds. 

Worryingly, there was often nobody in the house who could possibly have triggered it. 

It also woke up several times on Christmas Day and Boxing Day when there were big crowds in the room – mainly recording gibberish, but these recordings could hide any manner of conversation not intended for outside ears.

Some of the recordings, however, were far less obscure and infinitely more concerning.  

A private conversation about insurance policies and the level of excess was recorded and saved by Alexa – despite no wake word ever being consciously uttered. 

It also captured Paul discussing renovating our daughter’s bedroom, a slight disagreement about where the pots should go, and a weighing-up of the pros and cons of going for a new job.  

Stephen Matthews, 24, London – Owner of an Echo Dot 3rd Generation and an Echo 2nd Generation  

I have two Alexa-enabled devices, the Dot in the living room and the larger Echo in the kitchen. Between the two of them they capture pretty much any noise in the flat. 

Our archive revealed it is used almost exclusively for tuning in to BBC Radio 1, but it did capture some moments of gossip which we had no idea it was listening to.  

While hosting a pancake party we were discussing the love-life of a friend and the record captured my girlfriend, Lizzy, saying: ‘This is my favourite type of topic!’ in response to a juicy piece of gossip.

While it didn’t capture anything that could cause an issue, it is disconcerting that it has the ability to save these private moments without us waking up the device.  

Annie Palmer, 25, New York – Owner of an Echo Dot 2nd Generation 

I recently started using Google Home devices throughout my home, but before I made the switch, we had Echo Dots in a few different rooms. 

When I reviewed my archived recordings, most of it wasn’t suspect, ranging from typical voice commands like checking the weather and asking it to play music on Spotify, to my more bizarre requests, such as asking Alexa to sing me Happy Birthday. 

However, there were a few audio snippets that gave me pause. 

A snippet of a phone conversation from March was picked up and recorded by my Echo Dot, while other similarly inane phrases from a chat in the kitchen with my boyfriend were archived. 

In other cases, Alexa recorded short clips of me talking when the phrase started with certain words like ‘Do you’ or ‘How do you,’ which Amazon’s assistant must have assumed preceded a request or command. 

It’s widely known that Alexa will record audio snippets when it thinks it hears a ‘wake word,’ which users can also change to ‘Computer’ in their device settings, but in many of these cases, the word ‘Alexa’ wasn’t uttered at all. 

After reviewing these recordings, I felt compelled to take back control over my data, by downloading it for my records then wiping it from the device. 

Unfortunately, Amazon doesn’t give users the option to download their data. For all the rhetoric from Amazon and other tech giants about safety and security, it sure doesn’t make it easy for users to take back their privacy. 

HOW DOES ALEXA WORK? 

Any time audio is sent to the cloud, a visual indicator appears on the Echo device – a light ring on Amazon Echo will turn blue or a blue bar will appear on Echo Show.  

Amazon also says that voice recordings are kept until a customer chooses to delete them. 

The recordings are used to increase the diversity with which Alexa is trained to help it better understand customer requests.

For example, differentiating between YouTube and U2 and using historical context, such as the Olympics, to know what the user is referring to.   

Amazon maintains the device is not activated until the wake word is said: this can be configured to be Alexa, Echo or Computer. 

It also records when the microphone button is manually pressed.  

Ashley Clements, 29, London – Owner of an Echo Dot 3rd Generation 

I got the latest Echo Dot in November and it is used mainly for playing music and providing little chunks of information while I’m getting ready. 

In the archive, worryingly, was a private conversation about someone with a similar name to the wake word – which, although understandable, is still concerning, as it is several seconds of a private discussion that should never have been saved.  

What is baffling, however, is how it recorded several snippets of football matches despite not being near the TV and there being no players called Alexa in the line-up. 

For example, it recorded a section around the 60th minute of the game between Manchester City and Leicester City last week, with Gary Neville’s commentary stored in my back-catalogue of data. 

Inexplicably, it also picked up on a political TV show from April where someone said: ‘Do speak up so we can hear you.’ 

The most surreal moment in the record, though, was the time I caught Alexa listening in when it shouldn’t have been. 

It activated when I said ‘what’ – I can only assume it thought it heard ‘Alexa’. It then continued to record even after I clocked its blue ring. The next entry in the archive reveals me being rather befuddled and saying: ‘Oh, Alexa just came on’.

It recorded this entry but fuzzed it out as it considered it to be audio that was not intended for Alexa. If it knows this, it seems odd that it would continue to record and save it. 

Chillingly, the main takeaway from this is not what it has recorded, but what it could record. 

If it is easily triggered and taking brief snippets of random conversation, there’s nothing to say it won’t catch something more significant in the future. 

Under ‘Alexa Privacy’ a range of options will appear in a menu called ‘Review Voice History’. A host of data entries emerges, marking every time any Alexa-enabled device synced to the account was activated and used. It also reveals what was said. 

Joseph Curtis, 30 – Owner of an Echo Dot 3rd Generation  

Alexa came into my life two months ago and I’ve been having great fun with it and trying out its range of abilities. 

For example, asking the omniscient device ‘how many roads must a man walk down?’

It did provide the suitably correct response, saying: ‘The answer, my friend, is blowin’ in the wind.’ 

What is considerably less amusing, however, is how it already seems to be picking up on things it shouldn’t. 

A discussion in another room was detected as I discussed TV programmes and whose Netflix account they had been found on – something Alexa had no business listening to. 

It seems to have been activated by a female voice saying ‘I watched that once’, which it must have interpreted to be ‘Alexa’.  

Why does Amazon keep all this data?    

Amazon did not comment directly on the investigation when approached by MailOnline, but did send a comment via a spokesperson regarding the privacy settings of its voice recordings. 

The firm claims customers have ‘complete control’ of their recordings and can delete them at any time. 

While this is true in as much as users can delete their voice history relatively easily, it assumes they know of the archive’s existence in the first place. 

It remains to be seen what real benefit there is for Amazon in keeping this data. 

The statement reads: ‘Alexa is always getting smarter, which is only possible by training her with voice recordings to better understand requests, provide more accurate responses, and personalise the customer experience. 

‘Training Alexa with voice recordings from a diverse range of customers helps ensure Alexa works well for everyone. 

‘Customers have complete control over the voice recordings associated with their Alexa account. They can review, listen, and delete voice recordings one by one or all at once in the Alexa app or at https://www.amazon.com/alexaprivacy.’

Following the recent revelation that Amazon’s recordings taken from Alexa are being listened to by human employees at the firm’s headquarters, there is a heightened sensitivity towards the use of private audio clips. 

It was also found the location of some users was available to these Amazon staff members. 

The introduction of legislation and stricter guidelines may be the only thing that could stem the advance of big tech’s snaking tendrils into our private lives.

But now they are already in our homes and handheld devices, it may be too late to eradicate them completely. 

WHY ARE PEOPLE CONCERNED OVER PRIVACY WITH AMAZON’S ALEXA DEVICES?

Amazon devices have previously been activated when they’re not wanted – meaning the devices could be listening.

Millions are reluctant to invite the devices and their powerful microphones into their homes out of concern that their conversations are being heard.

Amazon devices rely on microphones listening out for a key word, which can be triggered by accident and without their owner’s realisation. 

The camera on the £119.99 ($129) Echo Spot, which doubles up as a ‘smart alarm’, will also probably be facing directly at the user’s bed. 

The device has such sophisticated microphones it can hear people talking from across the room – even if music is playing. 

Last month a hack by British security researcher Mark Barnes saw 2015 and 2016 versions of the Echo turned into a live microphone.

Fraudsters could then use this live audio feed to collect sensitive information from the device.   

 
