
Alexa, are you alone? Amazon staff may be listening to your recordings

WATCH: Amazon staff can listen to private conversations through Alexa – Apr 12, 2019

Amazon staff can listen to commands and questions users pose to the Alexa voice assistant — and they sometimes do.

In a statement to Global News, issued after Bloomberg first reported the news, the company acknowledged that the conversations aren’t totally private.

“We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience,” Amazon said in the statement.

Amazon explained that the company uses samples collected to better train “speech recognition and natural language understanding systems.”

READ MORE: Alexa recorded one family’s conversations and sent them to a friend, without them knowing

Bloomberg reported Wednesday that Amazon has “thousands” of employees who are trying to improve Alexa’s speech recognition technology. They do this by listening to and transcribing recordings, often sharing them in internal chats.


The news outlet said it spoke anonymously with Amazon workers, who explained they had signed non-disclosure agreements that prevent them from talking about the program.

Amazon noted in its statement that voice recordings can only be sent back to staff if the consumer says a “wake word,” such as Alexa, Amazon, computer or Echo, which prompts the device to start listening.

WATCH: Amazon Alexa suffered a Christmas crash in Europe amid a surge in new users


“The device detects the wake word by identifying acoustic patterns that match the wake word. No audio is stored or sent to the cloud unless the device detects the wake word (or Alexa is activated by pressing a button),” the statement explained.
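As a rough illustration of what that statement describes, the sketch below only lets audio leave the device after an on-device detector matches a wake word. The function names here (detect_wake_word, upload_to_cloud) are hypothetical placeholders for this example, not Amazon's actual software.

```python
# Toy sketch of the gating Amazon describes: nothing is uploaded until the
# on-device detector hears a wake word. All names are hypothetical stand-ins.
from typing import Callable, Iterable, Optional

WAKE_WORDS = {"alexa", "amazon", "computer", "echo"}

def detect_wake_word(audio_frame: bytes) -> Optional[str]:
    """Placeholder for an on-device acoustic model that returns the matched
    wake word, or None when no wake word is present in the frame."""
    ...

def run_device_loop(frames: Iterable[bytes],
                    upload_to_cloud: Callable[[bytes], None]) -> None:
    streaming = False
    for frame in frames:
        if not streaming and detect_wake_word(frame):
            streaming = True            # wake word heard (or button pressed)
        if streaming:
            upload_to_cloud(frame)      # only now does audio leave the device
```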

But Amazon employees who spoke to Bloomberg said the devices were often triggered by sounds or words similar to the “wake words,” which meant recordings were collected unintentionally.


READ MORE: Amazon’s Alexa is randomly laughing at people, and the company is trying to fix it

They cited examples such as a woman singing in the shower or a child screaming. Two workers told Bloomberg they picked up sounds that appeared to be sexual assault.

Despite these reports, Amazon said in its statement that privacy is paramount to the company.

“We have strict technical and operational safeguards and have a zero-tolerance policy for the abuse of our system,” it said. “Employees do not have direct access to information that can identify the person or account as part of this workflow.”

Amazon does not explicitly tell customers that their voice recordings may be listened to and used, but it does note the practice in the FAQ section of its website.

WATCH: New Hampshire judge wants Amazon to turn over possible recording of double homicide


“…We use your requests to Alexa to train our speech recognition and natural language understanding systems. The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from a diverse range of customers helps ensure Alexa works well for everyone,” it reads.


The practice is not unique to Amazon. Apple notes similar practices in its iOS Security documents.

“A small subset of recordings, transcripts and associated data without identifiers may continue to be used by Apple for ongoing improvement and quality assurance of Siri beyond two years,” it reads.

Google’s privacy policy also says it uses “voice and audio information when you use audio features” on various devices and apps. It also notes on its Google Home FAQ that it may “save your conversations to make our services faster, smarter and more useful to you.”

Voice assistant privacy concerns

This is far from the first time that Amazon’s Alexa voice assistant has come under the microscope due to privacy concerns.

In May 2018, a Portland woman said her family’s Amazon Echo recorded her conversations and then sent them to a random contact without any human direction.

She said she only found out about the recording when she got a phone call from the person who received the recordings, an employee of her husband’s.

READ MORE: Amazon Alexa sent a user’s 1,700 audio files to a stranger due to ‘human error’

In December 2018, another user of the voice assistant, in Germany, gained access to more than a thousand recordings from a different user because of “a human error” by the company.


The customer had asked to listen back to recordings of his own activities made by Alexa, but he was also able to access 1,700 audio files from a stranger when Amazon sent him a link.

And the privacy concerns aren’t just about Alexa.

Researchers at the University of California, Berkeley, recently found that Alexa and Apple’s Siri can be tricked into following commands that are inaudible to the human ear because they are delivered at frequencies too high for people to detect.

WATCH: Google’s new Duplex personal assistant sounds human, but it’s not


That means a seemingly normal song could carry embedded phrases or commands that the devices can pick up but humans cannot hear. The researchers said the findings are concerning because they open the door to a wider range of audio-based security attacks.
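As a rough illustration of the concept only, and not the Berkeley researchers' actual method, the sketch below mixes a quiet command signal onto a carrier above the range of human hearing and adds it to an ordinary song. The sample rate, carrier frequency and amplitude are arbitrary assumptions for this example.

```python
# Illustrative only: shift a command signal above the ~20 kHz limit of human
# hearing and mix it quietly into a song. A toy sketch of the concept, not the
# researchers' technique; the constants below are assumptions.
import numpy as np

SAMPLE_RATE = 96_000   # high enough to represent near-ultrasonic content
CARRIER_HZ = 25_000    # above the range most adults can hear

def embed_inaudible_command(song: np.ndarray, command: np.ndarray) -> np.ndarray:
    """Amplitude-modulate `command` (values in [-1, 1]) onto an ultrasonic
    carrier and add the result to `song` at low volume."""
    n = min(len(song), len(command))
    t = np.arange(n) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    hidden = 0.05 * (1.0 + command[:n]) * carrier   # quiet, inaudible layer
    return song[:n] + hidden
```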


Another report by Norton found that consumers affected by cybercrime in 2017 were largely users of smart home interfaces and emerging security features.

The report said that of the 10 million Canadians impacted by cybercrime that year, over a third owned some kind of smart device they used for streaming content.

Smart speakers, including the Amazon Echo and the Google Home, offer consumers several options for streaming content through the devices.

—With files from Global News reporters Rebecca Joseph and Jessica Vomiero 
