In yet another blow to privacy on smart devices, Amazon has confirmed that Alexa doesn’t always delete your voice-recorded data.
The retail giant revealed this week that the handy digital assistant stores all voice recordings and interactions unless they’re manually deleted.
The revelation came in a letter responding to Delaware Democratic Senator Chris Coons, who called on CEO Jeff Bezos in May to explain the company’s privacy practices related to Alexa.
Coons’ questions stemmed from a CNET report that found Amazon keeps transcripts of voice interactions with Alexa. He pressed the company on how long it holds onto those voice recordings and transcripts, and what it does with the data.
“We retain customers’ voice recordings and transcripts until the customer chooses to delete them,” the letter from Amazon’s vice president of public policy, Brian Huseman, reads.
Huseman said the transcripts of requests and Alexa’s responses are deleted from “primary storage systems,” but not necessarily everywhere.
He said the company is engaged in “an ongoing effort to ensure those transcripts do not remain in any of Alexa’s other storage systems.”
“We do not store the audio of Alexa’s response,” he went on. “However, we may still retain other records of customers’ Alexa interactions.”
In other words, even after hitting delete, your history might not be gone entirely.
Amazon says that what remains are records of transactions or requests, such as food delivery or ride-sharing orders, and not voice data or transcripts.
Other requests like alarms and reminders are also saved.
Huseman said scrapping that data would hinder the customer experience.
“Customers would not want or expect deletion of the voice recording to delete the underlying data or prevent Alexa from performing the requested task,” he wrote.
The company claims these personal details are what help Alexa get to know you better.
It’s also why the voice data Amazon collects is not anonymized, Huseman said, which means the data Alexa tracks is tied to you, uniquely, as a user.
“We use the transcripts to provide a transparency to our customer about what Alexa thought it heard and what Alexa provided as a response,” Huseman wrote.
“… Customers can play the actual audio that was streamed to the cloud, review the text transcript of what Alexa thought the customer said, and review Alexa’s response.”
The U.S. senator wasn’t exactly satisfied with the response.
Coons said it “leaves open the possibility that transcripts of user voice interactions with Alexa are not deleted from all of Amazon’s servers, even after a user has deleted a recording of his or her voice.”
“What’s more,” he went on, “the extent to which this data is shared with third parties, and how those third parties use and control that information, is still unclear.”
What does this mean?
Stephanie MacLellan, a cybersecurity and digital policy analyst in Waterloo, Ont., says it’s hard to define the implications.
MacLellan said the letter “doesn’t say nearly enough about how they might use the data” now or in the future.
She noted that as the sophistication of these devices grows, the overlap of where and how the data is used has “endless” possibilities.
“One potential example is, if someone uses Alexa to set alarm clocks and order alcohol or fast food, and Amazon stores that information, it’s possible to use that data to create a profile of their health status, which could potentially be sold to health insurance companies and affect how much they pay for insurance,” she said.
The concerns stretch to third parties, as well, MacLellan said.
“They seem to be leaving a lot of the responsibility in the hands of third-party skill developers, and we might not know how responsible their individual privacy policies are,” she said.
When reached for comment about the data storage, Amazon pointed to the details laid out in Huseman’s letter.
But this isn’t the first time Amazon has found itself in a privacy predicament.
Back in April, a Bloomberg investigation found that thousands of Amazon employees have access to customers’ voice recordings and text transcripts of interactions with Alexa. Amazon later said employees don’t have “direct access” to information that can identify the person, but that humans review some data to improve device functionality.
“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesperson said in a statement to Global News.
The company said it has “strict technical and operational safeguards” to prevent “abuse of the system.”
“We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience.”
Consumers aren’t all that comfortable
According to a survey by Consumers International and the Internet Society, 63 per cent of participants said they found it “creepy” that smart devices collect their personal data and track their behaviours.
The security concerns are enough to deter more than a quarter (28 per cent) of people who do not already own a device from buying one.
Those respondents who do own one aren’t all that savvy at protecting themselves.
The survey found that only 50 per cent of respondents knew how to change their device settings to disable personal data collection.
Amazon is leaving a good chunk of privacy protection up to the Alexa user — something MacLellan sees as unfair.
Alexa is meant to make things convenient for the consumer, she said, and requiring users to manually delete their data runs counter to that promise.
“It puts an unreasonable amount of responsibility on consumers,” MacLellan said.
“Research shows most people don’t read privacy policies or terms of service… so it creates a false idea of consent,” she continued.
“Legally, companies can point to these agreements to say that users consent to their data being stored or shared, but users can’t truly consent to this if they don’t really understand what they are consenting to.”
So what can users do?
Amazon’s Alexa has some features that consumers can alter to better protect their data, including keeping the device on mute when not in use and setting up a voice PIN for purchases.
MacLellan’s advice is simpler.
“Do a little bit of homework on the trade-offs of using a device like Alexa,” she said. “It’s always a trade-off and for some people, the benefits may be worth the potential risks to their privacy.”