May 11, 2018 5:28 pm

Alexa and Siri can hear hidden commands that you can’t, researchers warn

Researchers say voice assistants may be able to hear commands not audible to humans.

Elaine Thompson/AP

Voice assistant technology used in Amazon Alexa and Apple’s Siri is growing in popularity. But as that happens, consumers and researchers are finding flaws in the devices.

The latest one, found by researchers at the University of California, Berkeley, may leave some users nervous.

READ MORE: Amazon Echo mistakenly orders cat food after hearing TV commercial

Researchers Nicholas Carlini and David Wagner say the devices can be tricked into following commands that the human ear cannot pick up, because the audio sits at frequencies outside the range of human hearing.

That means a seemingly normal song could be embedded with phrases or commands that the devices can pick up — but humans can’t hear.

WATCH: Amazon’s Alexa looking to become a personal health tool


The method is dubbed a “dolphin attack,” because dolphins can hear sounds that humans cannot.

The researchers tested this by using speech recognition software to hide inaudible commands in other pieces of audio, such as music.

Using Mozilla’s DeepSpeech speech-to-text software, they were able to hide the phrase “OK Google, browse evil dot com” inside a recording of someone talking.

They also embedded other commands into music clips.
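The core idea — a change to the audio far too small for a listener to notice, yet large enough to flip what the machine “hears” — can be sketched in toy form. The snippet below is purely illustrative and is not the researchers’ actual gradient-based attack on DeepSpeech; the recognizer and the threshold values are invented for the example:

```python
# Toy illustration of an inaudible adversarial perturbation.
# A pretend "recognizer" keys on one simple feature (the mean sample value),
# and a constant offset far below audible loudness flips its decision.

def toy_recognizer(samples):
    """Pretend recognizer: reports a 'command' if the mean sample
    exceeds a tiny threshold, otherwise reports 'silence'."""
    return "command" if sum(samples) / len(samples) > 0.001 else "silence"

# Original audio: a zero-mean, alternating signal the recognizer ignores.
audio = [0.01 * ((-1) ** i) for i in range(1000)]
print(toy_recognizer(audio))       # prints "silence"

# Adversarial perturbation: a constant offset of 0.002 on a full-scale
# range of [-1, 1] — far quieter than anything a person would notice
# layered under music or speech.
perturbed = [s + 0.002 for s in audio]
print(toy_recognizer(perturbed))   # prints "command"
```

A real attack works the same way in spirit, but optimizes the perturbation against the full neural network so the transcription becomes an attacker-chosen phrase rather than merely crossing a threshold.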

READ MORE: Will voice-activated assistants make us spend more?

The findings are concerning, as they open up a greater possibility of audio security attacks.

“My assumption is that the malicious people already employ people to do what I do,” Carlini told The New York Times.

The researcher added that he’s confident he and his colleagues will eventually be able to attack any smart device.

WATCH: Alexa, Google Home, Echo and more — what to know about smart speakers

But Carlini explained their purpose is to flag the security problem — and then try to fix it.

Carlini isn’t alone in flagging the problem of smart devices hearing things (and acting on them) without actually being asked by users.

READ MORE: In just 3 years, more people might turn to Amazon’s Alexa for their banking

Consumers have reported similar problems.

In February, the British Advertising Standards Authority (ASA) asked for a Purina cat food TV commercial — which had prompted one man’s Amazon Alexa to place an order for the product — to be taken off air.

And in September 2017, an episode of the TV show South Park, which featured a character repeatedly yelling commands at the machine, triggered several Amazon and Google Home devices to place orders.

READ MORE: Amazon’s Alexa is randomly laughing at people, and the company is trying to fix it

In a February email to Global News, Amazon explained that purchases made through its smart speakers must be confirmed by customers before being processed.

But that’s not the only problem that’s been reported with the smart speakers.

Earlier this year, several Alexa owners reported that their voice-assistant devices were bursting into laughter for no apparent reason.

© 2018 Global News, a division of Corus Entertainment Inc.
