Voice-activated assistants vulnerable to hackers

Voice-activated assistants like Amazon Alexa, Google Home and Apple’s Siri are prime targets for sabotage, big and small

By Brennen Schmidt
ALEUS Technology Group
and Allan Bonner
Troy Media columnist

Whether they listen to radio in digital or analogue form, fans know the rush of a call-in contest. A station’s telephone lines can soon become jammed with callers hoping to be “lucky caller number five.”


While this may sound innocent, in the heyday of radio, these contests could shut down the telephone grid for a neighbourhood or even a small city for a while. Good luck if you needed help.

Authors Richard A. Clarke and R.P. Eddy document a real case in their book Warnings. They point to Christmas 2015, when a Ukrainian city of 250,000 people lost power. A hacker got into the electric utility’s control board and turned off the breakers. Many residents stumbled in the dark to phones to call for help or clarification.

Most couldn’t get through – not because of the volume of their calls but because of robo-calls from outside Ukraine. This low-tech follow-up to high-tech terrorism can have equally disastrous consequences and may be even more dangerous.

Here’s why:

We’ve known for a long time that a standard terrorist technique is to trigger a small event (fire, bomb, crash) and stampede victims to an apparently safe spot. Then the terrorists take advantage of the crowded scene and set off a larger event.


An even less obvious technique today would be to let the first event happen naturally and then take advantage of it.

Think this is part of a far distant future? Think again.

Voice-activated assistants – including Amazon Alexa, Google Home and Apple’s Siri – are prime targets. And with more of us opting to buy the latest and greatest ‘smart home’ technology, we’re inching ever closer to a similar event, this time involving several technologies at once.

That’s because each of these new devices can execute a host of commands using normal human speech. All users need to do is say a trigger word or phrase. The device blinks or lights up, then listens for a command.

In Amazon’s case, saying the word “Alexa” activates the digital assistant’s listening capabilities. From there, the user can ask Alexa to turn off specific items connected as part of the Internet of Things. Today, this includes lights, security systems and appliances in our homes. Alexa can do even more, including ordering items online (through Amazon, of course).
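The mechanics described above can be sketched in a few lines of code. This is an illustrative simplification, not how any real assistant is implemented (real devices use on-device speech models, not string matching), but it shows the key point: the trigger logic checks only *what* was said, never *who* said it.

```python
# Illustrative sketch of a wake-word loop. The names here (WAKE_WORD,
# handle_transcript) are hypothetical, not part of any real assistant's API.
WAKE_WORD = "alexa"

def handle_transcript(transcript: str) -> str:
    """Return the command portion if the wake word is heard, else ''."""
    words = transcript.lower().split()
    if WAKE_WORD in words:
        # Everything after the wake word is treated as the command --
        # no check of who (or what) is speaking.
        idx = words.index(WAKE_WORD)
        return " ".join(words[idx + 1:])
    return ""

print(handle_transcript("Alexa turn off the lights"))  # -> "turn off the lights"
print(handle_transcript("just chatting"))              # -> ""
```

Note that a voice on the radio, a television ad or a stranger in the room would all pass this check equally well – which is exactly the vulnerability the column describes.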

In the not-so-distant future, voice commands could be given via similar devices that control computers in our cars, or control traffic lights in our cities.

In January 2017, a little girl reportedly ordered a doll house and cookies using a digital assistant. The girl and her family soon became an Internet sensation. The transaction resulted in a charge to the credit card on file, with delivery of the items a short time later. While this may seem humorous, the girl’s parents learned a valuable lesson when it comes to securing devices. They’ve since implemented parental controls – a security code must be provided to authorize specific instructions.

Let’s turn back to radio.

Imagine someone going on the air with the intent of activating a command. The command from the radio could trigger action in a dozen devices. Even if there were parental controls in place on the affected devices, each would still be constantly listening for the word or phrase that would cause it to listen for a subsequent command.

Now imagine if loved ones came to rely on this technology for medical monitoring purposes. If their device happened to be triggered, it might focus on carrying out an action directed by the person on the radio, rather than doing its job of catering to its human.

The combination of interruptions and misdirection could have disastrous consequences. Worst of all, we’re still not at a point where these digital assistants can distinguish their owner’s voice from anyone else’s.

They’re simply programmed to listen for a trigger word, no matter who – or what – it comes from. Sure, there’s a mute button, but who’s going to be diligent enough to turn it on and off regularly? These devices are all about convenience.

They’re just doing their job, exactly what they were programmed to do.

It’s time to begin a conversation with legislators and regulators to see how we can prevent such things from happening. Is this a matter for privacy legislation in Canada? Should legislation about the sale of goods apply? In the U.S., is it interstate commerce? Can the U.S. Federal Communications Commission (FCC) or Canadian Radio-television and Telecommunications Commission (CRTC) get involved?

While these technologies may be convenient, they present a real and present danger. Left uncontrolled and unregulated, these devices won’t just fall victim to executing innocent commands. They may also be vulnerable to becoming zombies, sending massive amounts of data to targets, rendering telephones or other essential electronic systems unusable.

Voice activation makes access a lot easier for hackers, and it makes it harder for device owners to protect themselves. At least passwords could be made more complex with symbols and changed regularly. Hackers have to try hundreds of combinations once they get past the obvious ones (1234 or the word “password”).
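The password comparison can be made concrete with back-of-envelope arithmetic. The sketch below (the function name is ours, for illustration) shows how the worst-case number of guesses grows with the character set and password length – protection a spoken trigger word simply doesn’t have.

```python
# Back-of-envelope: worst-case brute-force search space for a password
# is (size of character set) raised to (password length).
def search_space(charset_size: int, length: int) -> int:
    return charset_size ** length

# A 4-digit PIN: only 10,000 possibilities.
pin = search_space(10, 4)

# A 12-character password drawn from upper- and lower-case letters,
# digits and common symbols (roughly 94 printable ASCII characters).
strong = search_space(26 + 26 + 10 + 32, 12)

print(f"4-digit PIN: {pin:,} possible guesses")
print(f"12-char mixed password: {strong:,} possible guesses")
```

A trigger word, by contrast, offers a search space of one: say it and you’re in.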

With voice activation, cyber criminals or mischief-makers only need say the word and they’re in your device. As with passwords, trigger words should be hard to simulate.

Dr. Allan Bonner, MSc, DBA, is a crisis manager based in Toronto. His forthcoming book is Cyber City Safe. Brennen Schmidt (BEd, Certified PR, CUA) is principal of the ALEUS Technology Group, a boutique digital communications firm in Regina.


The views, opinions and positions expressed by columnists and contributors are the author’s alone. They do not inherently or expressly reflect the views, opinions and/or positions of our publication.
