Is Your Smart Assistant Undermining Your Security?

Smart assistants have been appearing everywhere lately, whether in the office or at home. At this point, the novelty has worn off and they have become just another common appliance. And like any other appliance, they can be a bit annoying and hard to deal with. Today, we’re going to talk about what smart assistants actually hear and what that means for your security.

What Do Smart Assistants Actually Hear?

By now, you probably have a sense of how smart assistants work: a small device sits in your home or office and listens for a wake word. Once it hears that word, the assistant does whatever you ask next, whether that’s playing music, adding an event to your calendar, or reading something from the web. Here are the wake words used by the major assistants (with a simplified sketch of the listening loop after the list):

  • Amazon’s Alexa uses “Alexa,” “Computer,” “Amazon,” or “Echo,” based on your choice.
  • Google’s Home devices use “Okay Google” or “Hey Google.”
  • Apple’s Siri uses “Hey Siri.”
  • Microsoft’s Cortana assistant uses “Cortana” or “Hey Cortana.”

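To make the idea concrete, here is a minimal sketch of that listening loop. It is purely illustrative: real assistants match audio with on-device acoustic models rather than transcribed text, and transcribe_next_phrase and handle_request below are hypothetical stand-ins, not any vendor’s actual API.

    # Hypothetical sketch of a wake-word loop (illustrative only).
    # Real assistants use on-device acoustic models, not text matching.
    WAKE_WORDS = {"alexa", "computer", "amazon", "echo"}  # Amazon's selectable options

    def listen_forever(transcribe_next_phrase, handle_request):
        """Wait for a wake word, then hand the rest of the request off."""
        while True:
            phrase = transcribe_next_phrase()  # e.g. "alexa, play some jazz"
            words = phrase.lower().replace(",", "").split()
            if words and words[0] in WAKE_WORDS:
                # Only after the wake word is detected does the device start
                # recording and send the request to the cloud for processing.
                handle_request(" ".join(words[1:]))

The important part is that check: nothing is supposed to leave the device until the wake word is matched, which is exactly why a mistaken match matters so much.
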
Even though these wake words are fairly specific, the assistants sometimes pick up on something that sounds a little too close to them. You’ve probably seen someone get interrupted mid-sentence by their smart assistant by accident. Because of this, there can be some serious security concerns, and these false wake words have even inspired academic research.

The Research

In the paper “Unacceptable, where is my privacy? Exploring Accidental Triggers of Smart Speakers,” the researchers exposed a variety of smart speakers to prerecorded audio, including episodes of television shows like Game of Thrones and Modern Family, news broadcasts, and other professionally produced audio.

With this approach, they logged the terms that accidentally activated each assistant and compiled them into a list. Here are a few examples (a rough sketch of why near-homophones slip through follows the list):

  • Amazon’s Alexa devices responded to “unacceptable” and “election,” while “tobacco” could stand in for the wake word “Echo” and “and the zone” was mistaken for “Amazon.”

  • Google Home devices woke up to “Okay, cool.”
  • Apple’s Siri reacted to the phrase “a city.”
  • Microsoft’s Cortana could be activated by the word “Montana.”
  • Beyond English, the researchers found accidental triggers in German and Chinese as well. For example, the German phrase “Am Sonntag” (“on Sunday”) was confused with “Amazon.”

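As a rough illustration of why these near-homophones slip through, the short script below scores how similar each misheard phrase is to its wake word. Real devices compare acoustic features rather than spelling, so difflib’s string ratio is only a loose analogy; the phrase pairs come straight from the examples above.

    # Loose analogy: string similarity standing in for acoustic similarity.
    from difflib import SequenceMatcher

    def similarity(heard: str, wake_word: str) -> float:
        """Return a 0..1 score for how alike two phrases look."""
        return SequenceMatcher(None, heard.lower(), wake_word.lower()).ratio()

    pairs = [("Montana", "Cortana"), ("a city", "Hey Siri"),
             ("tobacco", "Echo"), ("Okay, cool", "Okay Google")]
    for heard, wake_word in pairs:
        print(f"{heard!r} vs. {wake_word!r}: {similarity(heard, wake_word):.2f}")

“Montana” and “Cortana” share six of their seven letters, which is roughly what the score reflects; spoken aloud, the two are even harder to tell apart.
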
What This Means for Privacy

The study is fascinating on its own, but the implications run deeper. Once an assistant wakes up, it records everything that is said afterward. That audio may also be transcribed and reviewed by people to check the assistant’s accuracy, and it’s fair to wonder what else it could be used for.

For example, say you’re on the phone with a coworker, talking about a client’s account. Mid-conversation, your assistant mishears its wake word, wakes up, and records the access credentials you read aloud. That recording now sits in the cloud, where someone could review it, and credentials are far from the only sensitive thing an assistant could pick up.

We’re not trying to scare you away from using smart devices, but we do want you to use them mindfully. Unfortunately, there’s no option to fully customize the words they listen for, so you’ll have to be careful about what you say around them. You might also consider muting the device’s microphone, or unplugging it altogether, before discussing sensitive information.

For more tips like this and to learn more about smart assistant security, keep tuning in to the MyTek blog.
