It’s time to play devil’s advocate for a little while. You’ve seen the reviews for all of the available virtual assistants. Maybe your favorite smart persona doesn’t have a product housing just yet, but you know it’s coming. Smartphone apps will give you a taste of the future, but it is just a taste. For a full-on smart home assistant (SHA) experience, you need to purchase a home appliance.
The Amazon Echo family is about to become a proper dynasty, an all-encompassing smart friend who feels like a member of the family thanks to her natural language inclinations. She’s equipped with thousands upon thousands of skills. The Google Assistant voice-activated wireless speaker is similarly decked out, with a synthetic IQ that can stream music, podcasts, audiobooks, and every other conceivable form of audible media. These cloud-powered abilities extend to other members of a burgeoning product family, so the Fire TV platform listens to Amazon Alexa, Chromecasts listen to Google Assistant, and so on, ad infinitum. As marvelous as all of this is, there’s more to talk about. These digital companions go beyond the capabilities you’d assign to a normal wireless speaker. Thanks to that Chromecast or Fire TV, there’s video content to command, and then there’s the smart home, an Internet of Things (IoT) that’s yours to control if you own compatible hardware.
Okay, this is the moment when we call in our devil’s advocate. What about privacy and security? That’s the first question, and it’s a good one. When you connect every part of your home to a computer interface, doesn’t your whole property then become hackable? Here’s a rebuttal, an answer that will show the smart home assistant’s uncompromising distaste for the hacker philosophy.
To Be Concerned or Not?
First of all, your natural inclinations are going to push your perceptions towards criminal activities. You’re remembering the warnings and cautionary tales you heard about computer hackers. The smart home is just one big computer with lots of distributed “things” connected to it via numerous data protocols, right? Perhaps, but that’s a tale for another chapter. Instead of tapping the digital perimeter to see if it has any chinks, you should first take a look at your virtual assistant and its penchant for listening to your every word. Of course, the voice-activated brain doesn’t record your every spoken word or eavesdrop on your conversations, but it does store your user data, including the requests and commands you voice throughout the day. That’s the crux of the issue, this potential privacy concern. Your voice-activated assistant does wake from its slumber, does run your words out to its remote servers, and it does store those words. But is this really a problem? That’s a question that’s likely to trouble the peaceful sleep of hardware designers, software programmers, police investigators, and lawmakers for many years to come.
Don’t Forget The Smart Speaker Wake Word!
Again, and this point must be reiterated, Alexa, Cortana, Google Assistant, and Siri are voice-activated listeners. They’re dead to the world until their Wake Word brings them into the land of the living. That means Siri isn’t crouching inside your iPhone, waiting to record your Game of Thrones spoilers. It also means Google Home and Alexa have absolutely no interest in either hearing or recording your out-of-tune renditions of the most recent U2 hit single. That Wake Word feature should be enough to relax the tension you’re holding in your shoulders, but this technology isn’t perfect. There’s always the chance, the barest chance, that you could awaken your virtual helper by mistake. It’s this scant possibility that’s absorbing the attention of certain authorities, for they perceive any possibility, no matter how small, as a potential case breaker. Imagine this plausible mode of thinking being used as leverage, as a means of forcing a tech company to release cloud-stored data. Never mind the lawbreakers, for the lawmakers could be the ones to take control of your private information, including all of your recorded utterances.
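The Wake Word gate described above can be pictured as a simple rule: everything heard before the trigger word is discarded, and only the request that follows it is sent to the cloud. Here is a toy Python sketch of that idea; the function name, word list, and data are all hypothetical illustrations, not a real device API:

```python
# Toy model of wake-word gating: nothing before the wake word is retained.
WAKE_WORDS = {"alexa", "echo", "computer"}

def process_audio(words, wake_words=WAKE_WORDS):
    """Return only the speech heard AFTER a wake word; everything else is dropped."""
    for i, word in enumerate(words):
        if word.lower() in wake_words:
            return words[i + 1:]   # the request that gets shipped to the cloud
    return []                      # no wake word: the device stays 'dead to the world'

# Idle chatter is ignored; only the post-wake-word request survives.
print(process_audio(["spoilers", "about", "game", "of", "thrones"]))  # []
print(process_audio(["alexa", "play", "u2"]))                         # ['play', 'u2']
```

The accidental-wake-up risk the authorities fixate on lives in that `if` test: a false positive there, and ordinary conversation slips through the gate.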
Smart Speakers and the Law!
In theory, what you’re hearing here is a worst-case scenario, but that’s not exactly true; there’s already a precedent for this conspiracy-level intrigue. Certain authority figures have asked for data from Alexa’s servers, which leads us to believe our favorite personal assistants could become tattletales, just like the ones described in that Orwellian novel known as 1984. However, this police state mentality, despite the words of some conspiracy theorists, doesn’t exist in the free world. Yes, the investigators in any criminal case must nobly seek every resource before they head to court. But if they do gain access to voice-activated data, what’s to stop a lawyer from making this request (demand?) again in the near future? Admittedly, the information won’t enter the public domain, but someone will still have access to it, which means an end to your absolute privacy if this invasive, perhaps lawful, entreaty is upheld.
In the end, you no doubt want to cooperate with law enforcement, to help solve murder cases and be a first-rate witness, but not at the cost of your own personal space. Google has fought, and continues to fight, this battle. Apple has taken the same stance, so iPhone data can’t simply be pulled into court whenever a case needs digital evidence. Amazon has followed suit, which is wonderful, yet you’re still left to wonder, “What can I do?” Happily, you’re not powerless. Take charge of your privacy right now by securing your Amazon Echo.
Privacy Concerns: Lock Down Your Amazon Echo
Security is an integrated feature. Locally, there’s very little stored inside your voice-activated Amazon Echo. That’s because it’s all out there, suspended in the cloud as floating ones and zeros. Remote storage like this will stymie the hackers. However, any legitimate request, perhaps one made under a court order, could force Amazon to release the data. Stop this short-circuiting of your personal information by following this next guide. First of all, conjure up the faithful smartphone App. The Alexa App has a History entry, accessed through the Settings menu: tap Settings –> History, and then take a breath as you scan the voice-activated commands. Every request uttered by you and your family is in here. Delete specific recordings, one at a time, by tapping the software-transcribed request. Alternatively, use your web browser to head on over to the “Manage Your Content and Devices” page. From here, you’ll quickly discover the Devices tab. Click on it, then select your Amazon Echo. Your decision to delete all recordings can be enacted here when you hit “Manage Voice Recordings,” although you’ll want to be sure before you tap that final confirmation pop-up.
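The two deletion paths above, one recording at a time in the App versus the all-at-once “Manage Voice Recordings” option, can be sketched as a tiny Python model. Everything here is a hypothetical illustration of the stored history, not Amazon’s actual data format or API:

```python
# Hypothetical model of a cloud-stored voice history (not Amazon's real API).
class VoiceHistory:
    def __init__(self):
        self.recordings = []          # list of (timestamp, transcription) pairs

    def store(self, timestamp, transcription):
        self.recordings.append((timestamp, transcription))

    def delete_one(self, transcription):
        """Selective deletion: like tapping a single transcribed request in the App."""
        self.recordings = [r for r in self.recordings if r[1] != transcription]

    def delete_all(self):
        """Bulk deletion: like 'Manage Voice Recordings' and its confirmation pop-up."""
        self.recordings.clear()

history = VoiceHistory()
history.store("09:00", "what's the weather")
history.store("09:05", "order a pony")
history.delete_one("order a pony")    # one embarrassing request gone
print(len(history.recordings))        # 1
history.delete_all()                  # the clean-slate option
print(len(history.recordings))        # 0
```

The model makes the trade-off in the next paragraph concrete: once `delete_all` runs, there is nothing left for the recognition software to learn from.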
Remember, all of these stored transcriptions are there for a reason. They help Amazon’s software designers improve their product. Alexa’s own natural language recognition capabilities also rely on this recorded input, so she may lose the ability to precisely recognize your voice if you delete this stored library. Still, if news headlines ever suggest virtual assistants will be forced to divulge their cloud-stored contents, you may just want to hit that button from time to time.
Privacy Concerns: Gagging Your Google Home
Incidentally, Google’s software engineers have added a near-identical feature to Google Home. Use it if you feel the spirit of paranoia throbbing in your frontal lobe. This particular electronic muzzle is activated by accessing the My Activity (myactivity.google.com) section of your Google account. In here, you can see recent text-based web browser search strings, manage your visited websites, and keep track of your voiced requests. The information is intuitively structured, with new searches showing up in chronological order. Target the Filter By Date and Product link, located under the search bar. Now place a check mark in the Voice and Audio box, mouse over the search icon, and activate the filter.
Every voice command recorded by your Google Assistant is pulled up by the search, with everything still locked into the chronological view. Select a particularly embarrassing request, perhaps that query you made about Taylor Swift tickets (What were you thinking?), delete it, and reload the page. Hey presto, the transcribed request is gone. If you want to expand this ability to cover entire days, click on the three vertical dots to the right of each day. Alternatively, if you want a clean voice-activated slate, click the three dots to the right of the search results, select Delete, and enjoy a virginal voice-activated assistant once more. Happily, this convenient path to an amnesiac virtual assistant also exists in the Google Home App. To get there, open the App, access Settings –> More Settings –> My Activity, then follow the same guide to either selectively delete content or initiate a wholesale deletion policy.
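The filter-then-delete workflow above boils down to two operations: keep only the “Voice and Audio” entries in date order, then drop a whole day’s worth of records. A minimal Python sketch of that logic follows; the record layout and function names are assumptions for illustration, not Google’s actual My Activity data model:

```python
from datetime import date

# Hypothetical model of the My Activity list (not Google's real data format).
activity = [
    {"date": date(2017, 6, 1), "product": "Search",          "entry": "u2 tickets"},
    {"date": date(2017, 6, 1), "product": "Voice and Audio", "entry": "taylor swift tickets"},
    {"date": date(2017, 6, 2), "product": "Voice and Audio", "entry": "weather today"},
]

def filter_voice(records):
    """Mimic checking the 'Voice and Audio' box: voiced requests only, newest first."""
    voice = [r for r in records if r["product"] == "Voice and Audio"]
    return sorted(voice, key=lambda r: r["date"], reverse=True)

def delete_day(records, day):
    """Mimic the three-dots Delete for an entire day of activity."""
    return [r for r in records if r["date"] != day]

print([r["entry"] for r in filter_voice(activity)])
# ['weather today', 'taylor swift tickets']
activity = delete_day(activity, date(2017, 6, 1))
print(len(activity))  # 1: only the June 2nd request survives
```

Note that the text search entry is untouched by `filter_voice` but swept away by `delete_day`, which is exactly why the per-day delete is the blunter instrument.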
Security Tips for Every Smart Home Assistant Owner
Consider this chapter a wake-up call or a reality check, whichever term you prefer; it was never meant to create alarm. There are no immediate security concerns to worry about, so stop giving your virtual assistant the cold shoulder. If feelings of paranoia should persist, mute the microphone by tapping the small button on top of your Amazon Echo. On the rear of a Google Home, the same function is found on a large button, so silencing voice-activated companions isn’t an issue. An “OK Google, Turn off the Microphone” or “Alexa, Mute” also does the job. Remember, Wake Words are easy to change on Amazon’s smart platform. If the little neighbor girl is called Alexa, you might want to consider changing your own Wake Word to “Computer” or “Echo,” especially during the summer months. Little Alexa is playing with her friends in the garden, your windows are open, and your Echo is responding to every call for the little munchkin to come in for supper. Frankly, even if the kid is a boy, a little Alex, it might still be a good idea to make the change.
Common sense is still your strongest ally when technology gets uppity. For instance, if you’re worried about little Alex or Alexa playing outside, don’t place your Amazon Echo near an open window. Better yet, dip into the Settings panel, all the way down to the Sounds and Notifications section. While in here, activate feedback notifications so that you’ll hear a tone whenever Alexa wakes from her slumber. Finally, as skills and Actions are developed by banking institutions, turn on any two-step authentication settings or deal with the consequences of a child who accidentally orders a dream toy. There’s nothing worse than opening the front door to see the delivery guy has brought your tiny tyke a new pony because your voice-activated companion listens to everyone in the house with equal devotion. Fortunately, there’s an integrated feature within Amazon’s smart home assistant that’s designed for just such times. To take care of inadvertent purchases, turn off voice purchasing in Alexa’s web portal (alexa.amazon.com) or the mobile App. A familiar journey through the Settings panel and a scroll down to Voice Purchasing will do the trick. In here, you have the option to either disable the feature entirely or assign a 4-digit confirmation code.
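The Voice Purchasing options above amount to two switches: the feature itself, and an optional 4-digit confirmation code. The sketch below models that decision in Python; the function, the example PIN, and the flag are all hypothetical illustrations of the logic, not Amazon’s actual implementation:

```python
# Hypothetical sketch of voice-purchasing checks (not Amazon's real code).
VOICE_PURCHASING_ENABLED = True
CONFIRMATION_CODE = "4261"   # assumed example 4-digit code

def authorize_purchase(spoken_code, enabled=VOICE_PURCHASING_ENABLED,
                       pin=CONFIRMATION_CODE):
    """A purchase goes through only if the feature is on AND the spoken PIN matches."""
    if not enabled:
        return False             # feature disabled under Settings -> Voice Purchasing
    return spoken_code == pin    # the tiny tyke doesn't know the code

print(authorize_purchase("4261"))                 # True: the parent confirms
print(authorize_purchase("1234"))                 # False: wrong code, no pony
print(authorize_purchase("4261", enabled=False))  # False: feature switched off
```

Disabling the feature is the stricter setting, since even a correctly spoken code is refused; the PIN route trades a little security for convenience.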
Arguably, this is a slippery slope into hacker territory. All banking transactions are heavily encrypted, and then there’s the Wake Word and a 4-digit code protecting your shopping portals. Every possible security protocol is in place in the Amazon shopping ecosystem. Google Express, the search giant’s own home shopping platform, naturally mirrors this security model by leveraging the power of one of the world’s most secure online infrastructures. Still, stop for one minute and think before putting all of your eggs in a home shopping or banking basket, for there are other ways to circumvent near-impenetrable online defenses. We leave you with this thought: all the passcodes and authentication protocols in the world are next to useless if an immoral neighbor is standing within earshot of your virtual assistant as you sound out your banking passcode or your voice-purchasing PIN.