Alexa, I Thought You Were Off – Are You Still Listening?

Posted on May 29th, 2018

Alexa, I thought you were off. Are you still listening to me? No really, are you?!

In a May 25th New York Times article, Niraj Chokshi discussed a longtime warning from privacy advocates: that smart home devices and virtual personal assistants can record and share our most private thoughts and conversations. “They’re always listening. They’re on the internet. But what happens when digital assistants like Alexa go rogue? Could they share our private conversations without our consent? Privacy advocates have long warned this could happen, and now it has.”

The story revolved around a woman in Portland, Oregon, who told a television news station that her Amazon Echo device had recorded her conversation and then shared it with one of her husband’s employees in Seattle. Well, that’s more than a little concerning! Amazon responded by saying that the “Echo woke up due to a word in background conversation sounding like ‘Alexa.’ Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
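To see how a chain of individually rare misinterpretations can add up to a message being sent, here is a minimal Python sketch of the confirmation flow Amazon described. Every name and probability in it is made up for illustration; this is not Amazon’s actual wake-word or speech-recognition code, just a toy model of the sequence of steps in their statement.

```python
# Hypothetical illustration of the confirmation chain Amazon described.
# The function names and the 1% false-positive rate are invented for this
# sketch; they only show how several rare mishearings in a row can end
# with an unintended message being sent.

import random

def mishear(background_audio, expected_phrases):
    """Pretend speech recognizer: very occasionally mistakes background
    chatter for one of the phrases it is listening for."""
    if random.random() < 0.01:  # rare false positive
        return random.choice(expected_phrases)
    return None

def alexa_session(background_audio, contacts):
    # Step 1: wake word falsely detected in background conversation
    if mishear(background_audio, ["alexa"]) is None:
        return "device stays asleep"

    # Step 2: background speech misheard as a "send message" request
    if mishear(background_audio, ["send message"]) is None:
        return "woke up, but no request recognized"

    # Step 3: after "To whom?", background speech misheard as a contact name
    contact = mishear(background_audio, contacts)
    if contact is None:
        return "asked 'To whom?' and gave up"

    # Step 4: after "[contact name], right?", background speech misheard as "right"
    if mishear(background_audio, ["right"]) is None:
        return f"asked '{contact}, right?' and gave up"

    return f"message sent to {contact}"  # four rare errors in a row

print(alexa_session("dinner conversation", ["a coworker in Seattle"]))
```

Run repeatedly, the sketch almost always stops at step 1, which is the point of Amazon’s “unlikely” framing; the trouble is that “almost always” still leaves a nonzero chance of reaching the final line.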

Of course, terms like “unlikely” and “rare” don’t really cut it when it comes to private and confidential personal information. The recording and distribution of confidential information takes this concern to an entirely new level. For those who remember, does this seem vaguely reminiscent of HAL (the HAL 9000 computer in 2001: A Space Odyssey)? In that groundbreaking movie, HAL is an artificial intelligence computer which (or is it who) goes rogue and kills astronauts to protect itself. And though I don’t think my Echo is out to get me (yet), there are things you can do to better protect your privacy.

Amazon’s site offers a variety of Alexa history options and settings. You can view a transcript and listen to interactions with your Alexa device, and you can delete specific recordings or all of them. Note that Amazon says that when you use an Alexa device, it keeps the voice recordings associated with your account to improve the accuracy of the results provided, and that deleting these recordings may degrade the user experience when using voice features.

To listen to your dialog history in the Alexa app:

  1. Go to the menu and select Settings.
  2. Scroll to the General section and select History.
  3. Select an interaction from the list, and then select the Play icon to listen to the interaction (note that this did not work for me when I tried it).

To delete individual recordings, select Delete voice recordings. This removes the audio files, as well as the Home screen cards related to that interaction. If you only want to remove a Home screen card in the Alexa app, find that card on the Home screen and select Remove card.

To delete all of your interactions:

  1. Go to Manage Your Content and Devices on the Amazon website.
  2. Select the Your Devices tab.
  3. From the list of devices registered to your Amazon account, select your Alexa device.
  4. Select Manage voice recordings.
  5. Select Delete.

Of course, if you happen to want to splurge on some superfluous luxury item and are looking for a good scapegoat, you can now say, “I don’t know what happened, Alexa went rogue and ordered it!”
