Amazon’s Alexa is recording you every time it thinks it hears a “wake” word
Hey Alexa, butt out.
We all love our smart speakers, displays, and thermostats, right? They’re the first step towards the omniscient, omnipotent AIs from science fiction shows like Star Trek. You’d be forgiven for thinking that the market-leading voice assistant, Amazon’s Alexa, would be leading the pack on user privacy as well. That doesn’t appear to be the case, with multiple reports shining a light on some haphazard data protection policies.
With how ubiquitous the various voice assistants are, we tend to forget that convenience comes at a cost. In this case? It’s our privacy, with the companies behind the AI using our voice recordings for various other tasks to “improve their services.” The team behind Alexa also uses our recordings in this manner, although it isn’t as cut-and-dried as Amazon would like to make out.
The Washington Post has a recent report going into more detail, outlining a pattern of issues with how Alexa handles our voice recordings. Amazon has confirmed parts of the reporting to KnowTechie, namely that Alexa, on whatever device she’s installed, records audio. I mean, it goes without saying that a voice assistant needs to record us in order to function, right?
Sure, but Amazon insists that Alexa only records when the wake word (Alexa by default) is said
That doesn’t seem to be the case, with a different Washington Post report saying that “the Echo records a second-long snippet of ambient sound which it ‘constantly discards and replaces,’ until a wake word starts the recording process.” That means if Alexa mishears something that sounds like “Alexa,” she might keep on recording, with potentially disastrous effects.
Those longer recordings are also sent to different sites worldwide, where Amazon auditors are authorized to listen in to train the AI behind Alexa. Amazon emailed us a statement (below) which also went on to say that Alexa customers have complete control over listening to, deleting, or reviewing the voice recordings associated with their account, either in the Alexa app or at www.amazon.com/alexaprivacy.
“Alexa is always getting smarter, which is only possible by training her with voice recordings to better understand requests, provide more accurate responses, and personalize the customer experience. Training Alexa with voice recordings from a diverse range of customers helps ensure Alexa works well for everyone.”
The WaPo article referenced earlier seems to have missed the mark on one point, namely its claim that there’s no way to opt out of having your Alexa recordings used in this way. You can set your Alexa devices to not use your voice recordings to develop new features or improve transcription accuracy, both functions handled by the Amazon auditing teams.
Here’s how to stop sending your voice recordings to Amazon:
- Open the Alexa app on your phone.
- Tap the menu button -> Settings on the top left of the screen.
- Select Alexa Account.
- Choose Alexa Privacy.
- Select Manage how your data improves Alexa.
- Turn off the toggle next to Help Develop New Features.
- Turn off the toggle next to your name under Use Messages to Improve Transcriptions.
That puts Alexa in the same camp as the other big voice assistants, with a way to keep your recordings yours. Google’s Assistant lets you ban it from saving commands after they’ve been processed and used, and Apple’s Siri is content to keep the conversation between your iPhone and smart speaker. At least, in theory. All three voice assistants have human “auditors,” in what seems to be a standard practice in the industry.