Voice Search and Alexa Privacy

Artificial intelligence is entwined with our daily lives. What was once science fiction is now a very normal part of our day. Alexa wakes us up in the morning, reads us the weather, plays our music, helps us cook dinner, and even locks our doors at night. For many users, she’s a part of the family.

Amazon has sold us on the idea of home automation. “Instantly connect to Alexa to play music, control your smart home, and get information, news, weather, and more using just your voice.” The thing is, millions of users are sacrificing privacy for convenience.

As it turns out, thousands of Amazon AI specialists around the world are listening in on your Alexa device—voice search queries, conversations, and more. This information is buried deep within the product and service terms. And let’s face it—nobody is reading that. She’s always listening, and up until recently, we didn’t know. But is it as unsettling as it sounds?

Alexa’s voice search functionality is entirely based on Artificial Intelligence (AI). Although the Alexa assistant is already intelligent, she isn’t exactly intuitive or conversational in every language. That’s where the eavesdropping comes in.

AI needs humans to become smarter when it comes to speech patterns and recognition. We have to remember that voice search has a lot of complexities and nuances. Languages, tones, accents, slang, and background noises differ from user to user. So, Amazon needs a massive pool of data to work with.

In a statement to Fortune, an Amazon spokesperson said the company uses “an extremely small number of interactions from a random set of customers” for voice search studies. That being said, more than 100 million Alexa devices have been sold, so even a small sample could include millions of users.

Amazon’s AI specialists collect data on natural speech patterns from users all over the world. They feed the Alexa AI system raw data from voice searches, and as a result, she becomes more familiar with the speech. This is known as supervised learning—humans help machines learn about variables so they can calculate an output. In this case, AI specialists help Alexa learn about human communication so she can give more accurate search results.
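The supervised-learning idea can be sketched in a few lines of Python. This toy intent classifier is purely illustrative (the transcripts, labels, and word-count scoring are invented for this example, not Amazon’s actual pipeline): human annotators supply the “correct answers,” and the system uses those labeled examples to map a new utterance to an output.

```python
# Minimal supervised-learning sketch: human-labeled transcripts (hypothetical)
# train a toy intent classifier.
from collections import Counter, defaultdict

# Human annotators supply the labels: transcript -> intent.
labeled_data = [
    ("play some jazz", "play_music"),
    ("play my morning playlist", "play_music"),
    ("what's the weather today", "get_weather"),
    ("will it rain tomorrow", "get_weather"),
    ("lock the front door", "lock_door"),
]

# "Training": count how often each word appears under each label.
word_counts = defaultdict(Counter)
for transcript, label in labeled_data:
    for word in transcript.split():
        word_counts[label][word] += 1

def predict(transcript):
    """Score each label by how many of its training words appear."""
    scores = {
        label: sum(counts[w] for w in transcript.split())
        for label, counts in word_counts.items()
    }
    return max(scores, key=scores.get)

print(predict("play the jazz station"))  # -> play_music
print(predict("is it going to rain"))    # -> get_weather
```

Real speech systems work on audio features and far larger models, but the shape is the same: the more labeled human speech the annotators feed in, the better the system handles new utterances.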

Is Alexa’s data collection an invasion of privacy?

Well, sort of. On the one hand, voice search customers are continuously being recorded. On the other hand, while Amazon’s reviewers may be listening to you sing badly in the shower, they won’t know it’s you. Voice recordings remain anonymous to the employees who are listening in.

That means Alexa’s data collection isn’t much more intrusive than typing something into Google. Whether you like it or not, Google catalogs everything. We’re talking about billions of searches each day from all across the globe.

Why? Because your search data has refined Google’s algorithm and improved the overall search experience. There’s a definite benefit to recording this type of data, and it’s exactly what Amazon’s Alexa team is trying to do with voice search.

Still, Alexa’s voice search sparks a new kind of vulnerability because most customers don’t realize they’re being recorded. According to The Verge, “Amazon has downplayed the privacy implications of having cameras and microphones in millions of homes around the globe.” But to put things into perspective, we’re always surrounded by microphones and cameras.

Think about it. Our phones, computers, tablets, car assistants, and video game consoles all have audio/visual capabilities. Gmail scans your emails to deliver targeted ads. Facebook and Instagram track your browsing history to serve relevant posts. You’re being monitored whether you like it or not.

While this isn’t necessarily a Big Brother situation, it’s safe to assume that your search data has been recorded somehow, somewhere. Siri and Google Assistant already use the supervised learning method to improve their voice search results. As these commercial AI systems continue to grow, they will likely shift to semi-supervised learning, where human involvement is minimal.

So sure, in some ways, Alexa is currently infringing on your privacy. But not any more than the rest of the voice search technology we surround ourselves with. We have welcomed cameras and microphones into our homes regardless of what type of device we own. It just so happens that Alexa users have unwittingly enrolled themselves in a research program.

Amazon’s Alexa Policy

“We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone. We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it.”

To opt out of being an AI trainer:

  1. In the Amazon Alexa app, tap the menu button in the upper left corner of the screen.
  2. Then select “Alexa Account” and “Alexa Privacy.”
  3. Choose “Manage how your data improves Alexa.”
  4. Next, toggle off the switches next to “Help Develop New Features” and “Use Messages to Improve Transcriptions.”

These settings will keep Amazon from using your raw recordings to train its software.