So, you’ve bought a new smart assistant and it sits proudly in the center of your coffee table.
Whether you have an Amazon Echo, a Google Home, or are thinking about buying a new Cortana-powered Harman Kardon speaker, it promises to make your life much more organized.
But at what cost? What security and privacy threats are you exposed to? Here are five safety pitfalls of using speaker-based smart assistants.
1. Who is listening?
If a Google representative showed up on your doorstep and asked to scatter tiny microphones around your house so the company could listen in, I think most of you would answer with profanity.
So what is so special about the new class of speaker-based smart assistants? Buying one amounts to deliberately inviting a giant corporation into your personal space. It's remarkable how quickly public opinion changes: in early 2015 there was an outcry over smart TVs listening in on living-room conversations. Today, we are asking companies to listen.

And let's be clear: these speakers are always listening. They only react when they hear the activation phrase, but they are listening for that phrase all the time. Google has acknowledged that some of this background audio is stored locally on the device, but declined to say for how long.
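To make the always-listening point concrete, here is a rough sketch of how a wake-word loop of this kind generally works. It is only an illustration: the function names, buffer size, and activation phrase are all invented for this example, text strings stand in for audio, and nothing here is any vendor's real API.

```python
# Illustrative sketch: the device keeps a short rolling buffer of everything it
# hears, but nothing is sent anywhere until the activation phrase is detected.
import collections

WAKE_PHRASE = "ok assistant"   # stand-in activation phrase
BUFFER_SIZE = 5                # how many recent snippets are held on the device

def send_to_cloud(snippets):
    # In a real speaker, this is the moment audio leaves your house.
    print("UPLOADED:", " | ".join(snippets))

def listen(stream):
    buffer = collections.deque(maxlen=BUFFER_SIZE)
    for snippet in stream:               # the microphone never stops
        buffer.append(snippet)           # background audio is held locally
        if WAKE_PHRASE in snippet.lower():
            send_to_cloud(list(buffer))  # only now is anything uploaded

# Simulated ambient audio, with one activation in the middle.
listen(["dinner at seven?", "ok assistant, set a timer", "did you feed the cat?"])
```

The point of the sketch is the order of operations: everything is heard and buffered locally, and the upload only happens after the wake phrase, which is exactly why the local buffer and its retention matter.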
Could a hacker tune in and listen to everything in your house? Given the recent controversies over hacked baby monitors, the threat seems very real. At the very least, you shouldn't put the device anywhere you discuss highly sensitive topics.
2. Data storage
At least the devices aren't broadcasting all of your conversations (yet), so let's set the always-listening issue aside. What happens when you actively interact with the assistant?
The speaker records your request and sends it to the company's central servers, which process it. That sounds reasonable, but how is that data stored?
The answers may surprise you. Google Home and Amazon Alexa save the audio snippets and tie them to your account; log in and you can replay every request you have ever made. What happens if someone gains unauthorized access? There could be a lot of personal information sitting there.

At least you can delete those recordings. But you can't do anything about the aggregated data on Google's or Amazon's servers; the companies use it to improve their assistants, and the devices check in with those servers several times a day.
As for Apple, it keeps Siri requests tied to a device identifier for six months; after that the identifier is stripped, but the raw audio can be retained for up to another 18 months.
3. Surround sound
The audio snippets the speaker sends to Google or Amazon contain more than just your queries. By their very nature, these devices also pick up background "ambient" audio.