So, you’ve bought a new smart assistant and it sits proudly in the center of your coffee table.
Whether you have an Amazon Echo, a Google Home, or are thinking about buying the new Harman Kardon Cortana speaker, it promises to keep your life much more organized.
But at what cost? What security and privacy threats are you exposed to? Here are five safety pitfalls of using speaker-based smart assistants.
1. Who is listening?
If a Google representative showed up on your doorstep and asked to plant tiny microphones around your house so the company could listen in, most of you would respond with profanity.
And let’s be clear: these speakers are always listening. Of course, they only react when they hear the activation phrase, but they are listening for that phrase all the time. Google has acknowledged that some of this background audio is stored locally, but declined to say for how long.
Could a hacker tune in and listen to everything in your house? Given the recent controversy over hacked baby monitors, the threat seems very real. At the very least, you shouldn’t put the device anywhere you discuss highly sensitive topics.
2. Data storage
The speaker collects your voice data and sends it to the company’s central servers, which process your request. That sounds reasonable, but what happens to the data afterward?
The answers may surprise you. Google Home and Amazon Alexa save audio snippets and tie them to your account; you can replay all your previous requests by logging in. What if someone gains unauthorized access? A lot of personal information could be stored there.
As for Apple, it keeps Siri requests tagged with a device identifier for six months, then strips the identifier and retains the raw audio for another 18 months.
3. Surround sound
The audio snippets that the speaker sends to Google or Amazon contain more than just your queries. By their very nature, these devices also pick up background “ambient” audio.