While the smart speakers in our homes make our lives much easier when it comes to finding information without reaching for a phone or computer, they can also become a serious security hole if we neglect them, and a recent study makes this very clear.
It is likely that at some point your smart speaker has heard a word or phrase it should not have, and that can be a serious privacy problem, and a breach of your security, if a human engineer on the other side of the server has heard it too.
As you may know, there is little mystery to how smart speakers work: the device first waits for an activation word or phrase, and from then on we ask our question, which travels to the servers and through the assistant's automatic algorithm, and from there we receive an answer. The problem is that, in order to refine this algorithm, those questions often also reach the ears of human engineers.
Now a study by Ruhr University Bochum and the Max Planck Institute for Security and Privacy has found more than 1,000 words and phrases that the assistants Alexa, Siri and Google Assistant often mistakenly identify as activation commands, something known as false positives.
An activation command is what makes our smart speaker or virtual assistant "wake up and actively listen" in order to answer our questions; you surely know phrases like "Hey Google", "Hello Cortana" or "Hey Siri", among others.
What this study points out is that words similar to these activation commands get confused by the device, which makes it activate without us noticing and potentially record whatever we say next.
According to the study, these are some of the false positives:
- Alexa: “unacceptable,” “election” and “a letter”
- Google Home: “OK, cool,” and “Okay, who is reading”
- Siri: “a city” and “hey jerry”
- Microsoft Cortana: “Montana”
For example, imagine that instead of saying "Cortana" you say "Montana"; the device switches into listening mode, and your next phrase is the PIN of your bank card to complete a reservation in Montana. That information then reaches the device's servers, with a small but real chance of being heard by an engineer.
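To see why "Montana" can trip a speaker waiting for "Cortana", here is a minimal toy sketch of a loose wake-word matcher. Real assistants compare acoustic features, not spelling, so this uses string similarity only as a rough stand-in; the function name, the 0.7 threshold and the examples are illustrative assumptions, not how any actual assistant works.

```python
from difflib import SequenceMatcher

def is_wake_word(heard: str, wake_word: str, threshold: float = 0.7) -> bool:
    """Toy matcher: treat any utterance whose spelling is 'close enough'
    to the wake word as an activation (a stand-in for acoustic similarity)."""
    return SequenceMatcher(None, heard.lower(), wake_word.lower()).ratio() >= threshold

# "Montana" shares most of its letters with "Cortana", so a loose
# matcher fires on it: a false positive.
print(is_wake_word("Montana", "Cortana"))  # True
print(is_wake_word("hello", "Cortana"))    # False
```

The tension the study describes is visible in the threshold: lower it and the speaker misses genuine commands, raise it and everyday words start waking the device, which is why the recommendations below focus on choosing less confusable phrases and reducing sensitivity.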
That is why the study offers a series of recommendations to help prevent these failures of the activation commands:
- Change your smart speaker's activation word or phrase whenever possible, choosing one that is less prone to confusion.
- In the case of Google Home, there is an option to reduce the activation sensitivity.
- Certain devices also let you mute the microphone.
- Don't keep your smart speakers permanently switched on if you're not really using them regularly.
- Delete all stored recordings and review the privacy settings in each of these devices' accounts so that audio is never saved, listened to or shared with engineers.
In this way, by using the tools these speakers put at our disposal along with a little common sense, we can avoid one of the most common mistakes still being made with these smart speakers.