+5 votes
79 views
in Smart Home by (242k points)
Security of voice assistants: what you should pay attention to!

1 Answer

+3 votes
by (1.6m points)
 
Best answer

This is how voice assistants work
When the voice assistant becomes a risk
Data protection for voice assistants
Conclusion and tips

If you want to learn more about the security of voice assistants, we have important tips and hints for you in our article.


Voice assistants are ubiquitous. Whether Apple's Siri, Amazon's Alexa, Microsoft's Cortana or Google Home, these electronic helpers are finding their way into more and more households. The advantages of the systems are obvious: they simplify access to information and make it easier to control "smart" devices in the household. But what about security?

This is how voice assistants work

Voice assistants replace the manual entry of commands on electronic devices: voice commands are enough to trigger functions or retrieve information. Requests for information are answered faster, and operating technical devices becomes easier. This already works quite well. Alexa, Siri and Co. don't understand everything yet, but they do understand a lot. Voice assistants react to a signal word such as “Hey Siri” or “Okay Google”. After that, commands can be given to the electronic butlers. These commands are analyzed and processed by special software from the manufacturer. In response, information is returned to the device owner, either as a voice message or by performing an action. Such actions include, for example, increasing the volume of a stereo system or ordering items in online shops. But this convenience also has a downside.
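
To make this flow more concrete, here is a minimal sketch in Python. It is not a real assistant SDK: the wake words, the analyze_in_cloud() placeholder and the example commands are assumptions used purely to illustrate the loop described above (listen passively, react to a signal word, hand the command to the manufacturer's software for analysis, then carry out the response).

```python
# Simplified simulation of a voice assistant's command loop.
# All names and behaviors here are illustrative assumptions, not a vendor API.

WAKE_WORDS = {"hey siri", "okay google", "alexa"}


def heard_wake_word(utterance: str) -> bool:
    """The device listens continuously but only reacts to a wake word."""
    return utterance.strip().lower() in WAKE_WORDS


def analyze_in_cloud(command: str) -> dict:
    """Placeholder for the manufacturer's speech-analysis service.

    In a real assistant, the recorded audio would be streamed to the
    vendor's servers at this point and interpreted there.
    """
    if "volume" in command.lower():
        return {"action": "set_volume", "value": 8}
    return {"action": "answer", "text": f"Here is what I found for: {command}"}


def run_assistant(utterances: list[str]) -> None:
    awake = False
    for utterance in utterances:
        if not awake:
            # Passive listening: nothing leaves the device yet.
            awake = heard_wake_word(utterance)
            continue
        # After the wake word, the next utterance is treated as a command.
        response = analyze_in_cloud(utterance)
        if response["action"] == "set_volume":
            print(f"Stereo volume set to {response['value']}")
        else:
            print(response["text"])
        awake = False  # return to passive listening


if __name__ == "__main__":
    run_assistant(["good morning", "okay google", "turn up the volume"])
```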

When the voice assistant becomes a risk

The more harmless kind of risk involves commands that are carried out by mistake, for example when the assistant is activated unintentionally and listens in. Nocturnal police call-outs because of remote-controlled party music or unwanted orders can be the result. Caution is also advised with children in the household. After all, they too have access to the assistants and could listen to content unsuitable for their age or place unwanted orders.

Even everyday things like an open window can become a potential security gap with a voice assistant in the house. After all, voice commands also work when they are called in from outside.

It becomes even more critical when potential security gaps in voice assistants are deliberately targeted by hackers. One possible scenario is the interception of voice commands. These could be broken down into individual parts and, in the worst case, reassembled into harmful commands.

Data protection for voice assistants

Data protection is also a sensitive issue when it comes to voice assistants. They have to listen constantly in order to pick up the corresponding signal word. Some manufacturers assure that no data is sent to their servers until the signal word has been spoken, but such statements are difficult to verify. The German Federal Commissioner for Data Protection, Andrea Voßhoff, has also expressed her concerns.

Once the data ends up on the servers of Microsoft, Apple, Amazon or Google, it usually stays there. It is used for marketing purposes, for example to deliver customized advertising based on individualized user profiles. Google at least lets its users view and even delete the collected data at myactivity.google.com. Amazon also offers the deletion of voice data in the Alexa app.

It is not only the manufacturers of the voice assistants that store data, though. Third-party providers connected to the voice assistants also collect information. In this case, deletion is either not possible at all or far more complicated than in the examples above.

Voice recordings can also be of interest to government agencies. In the United States, Amazon was asked to release Alexa voice recordings as part of a murder investigation. In the future, it is even conceivable that smart home technology could be used for surveillance. Aside from such Orwellian dystopias, criminals could also gain access to the relevant data; compared to that, the mere violation of privacy quickly becomes a minor matter.

Conclusion and tips

Voice assistants are one of the biggest trends in consumer electronics right now. Especially in connection with other smart devices, they can be a noticeable gain in convenience. In addition, the new technology is of course fun for many users. As with any other Internet-related application, the downside is potential security loopholes. Finally, we briefly summarize the most important safety tips for you:

  • In general, the more often Alexa, Siri and Co. are used and the more diverse the areas of application, the more data is in circulation. The larger the amount of data, the greater the likelihood that sensitive data is included as well. So only use the voice assistant for tasks that really become more convenient and easier thanks to voice control.
  • Do not say sensitive information such as credit card or bank account numbers out loud while a voice assistant is active.
  • Switch off the voice assistant when you are not at home (here we explain how you can switch off the Amazon Echo) and do not place the devices next to open windows to prevent unauthorized access or misuse.
  • Don't leave children in the household alone with the electronic butlers and secure orders with a code.
  • If possible, delete data that should not be stored on external servers.

...