
Any links to online stores should be assumed to be affiliates. The company or PR agency provides all or most review samples. They have no control over my content, and I provide my honest opinion.

Since Amazon released its Echo smart speaker with the Alexa voice assistant in 2014, having voice assistant products at home has become commonplace. Whether you own an Amazon Echo, Google Home or similar device, the convenience of these smart speakers is undeniable. For $100 or less, you get a personal assistant that plays music or podcasts, manages tasks and answers questions, all at the command of your voice. With the hustle of the modern world, time savers are swiftly welcomed into our culture.

But if your voice-activated device is always ready to listen, what else could it be listening to (and recording)?

In 2019, the news broke that Amazon workers do in fact listen to recordings of what users tell Alexa. Amazon, Apple and Google all have voice-activated technologies, and while they don’t advertise it, all three companies have acknowledged that human reviewers analyze some of your recordings.

Amazon took another blow in 2019, when a German customer was sent 1,700 audio files from another person’s Echo by mistake. The files provided enough information to both name and locate the user and her partner. This mishap was excused by the company as an unfortunate case of human error.

Voice assistants may be helpful, but they’re also nosy. And having your conversations recorded is a definite threat to privacy.

If a hacker were to have access to these files, they could use the information to breach your home PC or laptop. This is a reason to invest in Windows antivirus software, too, since a criminal might be able to implant malware to corrupt or even ransom your files.

As these companies aren’t exactly forthcoming about where your recorded data goes, it’s best to keep as tight a rein on it as possible.

If this eavesdropping freaks you out, thankfully there are a few ways you can protect your information while still keeping your device. Below are some tips for upping your security on a smart speaker.

Delete Your History

When your voice is captured by an Alexa or a Google Home, the recording is stored indefinitely. You can delete these recordings, however, by clearing your history through the device’s platform. With Apple’s Siri, the company claims to analyze and delete the recordings automatically.

  • For Alexa: Go to Settings -> History in the app to delete entries one-by-one, or head to Alexa Privacy Settings on the Amazon website to erase all recordings at once.
  • For Google: Open the menu, then navigate to Delete Activity, where you can choose the date range you’d like to delete.

Turn Off the Mic

While this will disable voice activation, it’s a sure way to plug your device’s ears. This is a good option when you aren’t planning to use your device or are having a particularly personal conversation. The Amazon Echo has a mute button on the top of the device, and Google Homes have one on the side. With both Siri and the Apple HomePod, you can ask the assistant to stop eavesdropping by saying “Hey Siri, stop listening.”

Adjust Your Settings If Possible

There are privacy settings you can opt out of when setting up your device. For example, Google devices ask you to consent to recordings (claiming they improve your voice activation services). You can also update your settings once the smart speaker is in use. In the Alexa app, for instance, tap Alexa Privacy, then Manage How Your Data Improves Alexa, and switch off both Help Develop New Features and Use Messages to Improve Transactions.

With a few extra security measures in place, you can enjoy more privacy on your smart device.
