Smart home, smart user?
Smart assistants are man’s new best friend (sorry, Fido). But they should be treated as your most gossipy friend: sure, you get on, but there’s still a risk they’ll sell you out! Yes, Alexa makes life easier, but she’s also another link in a chain that hackers can exploit.
Hackers aren’t the only risk. Do you know what your smart assistant’s data-sharing settings are? Did you know Google employees might listen to your audio requests?
Here are a few quick tips to make sure you can trust your new best friend.
Your Google smart assistant data is linked to your Google account. From your phone to your smart speaker, everything is logged.
So if someone hacks your Google account, they can see all the data connected to your Google Assistant.
To see the data your Google Assistant logs:
- Log in to Google on desktop.
- > ‘Data & personalization’.
- > ‘Manage your activity controls’.
- > ‘Manage activity’ under ‘Web & App Activity’.
- > ‘Item view’ > ‘Filter by date & product’.
- Tick ‘Assistant’ and ‘Voice and Audio’.
You can select the data you want to delete. Or you can set Google to automatically erase data from your account every three months (or longer). Just go to the ‘Auto-delete’ option at the top of the ‘Manage activity’ page.
Have kids, or a business with lots of people using a Google Assistant? You might want to keep tabs on the audio request data for your smart assistants. You can opt to have Google record this data:
- Go to ‘Data & personalization’.
- Choose ‘Manage your activity controls’.
- Look at the ‘Web & App Activity’ options.
- Tick ‘Include audio recordings’.
To play saved audio, click the three dots next to a recording in the list, then select ‘Details’ and ‘View recording’ to play it back.
Google saves this data to help the Assistant learn your voice and tailor recommendations to you. But Google has also said that human reviewers listen to a small percentage (around 0.2%) of snippets for quality control. Either way, it’s creepy!
Siri and Apple’s smart assistants.
Apple is a bit better with your data security. They don’t link your Siri data to your Apple ID, but it is kept on Apple’s servers.
Your Siri voice requests and data are assigned to a sort of fake identity for six months. After that, Apple says they keep the data for up to two years, detached from that identity. They say this is so they can improve Siri’s voice recognition with a bank of different voices and accents. So basically, they keep your audio and data, but it’s anonymised.
You can opt out of having your Siri recordings listened to by Apple employees for quality control purposes.
From your iPhone:
- Open ‘Settings’ > ‘Privacy & Security’.
- Then go to ‘Analytics & Improvements’.
- Then open ‘Improve Siri & Dictation’ and toggle it off to opt out.
You can also erase your Siri data stored on Apple’s servers:
- Open ‘Settings’ and select ‘Siri & Search’.
- Then select ‘Siri & Dictation History’
- Then choose ‘Delete Siri & Dictation History’
You will need to repeat these steps on every Apple device, as they aren’t linked. The process is more or less the same for Macs, iPads and Apple Watches.
If you don’t want anyone else to be able to use or access your HomePod through voice commands:
Go to the Home app > ‘Recognize My Voice’ and opt in.
This means Siri will only respond to your voice on your Apple smart assistant.
Amazon and Alexa.
Amazon Alexa devices all have physical switches to mute the microphone or camera, and Alexa only records and reacts to voice commands after you say the wake word (‘Alexa’). However, Amazon also saves your voice recordings by default.
You can set Amazon to auto-delete recordings after a set period of time, or stop them being saved completely.
Just go to the Alexa app on your phone:
- Tap ‘More’
- Select ‘Settings’, then ‘Alexa Privacy’.
- Open ‘Manage Your Alexa Data’
- Then you can choose your recording settings.
To delete all your data, go back to ‘Alexa Privacy’:
- Select ‘Review Voice History’.
- You’ll see a list of your recorded audio with the date, time and device it was captured on.
- You can delete individual recordings or select ‘Delete all of my recordings’.