If you are an author, and you have a listening device in your home, be careful what you read aloud. A government listener might not understand that you are reading a work of fiction.
For that matter, an artificial intelligence device designed to analyse your voice and extrapolate your mood, your frame of mind, and whether you are vulnerable to a sales pitch for catheters, chocolates or condoms (or live ammunition) might bombard you with targeted advertisements.
It's not illegal to identify someone who might be suckered into binge shopping.
Your mood is not protected by privacy laws.
What if Alexa gets it wrong (for instance, while you are reading aloud the darkest moments of a fictional heroine) and reports to the government that you are suicidal and a danger to yourself and to society? You might not get that gun permit. You might suddenly find that the local pharmacy will not allow you to fill a prescription for strong painkillers for your sick parent.
Belle Lin, writing for The Intercept, shares a lot of scary information.
This targeted advertising might be inherently problematic. Perhaps landlords could use it to make sure that their high-end condo properties are advertised only to highly educated, natural blondes with Elizabeth Hurley accents.
If the bot that filters and triages your phone call to your bank or brokerage house tries to bully you into signing up for the ease and security of "voice recognition", think twice.
Once your voice is in a database, law enforcement can get it, too. Your voiceprint can be matched with any other conversation "you" might have anywhere at all.
This week, Apple realized that some app developers were able to capture, through screen-reading features in their apps, a lot more information than they ought to have had.
Legal blogger Haim Ravia summarizes the month's top privacy news for the law firm Pearl Cohen Zedek Latzer Baratz, touching on espionage by smartphone, whether law enforcement may compel suspects to use biometrics to unlock private smartphones, and the news that amusement parks can be successfully sued for collecting fingerprints.
The crux of the problem with collecting biometric data without permission, and perhaps with secretly recording people in their own homes, is the Fifth Amendment (an American citizen's right not to incriminate himself or herself).
On the other hand, anything you say when speaking to Alexa seems to count as talking to Amazon, and it is not protected if Amazon (as one party to the conversation) elects to reveal what you said to it.
All the best,
Rowena Cherry