If you are nervous about Amazon Alexa security, you are right to be. Amazon's human workers are listening to recorded conversations to improve Alexa's responses, says Bloomberg in a new report.
Now we as techies know the term 'Amazon Alexa security' is probably an oxymoron. Amazon transcribes all voice to text to enable Alexa to respond to a question or command. You can see that text if you access your account, and you can delete it as well (which we recommend you do frequently).
What you don’t know is that it also puts keywords (what interests you) in your shadow profile – err shopping profile.
The Bloomberg report found that thousands of Amazon staff listen to selected conversations, which are transcribed and annotated with extra information. This is fed back into the software to close gaps in Alexa's understanding of human speech.
Amazon claims employees don't have direct access to discover who they are listening to. But in making that statement it tacitly admits they do listen!
The big issue is that Amazon never told Amazon Alexa users that humans could listen to their conversations – some of which have distressed its staff, while others become comedic and are shared around, like the poor lady who was singing badly in the shower.
Amazon states: “No audio is stored or sent to the cloud unless the device detects the wake word (or Alexa is activated by pressing a button).” ‘No audio’ forgets to mention that it will be translated to text and fed into the Amazon AI machine to target you with highly customised advertising.
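Amazon's wake-word claim amounts to a simple gate: audio is analysed on the device and nothing is forwarded to the cloud until the wake word is detected. A minimal Python sketch of that behaviour (the function name and the string-matching "detector" are purely illustrative assumptions – real devices use an on-device acoustic model, not text matching):

```python
# Illustrative sketch of wake-word gating, NOT Amazon's implementation.
WAKE_WORD = "alexa"

def process_audio(frames):
    """Buffer audio locally; forward frames to the cloud only after the
    wake word is detected on-device."""
    uploaded = []
    awake = False
    for frame in frames:
        if not awake:
            # On-device detection only -- nothing leaves the device here.
            if WAKE_WORD in frame.lower():
                awake = True
        else:
            uploaded.append(frame)  # would be sent for cloud transcription
    return uploaded

print(process_audio(["music playing", "Alexa", "what time is it"]))
# -> ['what time is it']
```

The point of the sketch is the asymmetry the article describes: everything before the wake word stays local, while everything after it is uploaded, transcribed – and, per the Bloomberg report, potentially reviewed by humans.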
It continues: “Alexa is designed to get smarter every day. The more you use Alexa, the more the service adapts to your speech patterns, vocabulary, and personal preferences. For example, we use your requests to Alexa to train our speech recognition and natural language understanding systems. The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from a diverse range of customers helps ensure Alexa works well for everyone.”
Do you see any reference to Amazon staff listening to your conversations?
GadgetGuy’s take: Privacy should not be a casualty of convenience
This shocking news comes on top of frequent reports that Amazon Alexa has sent someone’s conversation to total strangers and that Amazon Alexa is continuously listening, err recording. Tech analysts say that this is because Amazon Alexa does not require a dedicated hardware chip to wake the device and start recording – unlike some other voice assistants.
Loup Ventures shows the inherent bias/pedigree with this question: “How much would a manicure cost?”

Alexa: “The top search result for a manicure is Beurer Electric Manicure & Pedicure Kit. It’s $59 on Amazon. Want to buy it?”
Google Assistant: “On average, a basic manicure will cost you about $20. However, special types of manicures like acrylic, gel, shellac, and no-chip range from about $20 to $50 in price, depending on the salon.”
Amazon wishes to create a full “360-degree view” of a customer; they want to know everything there is to know about what each customer buys online so that they can target every individual with more relevant advertisements and recommendations.
Safety.com has an excellent article on Amazon Alexa safety and some simple steps to enhance privacy.
Amazon responds to our article: “We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone. We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. While all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it, customers can delete their voice recordings associated with their account at any time.” – Amazon Spokesperson