Bloomberg reports that there are actually thousands of Amazon employees worldwide whose job is to listen to recordings of what people say to their Alexa speakers.

The reviewers know your account number, device serial number, and first name

AI is still a young technology, so it's somewhat understandable that Amazon employs these reviewers to teach the AI to better catch the nuances of what people tell Alexa. What's troubling is that Amazon doesn't clearly disclose this on its marketing or FAQ pages.

The amount of information Amazon reveals to the reviewers doesn't sit right with us, either. Although your full name and address won't be shared, reviewers still have access to your first name, account number, and device serial number.

You can’t really opt out, either

You can opt out in the Alexa app by not allowing Amazon to use your voice recordings to develop new features, but the company says that even recordings from people who opt out may still be analyzed by its reviewers as part of the process.

Amazon’s “Dark Pattern”

Amazon's decision not to explicitly tell customers about this human review process is a prime example of "Dark Pattern" UX design.

Simply put, “Dark Pattern” design is the practice of using deceptive tricks and nudges to get more personal data out of you.

Those seemingly never-ending paragraphs of disclaimers in tiny fonts that you scroll through and click "yes" to? That's Dark Pattern design.
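To make the idea concrete, here's a minimal, purely hypothetical sketch of what a Dark Pattern consent prompt looks like under the hood. Nothing here is Amazon's actual code; all names, labels, and copy are invented for illustration. The tricks shown are the classics: opt-ins pre-checked, vague benefit-framed labels, and a decline path styled so you barely notice it exists.

```typescript
// Hypothetical sketch of a "dark pattern" consent prompt. Not real product code;
// every identifier and label here is invented for illustration.

interface ConsentOption {
  id: string;
  label: string;
  checkedByDefault: boolean; // pre-checking opt-ins is the classic dark pattern
}

const options: ConsentOption[] = [
  {
    id: "share-recordings",
    // Vague, benefit-framed copy that hides what is actually being agreed to
    label: "Help improve your experience",
    checkedByDefault: true, // the user must notice and untick this to opt out
  },
  {
    id: "marketing-email",
    label: "Receive occasional updates",
    checkedByDefault: true,
  },
];

// Render the prompt: agreeing is one big friendly button, while declining
// is demoted to a low-contrast text link that is easy to overlook.
function renderPrompt(opts: ConsentOption[]): string {
  const checkboxes = opts
    .map(
      (o) =>
        `<label><input type="checkbox" id="${o.id}"` +
        `${o.checkedByDefault ? " checked" : ""}> ${o.label}</label>`
    )
    .join("\n");
  return [
    checkboxes,
    `<button class="big-green">Agree and continue</button>`,
    // The opt-out path exists, but the styling makes sure you miss it
    `<a class="tiny-grey-link" href="#">manage my choices</a>`,
  ].join("\n");
}

console.log(renderPrompt(options));
```

Every choice in that sketch nudges you toward handing over data: the defaults do the consenting for you, the wording never says "share your recordings," and saying no takes more effort than saying yes.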

A new law to ban Dark Pattern design may be on the way

On Tuesday, a bipartisan Senate bill that would bar big tech companies from using Dark Pattern design was introduced, following continued pressure from consumer rights groups (we're rooting for you guys).

The bill also aims to ban UI designs that encourage "compulsive usage" in children under the age of 13, and to block online platforms from running "behavioral experiments" without informed user consent.
