Digital sexism: Why are all virtual assistants women?


Why are virtual assistants from Amazon, Google and Apple all “female”?

Technically speaking, of course, virtual assistants don’t have genders. But they do have names and voices that suggest to users they are more “female” than “male,” a characterization that reinforces some of the worst gender stereotypes in our society.

Amy Ingram, an AI personal assistant made by x.ai, is a popular corporate version. If you carbon copy Amy on an email, she will help schedule a meeting: she introduces herself to the recipient as your personal assistant, suggests times when you’re available and follows up with a calendar invitation to confirm.

While Amy has entered the workforce, Amazon’s bot, Alexa, is primarily a home assistant. You can call out to her to set timers when making dinner, and have her reorder detergent when you realize you’ve run out. She can dim the lights at night, or read you the news headlines in the morning. Alexa’s appeal is that she’s always a shout away: all you need to do is ask.

‘Amy’ or ‘Andrew’

Some companies do give users the option of switching to a male voice or persona: Amy Ingram’s first name can be changed to “Andrew,” and as of iOS 7, you can have Apple’s Siri talk to you in a male voice. But across the board, the default setting for chatbots and virtual assistants is predominantly female. Even OK Google, which doesn’t have a humanized name, has a female voice.

Companies cite all sorts of research in their decision to make bots female. They claim we take orders better from women, and that people have shown a preference for female voices in automated systems. Clifford Nass, the late co-author of Wired for Speech, a widely cited book on the topic, argued that male voices are perceived as being more authoritative, whereas female voices are perceived to be more helpful and supportive. These preconceptions carry over into synthetic voices, which essentially means that even computers are gender stereotyped.

Designed to be subservient

Why does this matter? Because it reinforces a power dynamic that we simply can’t overlook: virtual assistants are designed to be subservient, and creators send a clear message by making them all “female.” This is especially troublesome when you consider that the bulk of early adopters of these tools are men; early surveys show that 60 per cent of owners of Amazon Echo, which runs the Alexa voice service, are male, affluent and middle-aged. Their preferences will surely continue to shape these powerful tools as they become adopted more broadly.


We assume that users want “female” assistants. (iStockphoto)

The irony is that as we watch shows like Mad Men and quietly congratulate ourselves on how far we’ve evolved as a society, our most cutting-edge consumer technology is a throwback: one where our “secretaries,” “sous-chefs” and “housekeepers” are women. And by falling into the trap of assuming users want “female” subordinates, instead of challenging their biases, the industry that has appointed itself to invent the future is perpetuating outdated gender norms.

Bots are getting smarter, and soon, they’ll be everywhere. Now is the time to pay attention to these design choices, to make sure that this new trend in tech really is a giant leap forward, not two steps back.
