A UN report has found female AI assistants are reinforcing gender stereotypes and promoting gender-based verbal abuse.
Be it Apple's Siri, Amazon's Alexa, Microsoft's Cortana, or Google's Assistant, the vast majority of automated assistants have a female voice.
While voice-command technology may be the way of the future, UNESCO said it promotes an image of women from the dark ages to the hundreds of millions of people using the technology. "It sends a signal that women are obliging, docile, and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’," the report states.
"As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically."
The report found voice-command software promotes acceptance of sexual harassment and verbal abuse through the way automated assistants respond to their users. For example, when a user calls Siri a "b---h", she responds "I'd blush if I could".
Given the same insult, Alexa replies "Well, thanks for the feedback".
The report also concluded AI technology "makes women the face of glitches and errors" and forces the female personality to defer questions to a higher "and often male" authority.
UNESCO's director of gender equality Saniye Gülser Corat said this "hardwired subservience" was showing people how to speak to women and teaching women how to respond.
"Obedient and obliging machines that pretend to be women are entering our homes, cars and offices," Ms Corat said.
"To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”
The UN is now calling on governments and tech giants to stop making digital assistants default to female voices and explore the possibility of developing a "neutral machine gender" that sounds neither male nor female.