Siri, Alexa Encourage Misogyny, Exemplifies Problem with AI Assistants—UNESCO


A recent report from the United Nations argues that the female voices of AI assistants such as Apple's Siri and Amazon's Alexa, and the words they are programmed to say, not only magnify gender biases but also encourage users to be sexist / Photo by: Stock Catalog via Flickr


Millions of people speak to artificial intelligence voice assistants like Apple's Siri and Amazon's Alexa, which talk back in female-sounding voices. While this may not seem like a big deal, a recent report from the United Nations argues that those voices and the words they are programmed to say not only magnify gender biases, but also encourage users to be sexist.

While it's been years since these programs were launched, Futurism says it's "not too late to change course."

The United Nations Educational, Scientific, and Cultural Organization (UNESCO) spearheaded the report, entitled "I'd blush if I could"—which was Siri's programmed response, dating back to 2011, whenever a user called the AI assistant a "bitch."

According to Futurism, UNESCO argues that programming Siri to meet a degrading comment with such a response exemplifies the problems with current AI assistants.

"Siri’s submissiveness in the face of gender abuse — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products," the authors wrote in the report.

Only after the specialized UN agency shared a draft of its report with Apple in April of this year did the company change Siri's response to "I don’t know how to respond to that."

This willingness to change a years-old program is encouraging, but it amounts to a single phrase said by one assistant. To make a genuine difference, UNESCO said, the tech industry would have to implement more comprehensive changes.

The agency added that a good way to begin these changes would be for companies to hire more female programmers and to stop making female voices the default for their assistants, opting instead for gender-neutral voices.

Saniye Gülser Corat, Director of UNESCO’s Division for Gender Equality, told CBS News that the issue is a "Me Too" moment for the industry. "We have to make sure that the AI we produce and that we use does pay attention to gender equality."