Is Siri sexist? UN cautions against biased voice assistants

The agency recommends tech companies stop making digital assistants female by default and program them to discourage gender-based insults and abusive language.

23 May 2019 | 11:00

Source: Associated Press


This photo shows Apple's Siri and Amazon's Alexa. (AFP Photo)

NEW YORK: Are the female voices behind Apple’s Siri and Amazon’s Alexa amplifying gender bias around the world?

The United Nations thinks so.

A report released Wednesday by UNESCO, the UN’s culture and science organization, raises concerns about what it describes as the “hardwired subservience” built into the default female-voiced assistants operated by Apple, Amazon, Google and Microsoft.

The report, titled “I’d Blush If I Could,” takes its name from an answer Siri gives after hearing sexist insults from users. It argues it is a problem that millions of people are growing accustomed to commanding female-voiced assistants that are “servile, obedient and unfailingly polite,” even when confronted with harassment.

