Are the female voices behind Apple's Siri and Amazon's Alexa amplifying gender bias around the world?
The United Nations thinks so.
A report released Wednesday by UNESCO, the UN's culture and science organisation, raises concerns about what it describes as the "hardwired subservience" built into default female-voiced assistants operated by Apple, Amazon, Google, and Microsoft.
The report, titled "I'd Blush If I Could," takes its name from a response Apple's Siri gives after hearing sexist insults from users. In a section called "The Rise of Gendered AI and Its Troubling Implications," it argues that it is a problem that millions of people are growing accustomed to commanding female-voiced assistants that are "servile, obedient and unfailingly polite," even when confronted with harassment from humans.
The report describes itself as shining "a critical light on the sudden proliferation of digital assistants gendered as female. It looks most closely at voice assistants such as Amazon's Alexa and Apple's Siri technology, as well as, to a lesser extent, chatbots and virtual agents projected as women."
The agency recommends that tech companies stop making digital assistants female by default and program them to discourage gender-based insults and abusive language.
"This publication was prepared by UNESCO for the EQUALS Skills Coalition, one of three coalitions that comprise the EQUALS partnership. EQUALS is a global partnership of governments and organisations dedicated to promoting gender balance in the technology sector by championing equality of access, skills and leadership for women and men alike," the UN report notes.