Hey Siri, you're sexist, finds U.N. report on gendered technology - Reuters
AI assistants like Siri and Alexa are perpetuating sexist stereotypes, UN says - CNN
I don't have any of these "personal assistants" myself, and to be honest, I've never been all that impressed with so-called "voice recognition software."
It's especially annoying over the phone when they don't give you the option to press any buttons and you have to speak to the computer. I ask to be connected to "neurology," and they connect me to "urology." Both are necessary, but hardly interchangeable.
But it is a fair question as to why all these programs always use female voices. They should use the voice of an old man with a strong New York accent to liven things up. If someone yells or curses at the thing, it would respond as a typical New Yorker would.
Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.
The report by UNESCO warns of the negative consequences of the personal assistants, claiming they perpetuate the idea that "women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command."
It also highlighted the passive and polite responses the assistants give when users make sexually abusive remarks, warning that their algorithms are reinforcing sexist tropes.
"What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted 'boys will be boys' attitude."
Hundreds of millions of people use personal assistants, and the four main offerings — Apple's (AAPL) Siri, Amazon (AMZN) Alexa, Microsoft's (MSFT) Cortana and Google (GOOGL) Assistant — are all voiced by women as a default setting.
The report was named "I'd Blush If I Could," which is the response Siri once gave when users said "You're a ****."
"Siri responded provocatively to requests for sexual favours by men ('Oooh!'; 'Now, now'; 'I'd blush if I could'; or 'Your language!'), but less provocatively to sexual requests from women ('That's not nice' or 'I'm not THAT kind of personal assistant')," it found.
"Their passivity, especially in the face of explicit abuse, reinforces sexist tropes," it said.