Social United Nations: "AI's female voice assistants fuel sexist gender stereotypes"

Sounds like a UNESCO study.

But wait international government liberal plot against teh man omgz!
 
So true.

I know a guy who switches to the male voice before asking for directions. Just to be safe.
 
Teamspeak on PC lets you choose the program's voice as male or female. So woke.

The community even made a song lol

 
Just in case anyone is wondering what kind of important work is being done over at the U.N. headquarters these days to ensure world peace.
-----

United Nations: AI voice assistants fuel sexist gender stereotypes
By Marisa Dellatto | May 22, 2019


Hey Siri, is this sexist?

A new study from the United Nations says that female AI assistants reinforce harmful gender stereotypes.

Apple’s Siri, Google’s Google Assistant, Microsoft’s Cortana and Amazon’s Alexa all come with female voices, though most offer the choice of switching to a male voice setting. The UN report states that when asked, these technologies say they are genderless — but they clearly have female names and voices.

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it,” the report reads. “It honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Many tech companies have chosen female voices over male ones because women are seen as “helpful,” while men’s voices are seen as “authoritative.” The report says that tech companies are just giving their users what they prefer, but by doing so they are further enhancing these prevalent sexist ideals.

It doesn’t help that AI developers are mostly male. The lack of women in the field creates an inherent bias within the technology, the UN reports.

Another glaring issue with female voices is how they respond to blatant sexual harassment, the study says. Many of the inquiries thrown at these AI assistants are of a sexual nature. But because the responses are scripted by a majority-male staff, the answers are met with flirty, fun jabs instead of shutdowns, reinforcing the idea that these statements are part of everyday conversation and not forms of harassment.

“When asked, ‘Who’s your daddy?’, Siri answered, ‘You are’. When a user proposed marriage to Alexa, it said, ‘Sorry, I’m not the marrying type’. If asked on a date, Alexa responded, ‘Let’s just be friends’. Similarly, Cortana met come-ons with one-liners like ‘Of all the questions you could have asked…,’ ” the report explains.

As a solution, the paper recommends tech giants focus on removing these feminine tropes from their work and striving to build a “genderless” voice assistant instead.

https://nypost.com/2019/05/22/ai-voice-assistants-fuel-sexist-gender-stereotypes-un-study/


OMG useful information coming from a female voice???? Ha, as if!

PS Alexa just tried to give me....me a fucking man...directions? And why when she speaketh to me doth I not hear a running vacuum in the background???

Revelation: Alexa is a ladyboy. That is how she (he?) fucking does it. Are you all living a conscious experience??? Huh??

Man, those jew haters aka the UN just red pilled me.
 
So, how much cash and contraband are being smuggled into NYC using the diplomatic pouches of the hundreds of people assigned to work at the UN?
 


@Whippy McGee check out young Stallone in the pic at 1:56 - looks a lot like your avatar lol.
 
In the name of gender equality, I'll never assume women to be helpful or tolerant again.
 
The UN is really wasting their resources conducting this idiotic study
 
It's because every man alive has given someone bad directions rather than admit he doesn't know.

I have people probably still out there looking for their destination.
 
"There is to be no pleasentness, and nothing normal in the West." - The U.N.
 
Just in case anyone is wondering what kind of important work is being done over at the UNESCO headquarters these days to ensure world peace.
-----

United Nations: AI voice assistants fuel sexist gender stereotypes
By Marisa Dellatto | May 22, 2019

https://nypost.com/2019/05/22/ai-voice-assistants-fuel-sexist-gender-stereotypes-un-study/


ARE THEY SERIOUS WITH THIS PC SHIT??? <Lmaoo>
 
I didn't think it reinforced the thought of women as helpful or nice, I thought it reinforced the thought that they are nosy cunts who listen in when you don't want them to and record every throwaway joke you say around them.
 
James Earl Jones should be the voice of A.I.
 