Siri, Alexa and most other well-known voice assistants are female by default. A new study argues that ordering these artificial intelligences around reinforces the idea that women should be obedient and uncomplaining, especially since they respond meekly even to insults.
The study authors urge Apple and creators of similar products to make male or gender-neutral voices more prominent.
It’s too easy to kick Siri around
“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” says the study, which is from Unesco (the United Nations Educational, Scientific and Cultural Organization).
It’s titled, “I’d blush if I could,” as that used to be Siri’s meek answer to insults like “You’re a b****,” according to the authors. Even the current one, “I don’t know how to respond to that,” is very polite.
Also, AI assistants are still relatively primitive and often make dumb mistakes. The authors point out that these errors are spoken in the voices of young women, reinforcing negative ideas about female intelligence.
More voice options needed
The authors of the study urge the developers of voice-controlled assistants to offer male voices as an option, and to ask users to choose the gender of their assistant rather than making female the default.
iPhone or iPad users wishing to change Siri’s gender just need to go to Settings > Siri & Search > Siri Voice. From here, users can change the voice from female to male, as well as give Siri a regional accent, like British or South African.
But the switch needs to go beyond the timbre of the voice; separate scripts have to be used. As the study authors point out, “The male versions tend to use more definitive quantifiers (one, five), while the female versions use more general quantifiers (a few, some), as well as more personal pronouns (I, you, she). The trend is so pronounced that focus groups report finding it unsettling to hear a male voice using a female script and consider it untrustworthy.”
One way around this is to make the AI’s voice gender neutral, which has other advantages as well: “Digital assistants carrying obviously synthetic voices, regardless of their fluency, announce themselves as non-human from the outset of an interaction and might even point the way towards the establishment of a new machine gender for technologies with human-like communication capabilities,” according to the study authors.