
Siri and Alexa may be flirting with you, and it's darker than it sounds, according to a new report.
Research from Unesco suggests that "female" voice assistants have been programmed to be subservient and obliging – even in the face of sexual harassment.
The report is entitled "I'd blush if I could" – Siri's response if you tell her she's a slut. Equally disturbingly, Amazon's Alexa will reply: "Well, thanks for the feedback."
Researchers said that the companies creating these products are “staffed by overwhelmingly male engineering teams” which have built AI systems that “cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”.
The report also said: “The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphised as female by technology companies – give deflecting, lacklustre or apologetic responses to verbal sexual harassment.”
Some people took issue with the findings as they relate to Siri, given that users can in fact change the voice from female to male, and argued that female representation in this space is positive.
"Hmmm. Can't you change the voice gender on most of these things? You certainly can with Siri. It's probably a sign… https://t.co/3zt5PCpcxo" – Paul Hughes
"@LeeButterley It’s a lose-lose article. If Siri defaults female it is sexist as assistants are female. If Siri is m… https://t.co/HXUqNAYZEh" – Makanami-Illustrious of Borg
"Now the UN says Alexa, Siri are sexist because they’re female voices. Am I the only female who was 1) pleasantly su… https://t.co/yGmISs5MOB" – S. J. Seymour
However, others made the point that the systems are female by default, and that it's their responses to abuse and sexualised comments which are the problem.
"Most voice assistants have female names and submissive personalities. Ex: Siri responds to insults w/ 'I'd blush if… https://t.co/j9lsNvLlBK" – Amy Diehl, Ph.D.
"If I’m honest, it’s not something I would ever have considered but now it’s been flagged, Alexa et al are all sort… https://t.co/9pgiNarY2v" – euanmackay
According to research cited in the report, 5 per cent of interactions with virtual assistants are “unambiguously sexually explicit”.
The UN is calling for these products not to be female by default, and instead to have a gender-neutral voice that actively discourages sexual harassment.