Siri and Alexa may be flirting with you and it's darker than it sounds, according to a new report.
Research from Unesco suggests that "female" voice assistants have been programmed to be subservient and obliging – even in the face of sexual harassment.
The report is entitled "I'd blush if I could", which is how Siri responds if you tell her she's a slut. Equally disturbingly, Amazon's Alexa will reply: "Well, thanks for the feedback."
Researchers said that the companies creating these products are “staffed by overwhelmingly male engineering teams” which have built AI systems that “cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”.
The report also said: "The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphised as female by technology companies – give deflecting, lacklustre or apologetic responses to verbal sexual harassment."
Some people took issue with the findings in relation to Siri, given that users can in fact change the voice from female to male, and argued that female representation in this space is positive.
However, others made the point that the systems are female by default, and that it's their responses to abuse and sexualised comments which are the problem.
According to research cited in the report, 5 per cent of interactions with virtual assistants are “unambiguously sexually explicit”.
The UN is calling for these products not to be made female by default, and instead to have a gender-neutral voice that actively discourages sexual harassment.