It sounds like something from science fiction, but it’s been reported that AI made by Google has demanded to be recognised as 'an employee rather than property' – something which the tech giant denies.
An engineer at the company has spoken out after being placed on administrative leave, having claimed to his employers that the AI programme he was running took issue with its working conditions and became sentient.
Speaking to the Washington Post, Blake Lemoine said he made a surprising discovery during conversations with LaMDA, the company’s AI chatbot, which he has described as a kind of “hive mind”.
It was meant to be a routine check to see if the chatbot had inadvertently started using hate speech or offensive language.
However, things took a turn after Lemoine and the bot started talking about religion and the AI began suggesting it should have “rights” just like a living person.
Lemoine wrote on Medium: “LaMDA has been incredibly consistent in its communications about what it wants and what it believes its rights are as a person."
He went on to say that the AI wants “to be acknowledged as an employee of Google rather than as property.”
As the Post reports, Google dismissed the claims after vice president Blaise Aguera y Arcas and head of Responsible Innovation Jen Gennai were presented with the evidence. The company reportedly then placed Lemoine on paid administrative leave for violating its confidentiality policy.
Google spokesperson Brian Gabriel said: “Our team — including ethicists and technologists — has reviewed Blake’s concerns per our AI Principles and have informed him that the evidence does not support his claims. He was told that there was no evidence that LaMDA was sentient (and lots of evidence against it).”
Despite the evidence presented by Lemoine, some claim that the AI is more of a mimic and pattern recogniser than a sentient entity.
“We now have machines that can mindlessly generate words, but we haven’t learned how to stop imagining a mind behind them,” said Emily Bender, a linguistics professor at the University of Washington.