Kate Plummer
Aug 23, 2022
Indy
Google wrongly flagged medical images a father took of his son’s groin as child sexual abuse material (CSAM), and it has now refused to reinstate his account.
According to the New York Times, the man, named Mark, took the photos to send to a doctor after noticing that his son’s groin was inflamed. The doctor used the images for diagnostic purposes and prescribed antibiotics.
But when the photos were automatically uploaded to the cloud, Google flagged them as "harmful content" that "might be illegal" and shut down Mark's Gmail and other Google accounts.
He then discovered that the San Francisco police department had opened an investigation into him, though he was cleared of any criminal wrongdoing.
“These companies have access to a tremendously invasive amount of data about people’s lives. And still they don’t have the context of what people’s lives actually are,” Daniel Kahn Gillmor, a senior staff technologist at the ACLU, told the Guardian.
“There’s all kinds of things where just the fact of your life is not as legible to these information giants.” He added that the use of these systems by tech companies that “act as proxies” for law enforcement puts people at risk of being “swept up” by “the power of the state.”
“We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms,” Christa Muldoon, a Google spokesperson, told the publication.
Muldoon added that the Google staffers who review CSAM had been trained by medical experts to look for rashes and other issues. However, she said, the reviewers were not medical experts themselves, and medical experts were not consulted on every case.
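The "hash matching" Muldoon refers to generally means comparing a fingerprint of an uploaded file against a database of fingerprints of previously verified abuse imagery. The sketch below is only an illustration of that general idea, not Google's system: production tools such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, whereas this toy example uses an exact SHA-256 digest, and the names used here are hypothetical.

```python
# Illustrative sketch of hash matching only; not Google's implementation.
# Real systems (e.g. PhotoDNA) use perceptual hashes that survive resizing
# and re-encoding; this toy uses an exact SHA-256 digest for simplicity.
import hashlib

# Hypothetical database of fingerprints of previously verified material.
# In practice this would be populated from a vetted industry hash list.
KNOWN_HASHES: set[str] = set()


def file_sha256(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_known_hash(path: str) -> bool:
    """Flag a file only if its fingerprint exactly matches a known entry."""
    return file_sha256(path) in KNOWN_HASHES
```

An exact-hash check like this would miss even trivially altered copies, which is why the statement above describes pairing hash matching with artificial intelligence to catch previously unseen material.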
indy100 has contacted Google for comment on this story.