
Social media platforms failing to remove racist abuse despite committing to protect footballers, users say

Social media platforms are failing to remove racist abuse despite pledging to crack down on its spread across their sites, users claim.

According to Instagram and Twitter users, the platforms are allowing comments such as monkey emojis to remain online, in stark contrast to their commitment to act following the racist abuse directed at black footballers Marcus Rashford, Bukayo Saka and Jadon Sancho.

Instagram user Carla Gandy told indy100 that she had reported a number of comments posted underneath a photo of Marcus Rashford, which she saved as screenshots.

One said: “Ffs it had to be the black players missing”, while another showed a series of monkey emojis.

But after reporting these comments, she received notifications from Instagram informing her that they would not be removed.

Screenshots seen by indy100 show that Instagram replied: “Due to the high volume of reports that we receive, our review team hasn’t been able to review your report.

“However, our technology has found that this comment probably doesn’t go against our Community Guidelines.”

Another Twitter user shared screenshots alleging that a comment saying “Go to Africa” was not removed from Instagram for the same reason.

Other users claimed they were also struggling to get their reports noticed.

Rashford, Saka and Sancho have received a torrent of racist abuse following the Euro 2020 final in which they missed penalties during a tense shootout, leading in part to Italy’s victory.

Politicians and public figures – including Boris Johnson and Gary Lineker – have spoken out against the abuse and backed the players, despite some like Priti Patel previously stating that taking the knee to support racial equality was “gesture politics”.

Meanwhile, social media companies have said they will do more to protect people from abuse.

Facebook – which owns Instagram – said it tries to remove harmful content as quickly as possible. It is understood that its policy does not prohibit emojis but examines the context in which they are used, removing them if they are shared in a hateful manner.

“No-one should have to experience racist abuse anywhere, and we don’t want it on Instagram,” a Facebook company spokesperson said.

“We quickly removed comments and accounts directing abuse at England’s footballers last night and we’ll continue to take action against those that break our rules.

“In addition to our work to remove this content, we encourage all players to turn on Hidden Words, a tool which means no-one has to see abuse in their comments or DMs.

“No one thing will fix this challenge overnight, but we’re committed to keeping our community safe from abuse.”

Twitter issued a similar statement: “In the past 24 hours, through a combination of machine learning-based automation and human review, we have swiftly removed over 1,000 Tweets and permanently suspended a number of accounts for violating our rules - the vast majority of which we detected ourselves proactively using technology.

“We will continue to take action when we identify any tweets or accounts that violate our policies.

“We have proactively engaged and continue to collaborate with our partners across the football community to identify ways to tackle this issue collectively and will continue to play our part in curbing this unacceptable behaviour - both online and offline.”

But according to users, this commitment is failing to translate into meaningful action.

Speaking to indy100, another user, who wished to remain anonymous, said she had unsuccessfully reported a number of Instagram accounts for posting “monkey emojis and abusive comments”. She said: “I had previously checked the accounts and they were mostly brand new - clearly made just for the purpose of sending hateful comments. Given my experience in the past with reporting on social media, I was disappointed but not surprised. It felt like a computer was making the decision, rather than a human but it’s humans who have to read these comments (and are hurt by them).”

indy100 has contacted Facebook and Twitter for comment on this story.
