Even Meta’s own Oversight Board is annoyed with Instagram’s policy on nudity

Mark Zuckerberg’s Meta, which oversees the popular social media sites Facebook and Instagram, among others, has long attracted controversy over its policy on adult nudity, which has often treated body-positive content featuring breasts as pornography.

Midsommar actress Florence Pugh took aim at Instagram’s standards in July, when she shared a picture of herself at a Valentino fashion show wearing a pink dress through which her nipples were visible, albeit covered by the fabric.

“Technically they’re covered,” the caption reads.

And Pugh isn’t the only person to have been exasperated by Instagram’s censorious approach to female body parts: plus-sized model and influencer Nyome Nicholas-Williams successfully led a campaign calling on the platform to revisit its policy.

In one post from October 2020, the Black content creator, known online as ‘curvynyome’, wrote: “Slim white bodies are praised for being nude and appreciating [their] form on this platform all the time and never have images taken down or asked as to why they are posting them.

“We all have bodies, and we can choose to celebrate them in any way we wish … How have we become so scared to share our art and celebrate our bodies without us being sexualised or being judged by others?”

Her comments came after she shared pictures of herself sitting on a chair against a purple, flowery background, covering her breasts with her arms. Despite the powerful photography, Instagram soon took them down.

In scenes we absolutely love to see, Instagram eventually backed down, with a representative telling Business Insider: “Hearing [Nyome’s] feedback helped us understand where this policy was falling short, and how we could refine it.

“With the new update, we’ll allow content where someone is simply hugging, cupping or holding their breasts.”

Instagram has also come under fire for taking down images of nipple tattoos on women who have had mastectomies following a diagnosis of breast cancer.

Such ridiculous over-policing has sparked online protests from individuals such as Kylie Jenner, which have formed part of the larger ‘Free the Nipple’ campaign against the sexualisation of female breasts across society.

Under its policy on “adult nudity and sexual activity”, parent company Meta says: “Our Nudity Policies have become more nuanced over time. We understand that nudity can be shared for a variety of reasons, including as a form of protest, to raise awareness about a cause or for educational or medical reasons.

“Where such intent is clear, we make allowances for the content. For example, while we restrict some images of female breasts that include the nipple, we allow other images, including those depicting acts of protest, women actively engaged in breast-feeding and photos of post-mastectomy scarring.”

It goes on to tell users not to post “uncovered female nipples”, except “in the context of breastfeeding, birth-giving and after-birth moments, medical or health context (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest”.

Now, it seems, Instagram has slipped up once again, with Meta’s own Oversight Board, set up in 2020 to independently review the company’s policy decisions, overturning a decision by the photo and video-sharing platform to remove two posts shared by a US account in 2021 and 2022.

The profile is run by a couple who identify as transgender and non-binary, and both uploads, which showed the pair bare-chested with their nipples covered, were removed by Meta under its “sexual solicitation” policy, which prohibits content that “facilitates, encourages or coordinates sexual encounters or commercial sexual services between adults”.

What the posts actually concerned was transgender healthcare: one of the two individuals was soon to undergo gender-affirming top surgery to create a flatter chest, for which the couple were fundraising.

Doesn’t sound at all sexual to us, Meta.

It was only after the Oversight Board accepted the two cases for review that Meta found the posts had been removed in error and restored them. However, that didn’t stop the board from ruling on the decision to take them down in the first place.

And while the two cases concerned a different policy from the one on “adult nudity”, the board revealed that in “at least one of the cases”, an automated system “trained to enforce” the nudity standard had flagged the post for human review.

“This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale,” the board said.

While it went on to acknowledge Meta’s exceptions on issues such as top surgery and breast cancer awareness, the board said these were “often convoluted and poorly defined”.

“In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice.

“Here, the Board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show.

“Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies,” it said.

As part of its decision, the Oversight Board also advised Meta to define “clear criteria” to govern its policy on adult nudity, to ensure that “all users are treated in a manner consistent with human rights standards”, without discriminating on the basis of sex or gender.

“Meta should first conduct a comprehensive human rights impact assessment on such a change, engaging diverse stakeholders, and create a plan to address any harms identified,” it said.

A Meta spokesperson said: “We welcome the board’s decision in this case. We had reinstated this content prior to the decision, recognizing that it should not have been taken down. We are constantly evaluating our policies to help make our platforms safer for everyone. We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.”
