Oversight Board presses Meta to revise ‘convoluted and poorly defined’ nudity policy

Meta’s Oversight Board, which independently evaluates difficult content moderation decisions, has overturned the company’s takedown of two posts that depicted a nonbinary and transgender person’s bare chest. The case represents a failure of a convoluted and impractical nudity policy, the Board said, and recommended that Meta take a serious look at revising it.

The decision concerned a couple who were raising money so that one of them could undergo top surgery (generally speaking, the reduction of breast tissue). They posted two images to Instagram, in 2021 and 2022, both with bare chests but nipples covered, and included a link to their fundraising site.

These posts were repeatedly flagged (by AI and users) and Meta ultimately removed them, as violations of the “Sexual Solicitation Community Standard,” basically because they combined nudity with asking for money. Although the policy is plainly intended to prevent solicitation by sex workers (another issue entirely), it was repurposed here to remove perfectly innocuous content.

When the couple appealed the decision and brought it to the Oversight Board, Meta reversed it as an “error.” But the Board took it up anyway because “removing these posts is not in line with Meta’s Community Standards, values or human rights responsibilities. These cases also highlight fundamental issues with Meta’s policies.”

The Board wanted to take the opportunity to point out how impractical the policy is as it exists, and to recommend that Meta take a serious look at whether its approach here actually reflects its stated values and priorities.

The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests, to scenes of childbirth, and medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often convoluted and poorly defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice.

Essentially: Even if this policy did represent a humane and appropriate approach to moderating nudity, it’s not scalable. For one reason or another, Meta should modify it. The summary of the Board’s decision is here and includes a link to a more complete discussion of the issues. (When I asked about previous times they had challenged this policy, they noted this 2020 case involving breast cancer awareness.)

The obvious threat Meta’s platforms face, however, should they relax their nudity rules, is porn. Founder Mark Zuckerberg has said in the past that making his platforms appropriate for everyone necessitates taking a clear stance on sexualized nudity. You’re welcome to post sexy stuff and link to your OnlyFans, but no hardcore porn in Reels, please.

But the Oversight Board says this “public morals” stance is likewise in need of revision (this excerpt from the full report lightly edited for clarity):

Meta’s rationale of protecting “community sensitivity” merits further examination. This rationale has the potential to align with the legitimate aim of “public morals.” That said, the Board notes that the aim of protecting “public morals” has sometimes been improperly invoked by governmental speech regulators to violate human rights, particularly those of members of minority and vulnerable groups.

Moreover, the Board is concerned about the known and recurring disproportionate burdens on expression that have been experienced by women, transgender, and non-binary people due to Meta’s policies…

The Board received public comments from many users that expressed concern about the presumptive sexualization of women’s, trans and non-binary bodies, when no comparable assumption of sexualization is applied to images of cisgender men.

The Board has taken the bull by the horns here. There’s no sense dancing around it: The policy of recognizing some bodies as inherently sexually suggestive but not others is simply untenable in the context of Meta’s purportedly progressive stance on such matters. Meta wants to have its cake and eat it too: pay lip service to trans and nonbinary people like those who brought this case to its attention, but also respect the more restrictive morals of conservative groups and pearl-clutchers worldwide.

The Board Members who support a sex and gender-neutral adult nudity policy recognize that under international human rights standards as applied to states, distinctions on the grounds of protected characteristics may be made based on reasonable and objective criteria and when they serve a legitimate purpose. They do not believe that the distinctions within Meta’s nudity policy meet that standard. They further note that, as a business, Meta has made human rights commitments that are inconsistent with an approach that restricts online expression based on the company’s perception of sex and gender.

Citing several reports and internationally negotiated definitions and trends, the Board’s decision suggests that a new policy be forged that abandons the current structure of categorizing and removing images, substituting something more reflective of modern definitions of gender and sexuality. This could, they warn, leave the door open to things like nonconsensual sexual imagery being posted (much of which is currently flagged and taken down automatically, something that might change under a new system), or to an influx of adult content. The latter, however, can be handled by means other than total prohibition.

When reached for comment, Meta noted that it had already reversed the removal and that it welcomes the Board’s decision. It added: “We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.” I’ve asked for specific examples of organizations, issues, or improvements and will update this post if I hear back.

Source @TechCrunch
