Meta’s Oversight Board Calls for Changes in its Rules Regarding Nudity
Meta’s independent Oversight Board is pushing the company to change its rules on the presentation of nudity, particularly as they relate to transgender and non-binary people, according to the board’s report.
The case, which is the subject of the board’s new ruling, concerns two Instagram posts depicting models with bare chests. Both posts, since removed, were made by the same user and featured images of a transgender/non-binary couple bare-chested with their nipples covered, aimed at raising awareness of one member of the couple seeking to undergo top surgery.
Meta’s automated systems, followed by human review, removed both posts for violating its rules around sexual solicitation. The user appealed the decision to the Oversight Board, which eventually restored the posts.
According to the board: “The Oversight Board finds that removing these posts is not in line with Meta’s Community Standards, values or human rights responsibilities. These cases also highlight fundamental issues with Meta’s policies. Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than the stated rationale for the policy, or the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.”
The board also noted that Meta’s enforcement of its nudity rules is often “convoluted and poorly defined,” and could create additional hurdles for expression by women, transgender, and gender non-binary people on its platforms.
“This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”