Facebook today published 27 pages of previously secret rules on how the site’s moderators decide which photos, videos, and posts should be removed and which can stay online.

The company said it spots potentially problematic content by using either artificial intelligence or reports from other users. That information is then passed on to its 7,500+ human content reviewers who work around the clock in over 40 languages.

Detailed policies

Facebook said it does not allow hate speech about “protected characteristics,” including race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease.

It said there are “some protections” around immigration status and three “tiers of severity” by which posts are judged. Here are a few of the site’s rules:

  • The sale of marijuana is not allowed (even in states where it’s legal)

  • Sexual activity in general is banned unless “posted in a satirical or humorous context”

  • Photos of breasts are allowed if they depict an act of protest

  • Content about gun sales can only be shown to adults aged 21 or older, and sales between private individuals are not allowed

  • Bullying rules don’t apply to comments made about public figures

Providing clarity

A shorter version of the guidelines had leaked previously, but the full document had not been released to the public until today.

In releasing the detailed guidelines (which include specific examples), Facebook hopes to provide transparency about its content-policing process, which has been criticized in the past as inconsistent.

“We decided to publish these internal guidelines for two reasons,” said Monika Bickert, Vice President of Global Policy Management at Facebook, in a statement.

“First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.”

"We want people to know about these standards, we want to give them clarity," Bickert said.

Getting user feedback

The company admits that its enforcement “isn’t perfect.”

“We make mistakes because our processes involve people, and people are not infallible,” Bickert said. For this reason, Facebook is also adding a way for users to appeal when one of their posts gets taken down because of sexual content, hate speech, or violence.

Users will get a message explaining why the post was taken down and can follow a link to request a review, which will be handled by a team member “typically within 24 hours.”

“We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up,” Bickert said.

