This Is How Facebook Decides What Explicit and Violent Content Is Allowed

4,500 content moderators decide what kind of nudity, death and violence is acceptable for your news feed.

By Joe Reeve

Earlier this week, The Guardian got its hands on more than a hundred internal manuals that Facebook’s 4,500 content moderators use to police the site for content involving racism, violence, nudity, pornography, and even death.

The Guardian then shared its findings with VICE News, which featured the moderators’ processes on its VICE News Tonight show using the simple animation you can see above. The segment explains the common-sense thinking behind some of the decisions moderators are trained to make, and also exposes a few of the more bizarre and inexplicable ones.


Unsurprisingly, the video notes that Facebook’s content moderators turn over regularly and suffer from problems like anxiety and PTSD. It sounds like a very tough job to have.

You can catch VICE News Tonight on HBO Mondays through Thursdays.

Source: VICE News