Facebook has never before made public the guidelines its moderators use to decide whether to remove violence, spam, harassment, self-harm, terrorism, intellectual property theft, and hate speech from the social network — until now. The company had hoped to avoid making it easy to game these rules, but that worry has been overridden by the public’s constant calls for clarity and protests over its decisions. Today Facebook published 25 pages of detailed criteria and examples for what is and isn’t allowed.
Facebook is effectively shifting where it will be criticized: from individual enforcement mistakes, like when it took down posts of the newsworthy “Napalm Girl” historical photo because it contained child nudity, to the underlying policy itself. Some groups will surely find points to take issue with, but Facebook has made some significant improvements, such as no longer denying minorities protection from hate speech when an unprotected characteristic like “children” is appended to a protected characteristic like “black”.
Nothing is technically changing about Facebook’s policies. But previously, only leaks, like a copy of an internal rulebook obtained by the Guardian, had given the outside world a look at when Facebook actually enforces those policies. These rules will be translated into over 40 languages. Facebook currently has 7,500 content reviewers, up 40% from a year ago.
Facebook also plans to expand its content removal appeals process. It already lets users request a review of a decision to remove their profile, Page, or Group. Now Facebook will notify users when their nudity, sexual activity, hate speech, or graphic violence content is removed and let them hit a button to “Request Review”, which will usually happen within 24 hours. Finally, Facebook will hold Facebook Forums: Community Standards events in Germany, France, the UK, India, Singapore, and the US to give its biggest communities a closer look at how the social network’s policy works.
Facebook’s VP of Global Product Management Monika Bickert, who has been coordinating the release of the guidelines since September, told reporters at Facebook’s Menlo Park HQ last week that “There’s been a lot of research about how when institutions put their policies out there, people change their behavior, and that’s a good thing.” She admits there’s still the concern that terrorists or hate groups will get better at evading Facebook’s moderators, “but the benefits of being more open about what’s happening behind the scenes outweigh that.”