Facebook has been roundly criticized for inconsistencies in how it censors content on its platform. Now, a bombshell report from The New York Times has revealed that the company’s rules are just as confusing for its own content moderators.
According to the report, several dozen Facebook employees gather every two weeks to come up with the site’s content moderation rules, then send them to over 7,500 moderators around the world. These guidelines exist in a jumbled series of PowerPoint documents that the moderators are responsible for sorting through themselves. The Times reviewed over 1,400 pages from the rulebooks, and found many gaps and even “outright errors.” This sounds like a pretty disorganized process, but that’s not really the issue. Ultimately, Facebook is trying to boil down complex international issues into simple yes-or-no propositions — and the company itself admits that it can’t really win the battle.
“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Monika Bickert, Facebook’s head of global policy management, told The New York Times. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”
If you’re a close observer of Facebook, this report probably just confirmed what you already suspected about how the social media giant operates. But it’s troubling nonetheless to see it confirmed that the company polices itself in such a shoddy and disorganized way.