As the largest social media company in the world, Facebook faces a tough task when it comes to regulating content on its platform. And while the company has made strides in recent years, including dramatically expanding its staff of moderation contractors, many experts insist Facebook isn’t doing enough. Among them are the authors of a study recently released by the NYU Stern Center for Business and Human Rights, which blasted the company’s content moderation efforts as “grossly inadequate.”
According to the study, Facebook employs about 15,000 content moderators through third-party vendors and has partnered with 60 fact-checking organizations. But given the sheer volume of harmful posts on the platform, the study’s authors say those efforts amount to a drop in the bucket of what’s required to combat the problem.
“These numbers may sound substantial, but given the daily volume of what is disseminated on these sites, they’re grossly inadequate,” the report said.
The study’s lead author, Paul Barrett, also said that Facebook’s content moderation problems are a side effect of its relentless drive for growth.
“You have a strategy to expand and grow,” he said. “But you don’t really have a parallel strategy for how to make sure that your offerings are not misused.”
Of course, anyone who’s used Facebook for an extended period of time could tell you that the company isn’t doing enough to clean up its platform. But it’s still striking to hear it put into such harsh words by experts who know best.