Facebook Hires 1,000 More Content Moderators Amid Controversy

Since the 2016 presidential election, Facebook has found itself in near-constant public relations turmoil over its content moderation. The latest chapter finds the social media company handing over 3,000 Russian-bought ads to Congress following public outrage over the site’s lack of transparency. In another move intended to curry public favor, Facebook also announced this week that it will hire 1,000 additional human content moderators.

In addition to the manual review these moderators will perform, Facebook said it will invest further in machine learning technology capable of automatically flagging and removing ads. The company also said it will require more thorough documentation from businesses and organizations before they’re allowed to purchase advertising. And, perhaps most interestingly, the site said it will now scrutinize ad content more closely than ever. While Facebook already outlaws advertising that promotes violence, the site says it will now also ban ads that feature “more subtle expressions of violence.”

Of course, all of this sounds great on paper. However, every solution Facebook has put forward has one thing in common: Facebook serving as the judge of what’s right and wrong. There’s no clear answer as to who should hold Facebook accountable, but it probably shouldn’t be Facebook itself.