Report: Facebook Content Moderators Are Underpaid, Psychologically Traumatized

Facebook has come under near-constant criticism for not doing enough to remove violent and disturbing content from its pages. The social media giant wants to eventually rely on Artificial Intelligence (AI) to handle the task, but for now it has hired 3,000 new employees to review content the old-fashioned way. And, according to a startling new report from The Guardian, those employees are underpaid and constantly exposed to some of the most disturbing material the Internet has to offer.

According to the report, Facebook moderators make about $15 an hour and receive only a two-week training course before being tasked with removing terrorist content from the site. What they see on the job leaves many of them seeking help from mental health professionals.

“You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off,” one anonymous moderator said. “Every day people would have to visit psychologists. Some couldn’t sleep or they had nightmares.”

Though Facebook does offer support and resources to its employees, another anonymous source said the company's assistance doesn't go nearly far enough. What's more, the report points out that other tech giants have moderation policies that appear to serve their workers better. So what's the solution for Facebook? It's unclear from the outside, but it's important to remember the human cost behind Facebook's content moderation efforts.