When it comes to content moderation, Facebook’s policies are often confusing. The social media giant allows some pretty heinous posts to remain active on the site, while removing ones that seem innocent. But if mistakes are made, it’s understandable — the content moderators at the frontlines of that decision-making process are underpaid and overstressed. So stressed, in fact, that one moderator recently died of a heart attack on the job.
During the course of a workday, Facebook moderators see some of the most terrible content imaginable, from child abuse to pornography to murder. And by many accounts, the Facebook facility in Tampa where the moderator died is one of the worst. In a statement after the man’s death, a spokesperson for Cognizant, the third-party firm that runs the facility, basically said anyone who signs up for the work knows what they’re getting into.
“For our associates who opt to work in content moderation, we are transparent about the work they will perform,” the Cognizant representative said. “They are made aware of the nature of the role before and during the hiring process, and then given extensive and specific training before working on projects.”
While it’s terrible to put a human being through this kind of trauma, Facebook doesn’t have many options. Its AI technology doesn’t always do a great job of catching offensive content, or it inadvertently censors innocent users. It’s a no-win situation, but it’s one that Facebook created for itself by becoming too big to handle.