Facebook has come under heavy criticism for how it handles content moderation. When the company relies on artificial intelligence, the technology often makes mistakes a human would not. But according to a sweeping report from The Verge released this week, there's also a downside when real people are tasked with reviewing content.
In response to criticism that its platform hosted too much violent and hateful content, Facebook grew its safety and security workforce to more than 30,000 employees by the end of last year — and about half of them are content moderators. These moderators aren't paid much, and they're tasked with viewing some of the most extreme content on the web, from videos of stabbings to graphic sex to conspiracy theories.
And, according to the report, the job can traumatize moderators to the point that they turn to drinking, drugs or sex at the office to numb the pain. The offices have a therapist on-site, but employees say that isn't enough, and many former moderators say they now cope with PTSD symptoms from the shocking content they've seen. Still, the report notes that many moderators are proud of the work they do to keep Facebook safe.
“If we weren’t there doing that job, Facebook would be so ugly,” one moderator told The Verge. “We’re seeing all that stuff on their behalf. And hell yeah, we make some wrong calls. But people don’t know that there’s actually human beings behind those seats.”
This is a rare instance where there may be no right answer to Facebook's problems. But to spare people this sort of suffering, perhaps relying more on AI isn't so bad after all.