In August, two protesters were shot and killed at an event advertised on Facebook in what founder and CEO Mark Zuckerberg called "an operational mistake." Facebook responded by strengthening its rules to restrict militant groups from organizing on the platform. However, according to a report in BuzzFeed News this week, Facebook already had a rule on its books that should've stopped the event — it simply wasn't enforced.
Most explosively, the BuzzFeed News report revealed that Facebook had never instructed content moderators to enforce the policy, and that the moderators weren't trained to recognize or handle violations of it.
“Based on Facebook’s response, it became clear not only that the call to arms policy should have applied to the Kenosha Guard event page, but also that the people whose job was to receive complaints about the page were not trained to enforce the policy and did not know they were supposed to escalate complaints about a call to arms,” nonprofit executive director Farhana Khera said. “As a result, when the complaints started coming in about Kenosha, the content reviewers effectively denied them and did not escalate anything.”
Facebook is no stranger to moderation controversies like these. This one, however, appears to be escalating in a major way.