With a platform as large as Facebook, it’s no surprise that offensive content falls through the cracks. That’s why the watchdog organization ProPublica decided to send Facebook a sample of 49 posts containing hate speech and see what the company had to say for itself. The result: Facebook admitted it made a mistake in allowing almost half the posts to remain on the site.
ProPublica chose the sample of 49 from 900 posts users sent in as part of a crowd-sourced investigation. Of that final sample, Facebook said it made a mistake in 22 instances. In six cases, Facebook said users didn’t properly flag the content, and in two it said it didn’t have enough information to judge.
“We’re sorry for the mistakes we have made — they do not reflect the community we want to help build,” Facebook Vice President Justin Osofsky told ProPublica in a statement. “We must do better.”
But perhaps most disturbingly, Facebook defended its decisions on the remaining 19 posts, some of which included sexist and racist content, including one photo that declared “the only good Muslim is a f**king dead one.”
Facebook is taking steps to improve its content moderation, but it’s clear the company still has a long way to go before it addresses the problem satisfactorily.