Facebook has faced enormous pressure to take accountability for the spread of fake news on its platform, and while the social media giant has introduced tools that allow third-party apps to detect hoax content, Facebook said this week that users must take responsibility for the stories they share.
“At the end of the day, if people want to share stories that have been flagged with their friends, that’s ultimately their prerogative,” Facebook vice president of partnerships Dan Rose said at a media conference. “We are making a very important point of not putting ourselves in a position of deciding what’s fake and not fake. I don’t think people want us to be the arbiters of truth… There are third parties out there who do this for a living.”
Facebook is certainly caught between a rock and a hard place on fake news. If it goes too far in policing content, it will be accused of censorship; if it does nothing, it will face criticism for facilitating the spread of propaganda. Still, Facebook's stance of essentially shifting the blame onto users is not a good look for the company, and it's certainly not a good way to convince the public that it takes fake news seriously.