Facebook often finds itself in trouble for the content it removes from its platform — or the content it doesn’t. That was the case this week when it was revealed in court that Facebook disabled, then restored, a suspected terrorist’s Facebook account nine times.
Abdulrahman Alcharbati is standing trial in the UK, accused of sharing links on his Facebook profile that encouraged radical Islamic terrorism; he is also charged with possessing a bomb-making manual. Facebook suspended his account nine separate times for posting this disturbing content, but each time he wrote to the company insisting that his free speech rights had been violated, and each time Facebook relented and restored his access.
“After reviewing your appeal we have reactivated your account. Please keep in mind that one of our main priorities is the comfort and safety of the people who use Facebook,” the company wrote to him after one of his pleas. “We do not allow credible threats to harm others, support for violent organizations or let exceedingly graphic content live on Facebook.”
Facebook really can’t win when it comes to moderating content. Either it goes too far and infringes on users’ free expression, or it does something like this and allows a genuinely bad actor to flourish on its platform.