In a story on social media privacy this week, The New York Times relayed the case of Michael Letwin, a Brooklyn lawyer whose Facebook account was shut down for no apparent reason. After a month of contacting Facebook to get reinstated, the site finally apologized and corrected the mistake. It turned out that Letwin had been reported for using a fake name, which he wasn’t using, and that someone viewing his account (Letwin serves as an administrator for the group Jews for Palestinian Right of Return) had also reported him for promoting violence and terrorism.
Cases like this one raise an important question: just how much power should Facebook have, and how should it decide what is and is not allowed to stay on the site?
“The average person’s soapbox is now digital, and we’re now in a world where the large social media companies have a governmentlike ability to set social norms,” Lee Rowland, a lawyer with the American Civil Liberties Union, told the Times. “It’s a massive power and it comes with a responsibility.”
Facebook’s own Community Standards are often confusing, and though the site has improved its transparency in recent months to help win back jaded users, it still has a long way to go toward making sure everyone understands its rules.