Facebook is no stranger to controversy surrounding what content it does and does not delete from the platform. Another ugly example occurred this week when activists in Myanmar accused the social media giant of censoring posts that document ethnic cleansing in the country.
Hundreds of thousands of Rohingya refugees, a Muslim ethnic minority, have been displaced from Myanmar after the country’s military escalated violence against them. Activists say Facebook has removed posts documenting the violence and even suspended several profiles over the content. They also say that Facebook is an essential service in Myanmar, where a lack of infrastructure leaves few other ways to communicate with the outside world. Thus, when Facebook blocks a profile, it can seriously stifle the open sharing of information. But this is not exactly the company’s fault.
“It’s not Facebook actively taking posts and accounts down,” said Mark Farmaner, director at Burma Campaign UK. “Racists in Burma have coordinated people making complaints on Facebook about people and posts knowing it triggers Facebook automatic systems that remove posts or suspend accounts.”
For its part, Facebook said it is only removing content that glorifies violence in the country.
“In response to the situation in Myanmar, we are only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action,” a Facebook spokesperson told Mashable.
Even if the company is not actively choosing to remove this content, its automated systems ultimately are removing it, and that’s a problem.