Facebook will soon begin displaying more graphic content on its pages, including nudity and violence, as long as the content is considered “newsworthy” enough, according to a statement released by the social media giant late last week.
“In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards,” Facebook officials wrote in a blog post announcing the policy change. “Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”
Facebook also pledged to work with journalists, publishers, and members of law enforcement to help shape its policies, but the site's brief statement still leaves many questions unanswered. For instance, how will Facebook determine whether graphic content is "newsworthy" enough to remain on the site? Will the decision be made by an automated system, much like its current Trending Topics feed, or will it be handled by humans on a case-by-case basis?
This policy change, as ambiguous as it seems on paper, may help Facebook avoid controversies like the recent incident in which the site censored a historic photograph from the Vietnam War because it contained nudity. However, it also puts Facebook in the position of deciding which content is important or relevant enough for us to see, and that's a scary thought.