Facebook tightened its advertising rules this week to ensure publishers of violence, pornography, hate speech and more can’t cash in on offensive content. The move comes as Facebook deals with a public backlash over its sale of ads to a shady Russian company during the 2016 presidential election.
According to the site’s new guidelines, publishers can’t monetize content that includes violence, adult content, drugs, tragedy, conflict, “debated social issues,” and more. The rules also reiterated Facebook’s promise to crack down on users who spread misinformation.
While these guidelines might sound promising, they’re also extremely broad. That’s cause for concern, because it leaves an awful lot of power in Facebook’s hands to decide which content is acceptable and which isn’t. And Facebook was careful to point out that it still won’t be able to stop everything.
“As soon as we determine that content has breached our community standards, we remove it. With a community as large as Facebook, however, zero tolerance cannot mean zero occurrence,” Facebook’s Senior Vice President for Global Marketing Solutions Carolyn Everson said.
It’s encouraging that Facebook is committed to stopping offensive content. In the process, however, it has granted itself a great deal of censorship power, and that doesn’t seem like a good thing for users.