Following the 2016 U.S. presidential election, Facebook vowed to do a better job of stopping the spread of misinformation on its platform. However, a new study from a prominent nonprofit suggests that not only has it failed to do so, but that misleading information is actually thriving on Facebook.
According to the German Marshall Fund, Facebook engagement with outlets that publish "inaccurate or misleading" articles has increased 242 percent since 2016, with a 177 percent rise in 2020 alone. For its part, Facebook said that engagement is an unfair and inaccurate metric by which to judge the company's moderation efforts.
“Engagement does not capture what most people actually see on Facebook,” a Facebook spokesperson said. “Using it to draw conclusions about the progress we’ve made in limiting misinformation and promoting authoritative sources of information since 2016 is misleading. Over the past four years we’ve built the largest fact-checking network of any platform, made investments in highlighting original, informative reporting, and changed our products to ensure fewer people see false information and are made aware of it when they do.”
Facebook might be right, but the fact remains that a 200-plus percent increase in misinformation engagement is terrible no matter how you slice it. The company has made strides toward getting the problem under control in the lead-up to the November 3 election, but it's clear the social media giant still has its work cut out for it.