As the nationwide rollout of the COVID-19 vaccine continues, Facebook finds itself battling a familiar problem: misinformation. According to a report from CNN this week, COVID-19-related falsehoods continue to flourish on the social media giant’s platform despite new rules established late last year to address the problem.
According to CNN's analysis, conducted two weeks ago, four of the top ten search results for "vaccine" on Facebook-owned Instagram turned up anti-vaccination accounts. The platform has since changed its search results to surface three credible links first. However, users who click the "See More Results" prompt are still shown a number of accounts peddling misinformation. Meanwhile, three of the top twenty search results on Facebook itself relate to vaccine misinformation.
Of course, the fact that this content continues to spread could be due, in part, to Facebook's confusing and controversial policy on the matter. The company says it distinguishes between specific vaccine misinformation and content that more broadly expresses anti-vaccine sentiment, banning the former while allowing the latter. However, anyone who has used Facebook knows the company has a hard enough time detecting spam and offensive content without slicing such fine distinctions. As usual, Facebook has a virtually impossible problem on its hands, and it has made matters worse with a confusing rule.