The news: Facebook will remove false claims about covid-19 vaccines that have been “debunked by public health experts,” it has announced. In a post, the company outlined how it plans to apply its existing ban on covid misinformation, which is intended to screen out posts that could lead to “imminent physical harm,” as countries around the world move closer to acquiring and rolling out vaccines. The removals will apply to both Facebook and Instagram.
Effective vaccines are coming: The success of covid-19 vaccines is seen as critical to overcoming the pandemic, with a number of candidates in late-stage testing. Earlier this week, the UK became the first country to approve a vaccine, granting authorization for the one developed by Pfizer and BioNTech and saying that the first doses could be given to patients within days.
What is Facebook removing? The policy announcement isn’t comprehensive, but it gives a few examples of what would be removed from the site:
“This could include false claims about the safety, efficacy, ingredients or side effects of the vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list. We will also remove conspiracy theories about COVID-19 vaccines that we know today are false: like specific populations are being used without their consent to test the vaccine’s safety.”
So is this a big deal? Simultaneously yes and no. It’s important that Facebook is spelling out in more detail how it will handle vaccine misinformation, particularly as we enter what could be the most important public health moment in modern history. Misinformation about vaccines has long thrived on Facebook, so any ban or major crackdown it announces has the potential to be very consequential.
The “but” here is also important, and it is multifaceted. Facebook’s policies are only as effective as their enforcement. With health misinformation in particular, this ban will achieve its aims only if it reaches the many private groups on Facebook where false health claims are promoted and amplified. That has been a weak point in the platform’s previous attempts to crack down on damaging falsehoods.
Uneven enforcement: Even after Facebook began rolling out policies to limit the spread of vaccine misinformation in 2019 (by restricting recommendations of groups and hashtags promoting such messages, for example), the anti-vaccine ecosystem continued to thrive in private spaces on the site. Since the pandemic began, however, Facebook has been more aggressive about removing some health misinformation, citing its policy against content that could lead to imminent physical harm. A few weeks ago, it banned the prominent anti-vaccine personality Larry Cook and removed the enormous Facebook group he ran for violating its policies on the QAnon conspiracy theory.