
Facebook will remove COVID-19 vaccine misinformation

The updated policy may be too late


Facebook’s existing anti-misinformation box for COVID-19.
Facebook

In an update to its COVID-19 misinformation policy, Facebook will begin removing false claims about COVID-19 vaccines. The update comes as the first COVID-19 vaccine has been authorized in the UK, with further authorizations expected soon in the US and around the world. But some experts say policies like this are coming too late to stop the flow of vaccine misinformation.

The policy previously covered removing posts with false information about the virus that could lead to “imminent physical harm.” Facebook is now expanding it to include any posts about vaccines that feature claims that have been “debunked by public health experts.” This includes conspiracy theories — like vaccines containing microchips — and false claims about the safety, efficacy, ingredients, or side effects of vaccines.

Facebook says enforcement will not happen overnight, and that as facts about COVID-19 vaccines evolve, the list of claims that warrant removal will be updated regularly. But even with an expanded removal policy, much of the damage from vaccine misinformation has already been done.

Anti-vaccination rhetoric has often proliferated in small, private Facebook groups. Hundreds of smaller groups are harder to monitor and remove than a few large ones, and they can spread anti-vaccination messaging into communities with no obvious connection to conspiracy theorists. Experts worry that the steady spread of COVID-19 vaccine misinformation throughout the pandemic could make people more hesitant to get vaccinated.

Many major platforms have been under pressure during the pandemic to combat COVID-19 misinformation. When it comes to vaccine-related claims, Facebook is behind YouTube, which began removing videos with COVID-19 vaccine misinformation in October. As vaccines begin to roll out, it will be important for these platforms to keep up strict moderation on vaccine-related content.
