
Facebook puts tighter restrictions on vaccine misinformation targeted at children

Facebook will also remind users that the vaccine is now available for kids


An illustration of several vaccine vials over a pink and purple background.
Illustration by Alex Castro / The Verge
Emma Roth is a news writer who covers the streaming wars, consumer tech, crypto, social media, and much more. Previously, she was a writer and editor at MUO.

Just as the FDA authorized Pfizer’s COVID-19 vaccine for kids between the ages of five and 11, Meta, Facebook’s newly renamed parent company, announced that it’s rolling out stricter policies for vaccine misinformation targeted at children (via Engadget). The platform put restrictions on COVID-19 vaccine misinformation in place in late 2020, but until now it didn’t have policies specific to kids.

Meta says in a new blog post that it’s partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to take down harmful content related to children and the COVID-19 vaccine. This includes any posts that imply the COVID-19 vaccine is unsafe, untested, or ineffective for children. Additionally, Meta will provide in-feed reminders in English and Spanish that the vaccine has been approved for kids, and will also provide information about where it’s available.

Facebook vaccine notification
Photo by Facebook

Meta notes that it has taken down a total of 20 million pieces of COVID-19 and vaccine misinformation from Facebook and Instagram since the beginning of the pandemic. Those numbers sit uneasily alongside the leaked internal documents known as the Facebook Papers, which made clear just how unprepared the platform was for misinformation related to the COVID-19 vaccine. Had Facebook been better prepared, it might have rolled out campaigns to combat misinformation earlier in the pandemic, for both children and adults, and removed more false content as a result.
