
YouTube pulls ads from anti-vax conspiracy videos

Anti-vaccination channels are ‘dangerous and harmful’

Adi Robertson

YouTube has removed ads from videos that promote anti-vaccination content, citing a ban on “dangerous and harmful” material. BuzzFeed News broke the story this afternoon, saying YouTube had confirmed the decision after the publication contacted seven companies that were unaware their advertisements were running on anti-vaccination videos. It’s the latest of several ways YouTube has recently restricted conspiracy theories and other objectionable material on its platform.

BuzzFeed reports that the demonetized accounts include anti-vaccination channels LarryCook333 and VAXXED TV, as well as “alternative medicine” channel iHealthTube. The three channels have a total of roughly 473,000 subscribers. “We have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content are a violation of those policies,” a spokesperson said. YouTube already adds Wikipedia article links to some searches that are likely to show anti-vaccination content, attempting to counteract misinformation.

YouTube is trying to restrict the spread of conspiracy videos

At least one company BuzzFeed contacted, vitamin seller Vitacost, said it had already pulled ads over a separate controversy involving child predators communicating on YouTube — an issue that YouTube has spent the past few days scrambling to fix. “We had strict rules to prevent our ads from serving on sensitive content and they were not effective as promised,” it told BuzzFeed.

Last month, YouTube said it would limit the reach of videos featuring conspiracy theories in general, responding to concerns that its recommendation algorithm pushed users down extreme ideological paths. A recent study of “flat earth” believers, for instance, found that nearly all its 30 subjects had been recommended flat earth content after watching videos about other types of conspiracies.

YouTube has had some difficulty distinguishing “harmful” conspiratorial misinformation from programs intended as entertainment, but advocating against vaccines poses clear public health risks. US Representative Adam Schiff recently sent Facebook and YouTube’s parent company Google a letter raising concerns about vaccine-related misinformation, and Facebook is reportedly exploring new options to limit it. The image-sharing platform Pinterest, meanwhile, simply stopped returning results for searches about vaccination — saying it was “better not to serve those results than to lead people down what is like a recommendation rabbit hole.”
