
New York City is suing TikTok, Meta, Snap, and Google for ‘addicting’ children

It adds to a growing wave of litigation from state and local governments alleging harmful design features.

NYC Mayor Eric Adams announced a lawsuit against leading tech platforms over their allegedly addictive features.
Photo by Michael M. Santiago / Getty Images
Lauren Feiner
is a senior policy reporter at The Verge, covering the intersection of Silicon Valley and Capitol Hill. She spent five years covering tech policy at CNBC, writing about antitrust, privacy, and content moderation reform.

New York City is the latest government to go after big tech companies for allegedly addicting kids to their platforms.

Several city agencies, including NYC’s Department of Health and Mental Hygiene and Department of Education, filed suit against Meta, TikTok, Snap, and Google, accusing them of “fueling the nationwide youth mental health crisis.” The city charged them with public nuisance and negligence.

The city claims that the platforms’ design features, including recommendation algorithms and likes, addict children to the services and manipulate them into spending more and more time online.

The lawsuit adds to a growing wave of litigation from state and local governments that target tech platforms for allegedly addictive features that deceive and harm kids. The same set of social media companies faced a lawsuit from a Maryland school district last June that also accused them of contributing to a “mental health crisis.” Dozens of state governments sued Meta last fall for allegedly misleading the public about the harms its services could cause young users.

A ruling by a federal judge late last year indicates these kinds of lawsuits may survive early challenges. The California district court judge said that claims dealing with the “defects” of the platforms, rather than speech, could move forward and would not be considered in conflict with tech’s legal liability shield known as Section 230. However, none of the lawsuits mentioned above have had their core arguments tested in court.

Protecting children from online harms has been one of the few internet policy issues that has seemed to maintain momentum across many levels of government, even as legislation is increasingly difficult to pass in Congress. A Senate hearing last month that featured several tech CEOs saw lawmakers on both sides of the aisle expressing similar anger toward the companies’ handling of child safety.

NYC Mayor Eric Adams has alluded to his skepticism of social media platforms in the past, including recently in his State of the City speech in January. In that speech, he spoke about his health commissioner’s advisory that categorized social media as a public health hazard.

“Just as the surgeon general did with tobacco and guns, we are treating social media like other public health hazards and ensuring that tech companies take responsibility for their products,” the mayor said at the time, according to his prepared remarks.

Alongside the lawsuit on Wednesday, Adams released a “social media action plan,” which includes holding tech platforms accountable for harms to children’s mental health, educating families on safer use, and researching social media’s impact on youth.

The tech platforms named in the suit said they already take steps to ensure young users are safe and supported on their services.

“We want teens to have safe, age-appropriate experiences online, and we have over 30 tools and features to support them and their parents,” Meta spokesperson Andy Stone said in a statement, pointing to features like parental supervision tools and notifications suggesting social media breaks.

A Snap spokesperson drew differences between Snapchat and other social media platforms in a statement, saying the app was “intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their close friends.”

A TikTok spokesperson pointed to what it called the app’s “industry-leading safeguards to support teens’ well-being, including age-restricted features, parental controls, an automatic 60-minute time limit for users under 18, and more.”

Google spokesperson José Castañeda called the NYC allegations “simply not true.”

“In collaboration with youth, mental health and parenting experts, we’ve built services and policies to give young people age-appropriate experiences, and parents robust controls,” Castañeda added in a statement.
