This is The Stepback, a weekly newsletter breaking down one essential story from the tech world. For more on how AI is changing music and the music industry, follow Terrence O’Brien. The Stepback arrives in our subscribers’ inboxes at 8AM ET.
AI music is flooding streaming services — but who wants it?
They won’t ban it. They won’t embrace it either.
How it started
The use of generative AI in pop music started almost as a gimmick. There was a sense of experimentalism to 2018’s I AM AI by Taryn Southern and 2019’s Proto by Holly Herndon, albums that were created with significant assistance from AI. Others got in on the action too, exploring the outer limits of tools like Google’s Magenta and even training their own models. But things quickly changed with the launch of Suno in December of 2023 and Udio in April of 2024.
Suno and Udio allow users to quickly create entire compositions with a simple text prompt. AI-generated music was no longer the realm of technical experts and fringe experimenters; it was now accessible to anyone with an internet connection. This led to an influx of machine-made music hitting streaming platforms.
In September of 2025, Deezer said that 28 percent of music uploaded was fully AI-generated. By the end of the year, that had grown to over 50,000 tracks per day, accounting for 34 percent of uploads. Both users and artists have expressed frustration, demanding streaming platforms do something to combat the growing problem that is watering down playlists and siphoning millions in royalties away from legitimate artists. Udio did not reply to a request for comment.
How it’s going
Things have only gotten worse at Deezer, where daily uploads of AI-generated content have grown to 75,000, and are threatening to overtake actual human-made music. And Spotify removed over 75 million spam tracks in just 12 months.
Deezer was the first major streaming platform to implement a system that detects and labels AI-generated content. The service also prevents its algorithm from recommending it and has demonetized 85 percent of the streams. In a recent press release, Deezer CEO Alexis Lanternier said that “AI-generated music is now far from a marginal phenomenon and as daily deliveries keep increasing, we hope the whole music ecosystem will join us in taking action to help safeguard artist’s rights and promote transparency for fans.”
Qobuz was next to implement a detection system. It also published an AI charter, promising that it would never use AI for its editorial or curation content. While the company stopped short of banning AI-generated content, it did lean into the discontent, saying, “The heart of Qobuz is and will remain human.”
Apple soon followed, though its labeling system has an obvious flaw: it relies on self-reporting. Apple Music “requires” labels and creators to voluntarily add Transparency Tags to their metadata. When asked how it was enforcing requirements, or what penalties, if any, there were for failing to label AI-generated content, Apple declined to comment and pointed me to an industry newsletter from early March that says it “defers to content providers to determine what qualifies as AI content.”
Spotify also opted for a voluntary system. It recently launched AI credits, which identify tracks made using generative AI. It’s working with the standards group DDEX to create an industry standard for labeling AI content. It goes beyond blanket labeling, allowing artists to specify whether AI was used to create the lyrics, vocals, or backing music. Initial glimpses of those efforts began rolling out in mid-April, with DistroKid as its first partner.
While DDEX counts most of the industry’s heavyweights as members — including Amazon, Google, Meta, Apple, Songtradr (home of Bandcamp), Pandora, BMI, UMG, Sony Music Entertainment, and Warner Music Group — not everyone is necessarily on board with Spotify’s standard yet.
Spotify has come under criticism for its handling of AI slop and so-called ghost artists. But recently, it has gone out of its way to talk up its transparency efforts and its increased offensive against spam and impersonation. The company also recently launched a Verified by Spotify badge that is supposed to guarantee there’s a human behind an artist profile. Sam Duboff, global head of Marketing & Policy, Spotify for Artists, told The Verge that it is experimenting with third-party detection tools, but that they still make a “material amount of incorrect assessments.”
Google also requires that AI-generated content be labeled, whether that’s on YouTube or YouTube Music. While the company won’t publicly detail how its systems to combat AI slop work, it has said that it’s “building on… established systems that have been very successful in combatting spam and clickbait, and reducing the spread of low quality, repetitive content.” It also says that failing to disclose can carry penalties, including the removal of content or suspension from the YouTube Partner program.
In survey after survey, public opinion toward AI music is pretty unfavorable. A study by Deezer and Ipsos showed that 51 percent of respondents think AI will “lead to the creation of more low-quality, generic-sounding music.”
A poll conducted by The Hollywood Reporter and the Frost School of Music found that 66 percent of people never knowingly listen to music generated by AI. And 52 percent said they wouldn’t even want to listen to music from their favorite artist if they knew it was made with help from AI.
Researchers from Singapore also found significant negative bias toward AI-generated content. The paper’s authors claim that this is because emotion plays such a central role in how we engage with music. They say that “due to its lack of expressive intent, AI-generated music may be perceived as less capable of conveying authentic emotion or fostering meaningful connections with listeners.”
Despite this, only Bandcamp has banned generative AI music outright. Its policy states that “Music and audio that is generated wholly or in substantial part by AI is not permitted,” but enforcing that policy is easier said than done. Bandcamp is not proactively scanning uploads to catch AI music; instead, it relies on manual reports from users to flag suspicious content.
What happens next
The flood of AI music shows no signs of abating. The number of AI tracks uploaded has grown steadily over the last year, and according to Deezer’s Director of Research, Manuel Moussallam, “It is likely that deliveries will keep increasing.”
If there is a silver lining, it’s that while the number of generative AI uploads has grown by nearly 40 percent, there doesn’t seem to be a marked increase in streams. Moussallam says, “The consumption after fraud removal is not gaining much traction and is still very concentrated on a few viral tracks.”
AI-generated music accounts for as little as 1 percent of streams on Deezer as of April, up from about 0.5 percent in early November. But in that time, the percentage of fraudulent streams of AI music has increased dramatically, from “up to 70 percent” to 85 percent. That suggests people are seeking out AI music less often; perhaps the novelty has worn off.
YouTube Policy communications manager Jack Malon told The Verge the company is “involved in the active development of new industry standards for AI disclosures in music credits,” though stopped shy of saying that it was collaborating with Apple or Spotify specifically. Google was heavily involved in the creation of C2PA for authenticating content, but it’s been criticized for inconsistent implementation, potential for abuse, and creating a false sense of security.
Neither Google nor Spotify seems ready to start demonetizing or excluding AI-generated music from its recommendation engine. Duboff says that, “Over time, we believe the use of AI in music will increasingly be a spectrum, not a binary. Tracks won’t be ‘categorically AI’ or ‘not AI at all’ with no in between.”
Creations like Velvet Sundown, Breaking Rust, and Solomon Ray might be anomalies at the end of the day. They’ve generated more attention for being AI than they have for the quality of the music. Fully generative AI music will continue to be a threat to working musicians, session artists, library music composers, and the like. But such acts may struggle to find footing on the charts.
However, artists are more frequently embracing AI, even if it’s largely behind the scenes. It’s worked its way into songwriting sessions in Nashville and replaced sampling for hip-hop producers, and Diplo says creatives need to adapt. (Or “just like give up and become an Uber driver until everyone has a Waymo.”) Duboff says, “We hear from top artists, songwriters, and producers all the time who are incorporating AI technology into their creative processes.”
Companies are hesitant to penalize AI use in part because they expect it to become a standard tool in the industry. Even when launching its Verified by Spotify program, the company left the door open to AI acts, saying, “the concept of artist authenticity is complex and quickly evolving.”
But when Suno users are churning out an entire Spotify’s worth of AI slop every two weeks, the demand for dramatic steps is likely to grow. That Deezer/Ipsos study found that 45 percent of people would like to filter out all AI-generated music from their music streaming library. It’s a solution that neither Deezer nor any other streaming service has committed to. And it would face its own steep hurdles, including an industry-wide standard for labeling that is consistently implemented, and robust, reliable AI detection tools.
If someone wants to listen to Xania Monet, nobody should stand in their way. But if you could flip a switch and instantly hide all generative AI music on Spotify, I bet a lot of people would flip it.
By the way
- Suno and Udio have spawned an entire subculture of AI creators who claim to only listen to the music they prompt, and nothing else.
- The first widely recognized AI pop song is “Daddy’s Car,” composed using Sony’s Flow Machines tech trained on the Beatles’ catalog. (You can tell.)
- Companies are working on tech that would allow them to reverse engineer what data AI is trained on, which could lead to a whole new set of lawsuits.
- Artists are considering adopting a certified “human-made” label.
Read this
- The story of Mike Smith is a wild one. This story from Kate Knibbs at Wired charts how he exploited generative AI, bot farms, and unwitting collaborators to make over $10 million in streaming royalties.
- The Hollywood Reporter and the Frost School of Music collaborated on what might be the most comprehensive survey of American attitudes toward AI music.
- Illiac Suite: String Quartet No. 4 is a fascinating, if often overlooked, bit of musical history. The Guardian tells the story of what is generally considered to be the first piece of music composed by a computer.
- Jess Weatherbed looks at how Big Tech’s attempts to fight AI slop might actually be making things worse.