Spotify just did a massive cleanup. The streaming giant announced it has removed over 75 million “spammy tracks” in the last year alone. That is roughly three-quarters the size of its entire official library of about 100 million songs. This wasn’t just routine maintenance. It was a direct response to the explosion of generative AI tools that let fraudsters pump out low-quality music, or “AI slop,” on an industrial scale.
- Combating Streaming Fraud: Spotify removed 75 million spam tracks and is implementing measures like banning unauthorized AI voice clones and creating a spam filter to protect the royalty system from artificial streaming and AI content.
- Transparency and Disclosure: Spotify is encouraging industry-wide collaboration on AI disclosure standards for music credits, fostering trust with listeners while navigating the use of AI in music creation.
- Artist Impact and Challenges: Independent artists have experienced negative consequences like false fraud accusations and music removal, highlighting the need for more precise detection systems that target fraudsters without penalizing legitimate creators.
This flood of fake music is more than just annoying. It’s a calculated attack on the system. Scammers use bots and AI to generate millions of streams, which siphons royalties away from real, hardworking artists. In response, Spotify is rolling out a three-part plan to fight back against deepfakes, spam, and fraudulent uploads.[1]
What Exactly is Streaming Fraud?
Streaming fraud is a catch-all term for tricking the system to make money. With AI, these old tricks are now supercharged.
Fraudsters use a few key tactics (a rough detection sketch follows the list):
- Artificial Streaming: This is the classic method. Scammers use automated bots or networks of devices called “click farms” to play songs on a loop. This generates fake plays that look real to the system.[2]
- AI Content Spam: Generative AI makes it cheap and easy to create thousands of tracks. Scammers flood Spotify with mass uploads of generic instrumentals or ambient noise. They also create artificially short tracks, just over the 30-second mark needed to get paid, to maximize their fraudulent revenue.
- SEO Hacking: A sneakier method involves gaming Spotify’s search results. Spammers stuff keywords into artist and track names, like “Relaxing Music for Stress Relief,” to capture anyone searching for mood-based playlists.
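None of these signals is hard to describe in code. The sketch below is purely illustrative: it is not Spotify’s actual detection logic, and every field name and threshold is an assumption. It simply flags the three patterns listed above: tracks padded just past the 30-second payout mark, keyword-stuffed titles, and large play counts coming from suspiciously few listeners.

```python
# Hypothetical spam-signal screen based on the tactics described above.
# Not Spotify's detection system; field names and thresholds are assumptions.

KEYWORD_BAIT = {"relaxing", "sleep", "stress relief", "study", "lofi beats"}

def spam_signals(track: dict) -> list[str]:
    """Return human-readable red flags for a single track."""
    flags = []

    # AI content spam: tracks padded just past the ~30-second royalty threshold.
    if 30 <= track["duration_sec"] <= 35:
        flags.append("duration hovers just above the 30-second payout mark")

    # SEO hacking: titles stuffed with mood/search keywords.
    title = track["title"].lower()
    if sum(kw in title for kw in KEYWORD_BAIT) >= 2:
        flags.append("keyword-stuffed title")

    # Artificial streaming: big play counts from a tiny pool of "listeners".
    if track["plays"] > 10_000 and track["unique_listeners"] < 100:
        flags.append("high play count from very few listeners (bot-like)")

    return flags

if __name__ == "__main__":
    suspicious = {
        "title": "Relaxing Sleep Music for Stress Relief (Lofi Beats)",
        "duration_sec": 31,
        "plays": 250_000,
        "unique_listeners": 40,
    }
    print(spam_signals(suspicious))  # all three flags fire
```

A real system would need far more signals and careful tuning, which is exactly where the false-positive problems described later come from.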
Spotify’s New Rules of Engagement
To combat this, Spotify has laid out a new battle plan focused on three key areas.
First, the platform is cracking down on identity theft. Unauthorized AI voice clones, or deepfakes, are now explicitly banned. An AI-generated track that mimics an artist’s voice is only allowed if the original artist gives clear permission.
Second, Spotify is launching a new music spam filter this fall. Instead of just deleting spam tracks, this system will tag them and stop them from appearing in recommendations. This is a crucial move because it prevents spammers from generating royalties. The company says it will roll the filter out “conservatively” to avoid penalizing innocent artists.
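Spotify has not published how the filter works, but the behavior it describes, tagging rather than deleting and withholding recommendations and royalties, can be modeled roughly. In this sketch the threshold, field names, and policy are all assumptions:

```python
# Rough model of a tag-and-suppress spam filter, as described in the announcement.
# The real system is not public; names, the threshold, and the policy are assumptions.

from dataclasses import dataclass, field

@dataclass
class Track:
    track_id: str
    spam_score: float                # produced upstream by detection heuristics
    tags: set = field(default_factory=set)

SPAM_THRESHOLD = 0.9                 # a deliberately high, "conservative" cutoff

def apply_spam_filter(track: Track) -> Track:
    """Tag likely spam instead of deleting it outright."""
    if track.spam_score >= SPAM_THRESHOLD:
        track.tags.add("spam")
    return track

def eligible_for_recommendations(track: Track) -> bool:
    return "spam" not in track.tags

def eligible_for_royalties(track: Track) -> bool:
    return "spam" not in track.tags
```

The key design choice is that a tagged track still exists; it just stops earning placement and payouts, which removes the financial incentive without deleting anything by mistake.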
Third, Spotify is pushing for transparency. It’s working with industry partners on a voluntary AI disclosure standard. This will let artists add information to their track credits explaining how AI was used. Spotify insists this is not about punishing artists who use AI, but about building trust with listeners.
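The disclosure standard itself is still being negotiated with industry partners, so any concrete format is speculative. As a purely hypothetical illustration of what structured AI credits could look like, every field below is an assumption:

```python
# Hypothetical AI-disclosure fields attached to track credits.
# The actual industry standard is still in development; these field names
# and values are invented purely for illustration.

track_credits = {
    "title": "Midnight Drive",
    "artist": "Example Artist",
    "ai_disclosure": {
        "ai_used": True,
        "roles": ["instrumental backing generated with an AI tool"],
        "vocals": "human",               # no voice cloning involved
        "voice_clone_permission": None,  # only relevant if a cloned voice is used
    },
}

# A credits page could surface this to listeners:
disclosure = track_credits["ai_disclosure"]
if disclosure["ai_used"]:
    print("AI involvement disclosed:", "; ".join(disclosure["roles"]))
```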
What This Means for Human Artists and Their Royalties
So, what does this crackdown actually mean for human artists and their royalties? In theory, it’s a huge win. Spotify’s payment system works like a giant shared pool. All the money from subscriptions and ads goes into one pot, and artists get paid based on their percentage of total streams. Every fake stream from a bot or AI-generated track dilutes that pool, lowering the value of each real stream and taking money directly from legitimate artists. By removing 75 million spam tracks and filtering out new ones, Spotify is trying to make sure that the royalty money goes to the people who actually earned it, which should lead to a higher per-stream payout for human creators.[3]
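A toy calculation makes the dilution concrete. All figures here are invented for illustration; they are not Spotify’s actual pool size or stream counts:

```python
# Toy illustration of pro-rata royalty dilution; every figure is invented.

royalty_pool = 1_000_000.00      # total subscription + ad revenue to distribute
real_streams = 100_000_000       # streams by genuine listeners
fake_streams = 25_000_000        # bot and spam-track streams added to the pot

honest_rate = royalty_pool / real_streams                    # $0.010000 per stream
diluted_rate = royalty_pool / (real_streams + fake_streams)  # $0.008000 per stream

# An artist with 1 million legitimate streams loses the difference.
artist_streams = 1_000_000
print(f"lost royalties: ${(honest_rate - diluted_rate) * artist_streams:,.2f}")  # $2,000.00
```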
The Community Backlash
While these changes are needed, the rollout has been messy, and many artists feel they are being unfairly punished.
Listeners on forums like Reddit have been complaining for a while. Many are tired of their playlists being “entirely replaced by slop” and want a simple filter to block all AI-generated music from their feeds. There’s also a long-held suspicion that Spotify itself promotes “fake artists” on its own playlists to lower royalty payouts, which fuels distrust.[4]
But the biggest outcry has come from independent artists. Many have been falsely accused of “artificial streaming” and had their music removed[5] after being added to bot-driven playlists without their knowledge. In some cases, artists have even been flagged after getting a stream spike from Spotify’s own algorithmic playlists, like Release Radar.
This creates a nightmare for artists. Spotify’s policy can fine distributors like DistroKid and TuneCore for fraudulent tracks. The distributors, in turn, often pass the penalty down to the artist, sometimes removing their entire catalog. This system punishes the artist, who is often the victim, instead of the person running the fraudulent playlist.
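One obvious mitigation, which the artist complaints above suggest is not yet working reliably, is to discount stream sources the artist does not control before flagging anyone. The sketch below is a hypothetical illustration; the source labels, threshold, and scoring are all assumptions, not a description of any real system:

```python
# Hypothetical fraud scoring that ignores streams the artist doesn't control,
# such as platform-curated playlists. All names and numbers are assumptions.

PLATFORM_SOURCES = {"release_radar", "discover_weekly", "editorial_playlist"}

def suspicious_stream_share(streams: list[dict]) -> float:
    """Fraction of artist-attributable streams that look bot-driven."""
    attributable = [s for s in streams if s["source"] not in PLATFORM_SOURCES]
    if not attributable:
        return 0.0  # a spike that comes entirely from Release Radar scores zero
    flagged = [s for s in attributable if s["bot_likelihood"] > 0.8]
    return len(flagged) / len(attributable)

def should_flag_artist(streams: list[dict], threshold: float = 0.5) -> bool:
    """Only penalize when the artist-attributable share of bot-like streams is high."""
    return suspicious_stream_share(streams) >= threshold
```

Whether distributors pass penalties down to artists is a policy question rather than a detection one, but better attribution at this stage would at least shrink the pool of innocent accounts that get flagged in the first place.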
The Road Ahead
Spotify’s crackdown is a major step in the fight for a fair music ecosystem. The platform is trying to balance supporting new technology while protecting artists from bad actors. However, the stories from independent artists show that the automated systems are far from perfect.
This is an industry-wide problem, with streaming fraud estimated to cost artists billions of dollars every year. The challenge for Spotify and other platforms will be to refine their tools to be more precise, punishing the actual fraudsters without catching innocent creators in the crossfire. The future of music depends on getting that balance right.
1. https://www.theguardian.com/music/2025/sep/25/spotify-removes-75m-spam-tracks-past-year-ai-increases-ability-make-fake-music
2. https://www.arkoselabs.com/explained/what-is-music-streaming-fraud/
3. https://news.ycombinator.com/item?id=11045504
4. https://www.reddit.com/r/truespotify/comments/1nq5adq/spotify_is_finally_taking_steps_to_address_its_ai/
5. https://www.reddit.com/r/SpotifyArtists/comments/1ie3lct/help_spotify_removing_my_song_for_fake_streams/
Comments
Too little, too late.
Even if they cut half the AI slop, I don’t trust their “conservative” rollout to be anything other than cutting non-Spotify-sanctioned muzak, which already eats into the massive portion of artists’ proceeds that they’re laundering to pocket themselves, and have been for years.
This is pandering (along with CEO stepping aside—not down), and trying desperately to save some face while it all crashes down around them.