Imagine being a fan of a musician who died decades ago, only to see a notification that they’ve just dropped a new single. That’s exactly what happened in the summer of 2025. According to Spotify, Blaze Foley, a beloved country singer-songwriter murdered in 1989, had just released a new song called “Together.” [1]
But something was wrong. The track was a generic, slow country tune that sounded nothing like Foley’s raw, distinctive style. The cover art was even stranger: an AI-generated image of a young, blonde man who looked nothing like the real Blaze Foley.
Craig McDonald, the owner of Lost Art Records and the manager of Foley’s entire musical legacy, knew immediately it was a fake. “I can clearly tell you that this song is not Blaze, not anywhere near Blaze’s style, at all,” he said, calling the track an “AI schlock bot” with “the authenticity of an algorithm.” Fans were equally outraged, with one commenting that Foley “would absolutely hate that his name is tied to anything like this.”
This wasn’t a one-off glitch. A similar AI-generated track, “Happened to You,” briefly appeared on the page of another late country legend, Guy Clark. These incidents reveal a disturbing new trend: using AI to create fraudulent music and attaching it to the names of deceased, niche artists to exploit their dedicated fanbases for streams and royalties. [2]

How Do Fake Songs Get on Spotify?
You might think it would be hard to upload a fake song to a major artist’s official page, but the system is surprisingly easy to game. Artists and labels don’t upload music directly to Spotify. Instead, they use third-party distributors (companies like DistroKid or TuneCore) that handle the process for a fee. [3]
The problem is that some of these distributors have weak verification. In the case of the Foley and Clark songs, Spotify blamed a distributor called SoundOn, which is owned by TikTok. However, the copyright for the fake tracks was credited to a mysterious entity named “Syntax Error,” which has been linked to other suspicious AI uploads. [4]
This means a bad actor, hiding behind a fake name, was able to use a distributor’s platform to push fraudulent content onto Spotify. Because Spotify’s system is built for speed and scale, it automatically trusted the information it received and placed the songs on the official artist pages without any secondary checks.
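To make that gap concrete, here is a minimal Python sketch of a metadata-trusting ingestion step. Everything in it is hypothetical (the `Delivery` record, the catalog lookup, the page IDs); it illustrates the class of weakness described above, not Spotify’s actual pipeline.

```python
# Hypothetical sketch of metadata-trusting ingestion. None of these
# names reflect a real platform's schema; they illustrate the weakness,
# not Spotify's actual code.
from dataclasses import dataclass

@dataclass
class Delivery:
    distributor: str
    artist_name: str        # free text, supplied by the uploader
    track_title: str
    copyright_holder: str   # also free text: "Syntax Error" went here

# Existing official artist pages, keyed only by name.
CATALOG = {"Blaze Foley": "artist-page-001"}

def ingest(delivery: Delivery) -> str:
    # The weak link: the uploader's free-text artist name is matched
    # straight to an existing page, with no proof that the uploader
    # actually represents that artist or estate.
    page = CATALOG.get(delivery.artist_name)
    if page is None:
        page = f"new-page:{delivery.artist_name}"
    return page  # the track goes live here with no second check

fake = Delivery("SoundOn", "Blaze Foley", "Together", "Syntax Error")
print(ingest(fake))  # -> artist-page-001: lands on the real artist's page
```

A real pipeline is far more elaborate, but any system that keys releases to artist pages by distributor-supplied metadata alone has this same shape.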
Craig McDonald places the blame firmly on the streaming giant. “It’s kind of surprising that Spotify doesn’t have a security fix for this type of action, and I think the responsibility is all on Spotify,” he argued. “They could fix this problem. One of their talented software engineers could stop this fraudulent practice in its tracks, if they had the will to do so.”
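McDonald’s claim that an engineer “could stop this fraudulent practice in its tracks” is plausible in outline. Here is a sketch of one such guard, under loudly labeled assumptions: that the catalog records a date of death and a rights holder of record. Neither field is confirmed to exist in any platform’s real schema.

```python
# Hypothetical secondary check: hold suspicious releases for manual
# review instead of auto-publishing. The metadata fields assumed here
# are illustrative, not a documented platform schema.
from datetime import date

ARTIST_META = {
    "Blaze Foley": {
        "died": date(1989, 2, 1),
        "rights_holder": "Lost Art Records",
    },
}

def needs_review(artist_name: str, claimed_rights_holder: str) -> bool:
    meta = ARTIST_META.get(artist_name)
    if meta is None:
        return False  # no existing page to hijack
    # Flag any release attributed to a deceased artist, or one whose
    # claimed rights holder doesn't match the catalog's record.
    return (meta["died"] is not None
            or claimed_rights_holder != meta["rights_holder"])

print(needs_review("Blaze Foley", "Syntax Error"))  # -> True: hold for review
```

A check like this would not catch everything, but it would have held both the Foley and Clark uploads for a human to look at before they went live.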
The Law is Playing Catch-Up
This digital grave-robbing falls into a legal gray area. Standard copyright law protects specific songs, but not an artist’s general “style” or vocal sound. This makes it difficult to sue over a new song that just sounds like a famous artist.
The more relevant legal tool is the “Right of Publicity,” which protects a person’s name and likeness from being used for commercial gain without permission. The issue is that there’s no federal law for this; it’s a messy patchwork of state laws that vary wildly on whether these rights even exist after an artist dies.
In response, new laws are being proposed. Tennessee recently passed the ELVIS Act to specifically protect artists’ voices from AI fakes, and the federal NO FAKES Act aims to create a national standard. But until these laws are widespread, artists’ legacies remain vulnerable.
An Industry at War with AI
The music industry is fighting back on multiple fronts. Over 200 major artists and estates, including those of Frank Sinatra and Bob Marley, have signed open letters demanding protection from unethical AI.
Meanwhile, major record labels like Universal, Sony, and Warner have filed massive lawsuits against AI music companies Suno and Udio. The suits allege that these companies committed mass copyright infringement by training their AI models on huge libraries of popular music without permission or payment. The AI companies claim their actions are protected under “fair use,” but the outcome of these legal battles could reshape the future of AI. [5]
The fraudulent songs appearing on Spotify are a stark reminder of what’s at stake. They represent a violation of artistic legacy and a betrayal of fan trust. While the industry fights its battles in court and Congress, the responsibility falls on platforms like Spotify to protect the very artists, living and dead, that their business is built on. Without stronger verification and a commitment to authenticity, the ghosts in the machine will only continue to multiply.
Sources

1. https://www.reddit.com/r/Music/comments/1m5sm4c/spotify_publishes_aigenerated_songs_from_dead/
2. https://consequence.net/2025/07/spotify-ai-generated-songs-dead-artists/
3. https://soundcamps.com/blog/how-to-upload-music-to-spotify-as-an-artist/
4. https://bluntmag.com.au/news/spotify-publishes-ai-generated-songs/
5. https://www.soundverse.ai/blog/article/ai-music-generators-ethical-innovation-or-legal-nightmare