Fake AI-generated tracks uploaded to the verified Spotify pages of dead artists are raising alarm over legacy rights. Spotify has removed tracks like “Together,” but scrutiny of the platform’s safeguards is intensifying.

AI Imposters on Legacy Artist Pages
Spotify is facing fresh scrutiny after several seemingly AI-generated songs were uploaded under the verified artist pages of deceased musicians—without consent from estates or rights holders. The most prominent case involved Blaze Foley, a country singer who died in 1989. A track titled “Together” appeared as a new release on his official Spotify page, complete with cover art depicting an unrelated young man.
Craig McDonald, manager of Foley’s estate and the Lost Art Records label, condemned the upload, calling it “an AI schlock bot” that distorts Foley’s legacy. He emphasized that any fan would immediately recognize the track as fake. The song was eventually removed, but McDonald criticized Spotify’s lack of proactive protections.
Beyond Foley: A Wider Pattern
This issue extends beyond a single incident. Similar AI-generated songs were uploaded under the names of other deceased artists like Guy Clark (“Happened To You”) and Dan Berk (“With You”), each branded with the same mysterious copyright holder, Syntax Error.
These tracks combined AI-generated vocals, cover art, and metadata, allowing them to pass as legitimate entries in the artists’ official discographies.
Spotify’s Response & Root Issues
Spotify swiftly removed the unauthorized content, citing violations of its deceptive content policy, particularly the rules against impostor tracks impersonating real individuals. The uploads were traced to SoundOn, a TikTok-owned distributor—highlighting the platform’s reliance on an honor system among third-party distributors.
But removal alone wasn’t enough. Critics argue that Spotify’s upload pipelines lack effective vetting, leaving AI-generated uploads free to continue unchecked.
Ethical, Legal & Economic Impacts
The controversy raises issues on several fronts:
- Cultural & ethical harm: Mislabeling AI-generated music as part of an artist’s legacy distorts public memory.
- Economic dilution: Royalty pools are finite. Fake AI tracks siphon earnings away from real artists or their estates.
- Legal ambiguity: Current copyright law struggles to address unauthorized AI impersonation. Future regulation, watermarking, or AI labeling mandates may offer solutions.
Calls for Transparency & Verification
Industry groups are urging:
- Mandatory disclosure of AI-generated content.
- Stronger vetting tools to flag deceptive uploads.
- Real-time verification tied to artist estates or labels.
Some platforms, such as ROKK, have implemented human-reviewed systems to identify AI uploads—an approach Spotify could adopt to reduce fraudulent content.
Why It Matters: Legacy Integrity at Risk
For estate managers and fans alike, music is more than data—it’s memory. These AI tracks not only mislead listeners but can permanently alter how an artist’s catalogue is consumed and understood.
Spotify’s reactive stance—acting only after complaints forced a removal—raises deeper questions: How many more fake uploads exist? How many listeners have already been misled? And at what point does this digital erosion cross into cultural vandalism?