Musicians Battle AI Imposters on Streaming Platforms
16 Dec
Summary
- AI-generated albums are appearing on artists' profiles unnoticed.
- Scammers use distribution companies to upload fake music for royalties.
- Artists are demanding better security from streaming platforms.

Musicians are increasingly finding AI-generated music fraudulently placed on their official streaming profiles. British folk artist Emily Portman learned from a fan of a new album, "Orca," which mimicked her style but was AI-produced, apparently trained on her prior work. Similarly, Australian musician Paul Bender found four AI-generated songs on his band's profile, describing them as "bizarrely bad."
The perpetrators are believed to exploit a loophole: some distribution companies upload music to streaming platforms without stringent identity verification. This lets scammers collect royalties from the fake tracks, with the small per-stream payouts helping the fraud go unnoticed. As AI music generators grow more sophisticated, these fakes are increasingly difficult for listeners to distinguish from genuine recordings.
In response, artists like Bender have launched petitions, which have gained significant traction among fellow musicians and fans, urging platforms such as Spotify and Apple Music to implement stronger security measures. While some platforms have begun working with distributors to detect fraud, many artists still feel vulnerable given the gaps in current copyright law and the ease of this emerging scam.
