
Paul Bender’s music has been sampled by Beyoncé and Kendrick. His band, Hiatus Kaiyote, has three Grammy nominations. His side project, The Sweet Enoughs, racks up millions of streams. So it came as a shock when fans started hearing tracks on his Spotify profile that he didn’t recognise — or approve.
Tracks that sounded like they’d been composed by an AI trapped in an elevator.
“It was probably the worst attempt at music I’ve ever heard,” Bender told Brisbane Times. “Just absolutely cooked.” His reaction soon gave way to a grim realisation: someone was uploading fake music — apparently AI-generated — directly to his artist profile. And it wasn’t just Spotify. Apple Music, Tidal, YouTube Music and Deezer all carried the same fakes.
No passwords were stolen. No logins compromised. Just a gaping hole in the music distribution supply chain.
The Loophole That Became a Business Model
The scam works like this: a grifter uploads garbage tracks via a digital music distributor, assigns them to a known artist name, and — voilà — the platform “maps” them to the artist’s official profile. Instant legitimacy, with algorithmic discovery to match.
No ID check. No consent. No authentication.
This isn’t just a quirk of one platform’s back end. It’s systemic. And it’s being exploited on an industrial scale. One vlogger, TankTheTech, showed how anyone can assign AI music to an artist profile in under ten minutes.
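The flaw TankTheTech demonstrated boils down to matching uploads to profiles by artist-name string alone. A minimal sketch of that failure mode (purely illustrative — the store, function, and names here are hypothetical, not any platform’s actual code):

```python
# Hypothetical catalog: artist name -> list of mapped tracks.
catalog = {}

def ingest(artist_name: str, track: str) -> None:
    """Map an upload to a profile by name only.
    No identity check: the uploader merely *asserts* the artist name,
    so an impersonator lands on the same profile as the real artist."""
    catalog.setdefault(artist_name, []).append(track)

ingest("Hiatus Kaiyote", "Genuine Single")       # the band's real distributor
ingest("Hiatus Kaiyote", "AI Elevator Loop #7")  # anyone else at all
```

Both tracks end up under the same profile key — which is exactly the “mapping issue” artists are complaining about.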
And the numbers are staggering:
- Deezer reports that 18% of its daily uploads in 2025 are AI-generated.
- Mubert, an AI music tool, claims over 100 million tracks were made on its platform in just the first half of 2023.
- The Music Fights Fraud Alliance estimates 10% of all global music streams are fraudulent, with some distributors seeing fraud rates as high as 50%.
That’s not fringe — it’s a revenue model. And it’s bleeding real artists.
Legal Implications: Between Passing Off and Platform Apathy
Let’s be clear: uploading fake music under someone else’s name looks a lot like impersonation, if not passing off, especially where artist reputation and income are at stake. There may also be:
- Copyright infringement, if elements of an artist’s work were used in training or replication.
- Moral rights violations under the Copyright Act 1968 (Cth), especially the right of integrity where a fake work is falsely attributed.
- Misleading or deceptive conduct under section 18 of the Australian Consumer Law.
Yet despite the legal exposure, platforms and distributors are playing hot potato with responsibility. Spotify calls it a “mapping issue.” Artists call it what it is: a scam that platforms are structurally enabling.
Why This Matters — Beyond Music
This isn’t just a niche concern for indie musicians. It’s a case study in what happens when:
- AI-generated content floods creative ecosystems,
- platforms prioritise volume over verification,
- and IP rights become an afterthought to scale.
In short, it’s the algorithm’s world — and creatives are just living in it.
But not quietly. Artists like Bender and Michael League (of Snarky Puppy) are now speaking out and pushing for industry action. With growing numbers of testimonials and escalating complaints, the music world may be the canary in the coal mine for a broader wave of AI impersonation and platform indifference.
Until then, don’t be surprised if the next time you hit play on a favourite artist’s profile… what comes out is 100% algorithm, 0% soul.
Here’s a thought: require verified artist consent before allowing uploads to a mapped profile. Verify before you amplify!
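What “verify before you amplify” could look like in practice — a hedged sketch, assuming a hypothetical distributor pipeline where each artist profile keeps an allow-list of authorised uploader accounts (all class, field, and function names below are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class ArtistProfile:
    name: str
    # Distributor accounts the artist has explicitly authorised
    # (hypothetical field; real platforms expose no such allow-list today).
    authorised_uploaders: set = field(default_factory=set)
    tracks: list = field(default_factory=list)
    pending_review: list = field(default_factory=list)

def map_upload(profile: ArtistProfile, uploader_id: str, track: str) -> str:
    """Publish to the profile only if the uploader is on the artist's
    allow-list; otherwise hold the track for manual review instead of
    releasing it under the artist's name."""
    if uploader_id in profile.authorised_uploaders:
        profile.tracks.append(track)
        return "published"
    profile.pending_review.append((uploader_id, track))
    return "held_for_review"
```

The design choice is simply to fail closed: an unrecognised uploader lands in a review queue rather than on the artist’s public profile, which is the inverse of the current default.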

