How AI-Generated Music Became A $4 Billion Fraud Machine (Forbes, 5/5)
https://www.forbes.com/sites/virginieberger/2026/05/05/how-ai-generated-music-became-a-4-billion-fraud-machine/

Generative AI made it easy to produce and distribute fraudulent music at scale. A CISAC and PMP Strategy study projects that nearly 25% of creators' revenues are at risk by 2028, potentially amounting to four billion euros.
-snip-
In April 2026, Deezer reported receiving 75,000 fully AI-generated tracks per day, representing 44% of all daily uploads, more than two million tracks per month. Of the streams those tracks generate, 85% are fraudulent. Thibault Roucou, Deezer's head of streaming, stated it directly in Music Week: "Generating fake streams continues to be the main purpose for uploading AI-generated music."
-snipping to get to this paragraph about fraudsters-
They now use AI generators to flood platforms with millions of tracks and stream each one just a few thousand times, enough to generate royalties from each but not enough to trigger detection systems tuned for high-volume replay.
As Melissa Morgia, Chief Global Content Protection Officer at IFPI, told a panel on the sidelines of the seventeenth session of WIPO's Advisory Committee on Enforcement in February 2025, AI is the ultimate enabler of streaming fraud because it allows bad actors to stay under the radar but still operate at a sufficient scale that their activities are lucrative.
-snip-
Generative AI seems to be better at fraud than anything else.
It's important for DUers to keep in mind that when they run into AI slop online, it's not only usually trashy, error-filled, and unethical as hell (since the AI was trained on stolen intellectual property), but there's also a good chance professional fraudsters are behind it. That goes for AI music, AI videos, and AI images. Many of the AI slop images and videos on Facebook and YouTube, for instance, come from content farms, and although that Forbes article doesn't use the term, churning out masses of AI music tracks to be streamed by bots is content-farm work too.
Professional criminals, in other words. Not usually well-meaning individuals who are naive or desperate enough, for whatever reason, to use AI.
And the fact that professional fraudsters are so heavily involved in AI slop is another reason DUers shouldn't give it any attention, especially by copying it here or elsewhere, if they don't know exactly who's behind a given slop video, image, or music track.
As for the individual AI users who don't mean to be fraudsters: they should know better than to use generative AI to create content. Sometimes they're naive enough not to know how AI is trained. Sometimes they think a lofty goal outweighs using unethical tools. They should be reminded that no matter how well-intentioned they are, using those AI tools is a bad idea that hurts their message. Ideally they'll stop using genAI, because it's beneath them and entangles their message with AI-bro, pro-AI messaging.
Anyway, if you don't know who's creating and posting AI slop, it's most reasonable to assume they're just fraudsters out to make a quick buck and steal attention and income from real artists - human artists.
In April 2026, Deezer reported receiving 75,000 fully AI-generated tracks per day, representing 44% of all daily uploads, more than two million tracks per month. Of the streams those tracks generate, 85% are fraudulent.
85% are fraudulent. Stunning number. And with statistics like that, AI slop from unknown sources does not deserve the benefit of the doubt. It should be shunned - not trusted, and not shared.
Posted by highplainsdem (OP), Yesterday