Deezer might not be the first name that comes to mind when we think of music streaming, but it’s still got skin in the game. The platform receives a huge number of uploaded tracks every day, and according to a new press release, “nearly 75,000” of those are fully gen AI-created tunes.
That’s 44% of daily uploads, according to the company (via Ars Technica). Deezer says that its AI detection measures mean that a mere one to three percent of said music is actually consumed by end users, and that 85% of the AI-created tunes are detected as fraudulent and demonetized.
However, Deezer also points towards its “world’s first” survey into perceptions and attitudes towards gen AI music commissioned late last year, with a total of 9,000 participants. The company reports that 97% of those asked couldn’t tell the difference between fully AI-generated music and its human-made equivalent in a blind test, and that’s… well, troubling.
80% of survey participants agreed that AI-generated music should be clearly labelled for listeners, while 52% of respondents “feel that 100% AI-generated songs should not be included in charts alongside human-made songs”. I didn’t participate in the survey, but… yeah. I’ll say.
As a former semi-professional musician myself, I’d posit that the paid minstrels among us might need even more protection from generative AI than your average worker.
It’s a tough game, creating music for a living. Streaming services often pay artists relatively small amounts for their work as it is, and if the music of human artists must now compete against vast numbers of AI-created tunes that can fool many listeners, it’s not looking good for your average bard.
Of course, there’s a devil’s advocate argument to be made here. If AI-generated music can fool listeners to such a high degree, then it perhaps further devalues the need for those who write it from scratch. I guess we need to decide as a society whether we value the fruits of human creation, and…
I’m getting off topic. Deezer’s AI-generated music detection tools are now being licensed to the wider music industry to help prevent the spread, and the company says that, as the purpose of uploading these tracks seems to be to make money, it is essentially fraudulent behaviour.
It describes the spread of AI music as “a critical challenge for the music industry”, and cites a CISAC study estimating that nearly 25% of creators’ revenues will be at risk by 2028, which could amount to as much as $4 billion by that time.
It’ll be a sad state of affairs indeed, if all this comes to pass. If you need me, I’ll be in the corner, plucking gently at my guitar and lamenting the potential fall of the creative industry I adore. Anyone know any happy tunes?
