A survey published last week suggested 97% of respondents could not spot an AI-generated song. But there are some telltale signs - if you know where to look.
Here’s a quick guide …
- No live performances or social media presence
- ‘A mashup of rock hits in a blender’
A song with a formulaic feel - sweet but without much substance or emotional weight - can be a sign of AI, says the musician and technology speaker, as can vocals that feel breathless.
- ‘AI hasn’t felt heartbreak yet’
“AI hasn’t felt heartbreak yet… It knows patterns,” he explains. “What makes music human is not just sound but the stories behind it.”
- Steps toward transparency
In January, the streaming platform Deezer launched an AI detection tool, followed this summer by a system which tags AI-generated music.



AI imitates an overall sound, but it doesn’t care much about individual “instruments”. For simple, minimal segments it can easily lay down a clear beat or melody, but the more that gets added, the muddier and more generic the sound becomes. And if you’re familiar enough with a given instrument, it can often just sound “wrong” - again, because the AI is imitating a sound, not an instrument.
But yeah. The other points stand. Social media presence and output are great indicators.
Midnight Darkwave is one I’m highly suspicious of. Super generic name. Not much presence beyond the streaming sites. I like the overall sound, but it often gets muddy and kind of droning - and not in the coldwave sort of way, but something a bit more inhuman, overprocessed, and mechanical.
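If you want to put a very rough number on that “muddy” quality, one crude proxy is spectral contrast - the gap between the peaks and valleys of the spectrum. A dense wash of sound tends to score lower than a mix where instruments stay distinct. Here’s a minimal Python sketch using librosa; the file name and the threshold are made up, and this is not how Deezer or anyone else actually flags AI tracks.

```python
# Purely illustrative sketch: uses librosa's spectral-contrast feature as a
# crude proxy for the "muddy" quality described above. Low contrast between
# spectral peaks and valleys roughly means everything is blurring into a wash.
# "track.mp3" and the 18 dB cutoff are made-up examples, not real values
# anyone uses to detect AI-generated music.
import numpy as np
import librosa

def mean_spectral_contrast(path: str) -> float:
    """Average spectral contrast (in dB) across all bands and frames."""
    y, sr = librosa.load(path, sr=None, mono=True)
    contrast = librosa.feature.spectral_contrast(y=y, sr=sr)  # shape: (bands, frames)
    return float(np.mean(contrast))

if __name__ == "__main__":
    score = mean_spectral_contrast("track.mp3")  # hypothetical file
    print(f"mean spectral contrast: {score:.1f} dB")
    if score < 18.0:  # arbitrary cutoff, for illustration only
        print("dense, low-contrast mix - could be 'muddy', could just be the genre")
```

A low score obviously doesn’t mean AI - plenty of human shoegaze and wall-of-sound production would trip it too. It’s just one way to quantify what your ears are already telling you.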
I have used Suno quite extensively just for fun: I insert my own lyrics and let it create different styles and beats, and you have to push out like 30 before it does something actually decent, but some of them are fucking bangers. I consider it like watching visualizations in WinAmp.
I am not stating a moral proposition in either direction, just an observation.