Spotify has been hit with another AI controversy after publishing computer-generated songs under the names of dead musicians.
An investigation by 404 Media found that Spotify is releasing AI-generated songs on the pages of deceased artists — without approval from their estates or labels.
One such track, “Together,” recently appeared on the official page of Blaze Foley, a country singer who was murdered in 1989. The song sounds vaguely similar to Foley’s style, but the accompanying image features a young blond man who looks nothing like him.
404 Media linked the track to a company account called Syntax Error, which was also responsible for several other apparently fabricated tracks. Among them was “Happened To You,” a song supposedly performed by Grammy-winning country singer-songwriter Guy Clark, who died in 2016.
Spotify removed the unauthorised tracks after 404 Media’s report was published. However, while this is a particularly grim example of AI-generated music on the Swedish streaming platform, it’s not the first — and it’s unlikely to be the last.
Last month, an AI-generated band called the Velvet Sundown popped up on Spotify. Its top track, “Dust on the Wind” — which sounds similar to the 1977 Kansas hit “Dust in the Wind” — has been played almost 2 million times since its release on June 20.
Velvet Sundown’s Spotify bio now describes the band as a “synthetic music project,” but the platform doesn’t label the tracks — or any other music — as AI-generated.
Daniel Ek, Spotify’s CEO, has taken a consistently laissez-faire approach to AI-generated content. Ek previously said that tracks created with AI were fair game on the platform — unless they mimicked real artists. According to several reports, however, Spotify is doing a lousy job of identifying and removing even these AI imitations.
The rise of AI-generated music on Spotify has sparked widespread backlash for several reasons. One involves the frequent use of AI tools like Suno or Udio, which generate entire tracks based on a simple text prompt. While the companies behind them claim that training their models on copyrighted music falls under “fair use,” opponents argue it amounts to copyright infringement. Critics also warn that AI-generated tracks compete for streams, reducing the share of royalties available to human artists.
Sophie Jones, the chief strategy officer at the music trade body the British Phonographic Industry (BPI), called for new protections in an interview with the Guardian last week.
“The rise of AI-generated bands and music entering the market points to the fact that tech companies have been training AI models using creative works — largely without authorisation or payment to creators and rights-holders — in order to directly compete with human artistry,” she said.
Another concern is that deceiving listeners with AI-generated profiles and songs is a form of misinformation that risks ruining the reputation of human artists.
To give creators a fair shot, Jones and others argue that streaming platforms should start by clearly labelling AI-generated content. That’s a move pioneered by rival streaming app Deezer, which has developed an algorithm that can identify artificially created songs made using several popular generative AI models, including Suno and Udio.
“AI is not inherently good or bad, but we believe a responsible and transparent approach is key to building trust with our users and the music industry,” Deezer’s CEO Alexis Lanternier said in June.
“We are also clear in our commitment to safeguarding the rights of artists and songwriters at a time where copyright law is being put into question in favour of training AI models.”