Spotify has seen a sharp rise in AI-generated songs on its platform. In response, it is introducing new policies and building a system that shows when AI was used in making a track.
The company is starting with stricter rules against impersonation, giving artists recourse when someone uploads AI-generated music under their name without permission. Spotify is also improving its content-mismatch system to catch fraudulent uploads before they go live.
Next, Spotify is adding a music spam filter that flags accounts uploading large volumes of AI-generated tracks and keeps those tracks out of recommendations. The goal is to make sure artists who follow the rules get fair exposure and payouts.
Spotify is also teaming up with DDEX, the music industry's metadata standards body, to create a new standard that describes exactly how AI was used in a song. The idea is not to ban AI but to be transparent with listeners about its role. Fifteen labels and distributors have already agreed to adopt the new standard.
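To make the idea of per-track AI disclosure concrete, here is a minimal sketch of what such a metadata record could look like. The field names (ai_generated_vocals, ai_assisted_mixing, and so on) are illustrative assumptions, not fields from the actual DDEX specification, which has not been published in detail here.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch only: field names are illustrative and do not
# reflect the real DDEX standard.
@dataclass
class AIUsageDisclosure:
    track_id: str                               # identifier of the track being described
    ai_generated_vocals: bool = False           # vocals produced by an AI model
    ai_generated_instrumentation: bool = False  # backing track produced by an AI model
    ai_assisted_mixing: bool = False            # AI tools used in mixing or mastering
    notes: List[str] = field(default_factory=list)  # free-text context for listeners

# Example: a track with AI-generated instrumentation but human vocals.
disclosure = AIUsageDisclosure(
    track_id="TRK-0001",
    ai_generated_instrumentation=True,
    notes=["Instrumental generated with an AI composition tool; vocals recorded by the artist."],
)
print(disclosure)
```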
The broader aim is to protect real artists while still embracing AI tools, such as Spotify's own AI DJ feature, as AI's role in music keeps growing.