SoundCloud has publicly clarified its stance on the use of artist content for AI training, stating that it “has never used artist content to train AI models” and is now making a formal commitment to consent, transparency, and artist control when it comes to AI technologies on its platform.
The move comes after mounting concern from artists who discovered vague language in SoundCloud’s February 2023 Terms of Use, which appeared to grant the company rights to use uploaded music and other content to train machine learning models, including generative AI.
The problematic clause stated: “In the absence of a separate agreement that states otherwise, You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.”
SoundCloud CEO Eliah Seton acknowledged the issue in a statement: “The language in the Terms of Use was too broad and wasn’t clear enough. It created confusion, and that’s on us.”
To address this, SoundCloud will soon replace that clause with more restrictive and transparent language: “We will not use Your Content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism.”
SoundCloud Says: No AI Training Without Artist Consent
Seton reiterated that SoundCloud has never used member content to train generative AI, including large language models built for music synthesis or replication. If the company ever leverages AI tools in the future, Seton says, any such use will be strictly opt-in and limited to artists who explicitly provide consent.
A SoundCloud spokesperson also told The Verge that any future AI initiatives would be built with human artists in mind, and that artists would have full control over participation.
Some Critics Say the New Language Still Isn’t Enough
Despite the update, not everyone is convinced. Ed Newton-Rex, the AI ethics advocate who originally flagged the issue, called the revised wording insufficient. In a post on X, he argued that the new terms could still permit SoundCloud to train AI on artist content—as long as the goal isn’t to mimic an artist’s “voice, music, or likeness.”
“If they actually want to address concerns, the change required is simple,” Newton-Rex wrote. “It should just read: ‘We will not use your content to train generative AI models without your explicit consent.’”
SoundCloud’s policy update is the latest in a series of moves by music platforms, labels, and creators responding to growing pressure over AI-generated music and unauthorized data usage. As generative tools advance and questions about voice replication and training-data ethics intensify, platform trust and clear consent mechanisms are becoming central to the artist-platform relationship.
For now, SoundCloud appears to be drawing a line: no artist content will be used for AI training without a clear, affirmative opt-in. Whether that satisfies the wider community—or influences competitors—remains to be seen.