The Business & Technology Network
Helping Business Interpret and Use Technology
Who Made This Song? Platforms Start to Label AI Music

DATE POSTED: March 6, 2026

An artificial intelligence (AI)-generated country song reached No. 1 on Billboard’s Country Digital Song Sales chart last fall, and few listeners knew the song was created by a machine.

The track, “Walk My Walk,” was released under the name Breaking Rust, a fictional artist with an AI-generated cowboy persona, generic lyrics about perseverance, and an Instagram page that never disclosed its synthetic origins.

It has become increasingly difficult to tell whether a piece of music was made by a human or by AI, and to what extent.

Apple Music moved this month to address that gap directly. According to a Wednesday (March 4) TechCrunch report, Apple sent a newsletter to industry partners announcing Transparency Tags, a metadata framework covering four content categories: Track, Composition, Artwork and Music Video. Labels and distributors can apply the tags immediately, with Apple noting that the requirements will eventually become mandatory for new content.
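Apple has not published a schema for Transparency Tags, so as an illustration only, disclosure metadata along the lines the report describes could be modeled as below. The field names, values and helper function are invented for this sketch; the only details taken from the report are the four content categories and the fact that the tags are optional.

```python
# Hypothetical sketch of per-release transparency metadata.
# Apple has not published the Transparency Tags schema; field names
# and values here are invented for illustration only.
from typing import Optional

# The four content categories Apple's newsletter reportedly covers.
CATEGORIES = ("track", "composition", "artwork", "music_video")

def summarize_disclosure(tags: dict[str, Optional[bool]]) -> dict[str, str]:
    """Map each category to 'AI', 'human', or 'undisclosed'.

    A missing tag yields 'undisclosed' -- since the tags are optional,
    an omitted tag makes no claim either way.
    """
    summary = {}
    for category in CATEGORIES:
        value = tags.get(category)  # missing key -> None -> undisclosed
        if value is None:
            summary[category] = "undisclosed"
        else:
            summary[category] = "AI" if value else "human"
    return summary

# Example: a release disclosing an AI-generated track and artwork,
# a human-written composition, and nothing about a music video.
release_tags = {"track": True, "composition": False, "artwork": True}
print(summarize_disclosure(release_tags))
# {'track': 'AI', 'composition': 'human', 'artwork': 'AI', 'music_video': 'undisclosed'}
```

The sketch makes the structural point concrete: a voluntary, per-category flag with no default is easy to apply but equally easy to omit.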

How AI Is Reshaping Music Production and Distribution

The technical barriers to creating and distributing AI-generated music have effectively collapsed. Tools such as Suno and Udio allow users to generate full tracks from text prompts, with outputs that can be indistinguishable from human recordings to the untrained ear.

Those tracks can be uploaded through standard music distribution services, the same aggregators labels and independent artists use, and appear in streaming catalogs alongside human-made work, carrying identical metadata fields and monetization rights.

Breaking Rust attracted over 2 million monthly listeners on Spotify, where it was listed as a verified artist despite not having a biography. Several of its songs were played more than a million times, and one single exceeded 4.5 million streams. The creator never publicly identified themselves.

Deezer, which has invested in its own AI detection infrastructure, reported in January that it receives more than 60,000 fully AI-generated tracks a day, up from 10,000 when it first deployed its detection tool in early 2025.

Synthetic content now makes up roughly 39% of all music delivered to the platform daily, according to a Music Business Worldwide report. More significantly, Deezer found that up to 85% of streams of AI-generated music in 2025 were fraudulent, generated to game royalty payouts rather than reflecting genuine listener demand. The transparency problem and the fraud problem are, in practice, the same problem.

Platforms Reach for Disclosure Frameworks With Limits

Apple framed Transparency Tags as “a concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone.”

The limitation is self-evident: Apple is asking the parties responsible for uploading synthetic content to voluntarily declare it. The technical specification describes the tags as optional for now, and an omitted tag makes no claim either way. There is no independent verification, no detection layer and no enforcement mechanism for labels or distributors that choose not to disclose.

Spotify’s stance covers similar ground through a different rhetorical frame. Co-CEO Gustav Söderström said on the company’s Feb. 10 fourth-quarter earnings call that the platform should not police creative tools: “Spotify should not decide what kind of tools you are allowed to use. Are you allowed to use an electric guitar, a synthesizer, a digital audio workstation? Or AI or—more complicated question—a bit of AI. … I do not think it is our decision to make.”

But Söderström acknowledged the listener demand for clarity: “What we do think is that consumers would like to know and understand what tools were used in the creation of their music. We’ve been working with the industry to allow creators and labels uploading music to put in the metadata how it was created so that we can surface this to users.”

Labeling Challenge Across the Internet

The music industry’s disclosure debate is one instance of a broader platform challenge.

Music is a measurable case because streaming platforms have royalty data that quantifies the consequences of synthetic content at scale. The same structural problem applies to video and images, where detection is harder and the stakes for trust are arguably higher.

In September, Meta launched Vibes, a dedicated feed within the Meta AI app for short-form AI-generated video that lets users create, remix and share synthetic content. The choice to build a separate feed reflects a different philosophy than Apple’s labeling approach: rather than surfacing disclosure within a shared content environment, Meta is routing AI-generated content into its own container.

Both approaches are experiments. Neither has been tested at the scale the problem demands.

The post Who Made This Song? Platforms Start to Label AI Music appeared first on PYMNTS.com.