@ArtistRights Institute Newsletter 11/17/25: Highlights from a fast-moving week in music policy, AI oversight, and artist advocacy.

American Music Fairness Act

Don’t Let Congress Reward the Stations That Don’t Pay Artists (Editor Charlie/Artist Rights Watch)

Trump AI Executive Order

White House drafts order directing Justice Department to sue states that pass AI regulations (Gerrit De Vynck and Nitasha Tiku/Washington Post)

DOJ Authority and the “Because China” Trump AI Executive Order (Chris Castle/MusicTech.Solutions)

THE @DAVIDSACKS/ADAM THIERER EXECUTIVE ORDER CRUSHING PROTECTIVE STATE LAWS ON AI—AND WHY NO ONE SHOULD BE SURPRISED THAT TRUMP TOOK THE BAIT

Bartz Settlement

WHAT $1.5 BILLION GETS YOU: AN OBJECTOR’S GUIDE TO THE BARTZ SETTLEMENT (Chris Castle/MusicTechPolicy)

Ticketing

StubHub’s First Earnings Faceplant: Why the Ticket Reseller Probably Should Have Stayed Private (Chris Castle/ArtistRightsWatch)

The UK Finally Moves to Ban Above-Face-Value Ticket Resale (Chris Castle/MusicTech.Solutions)

Ashley King: Oasis Praises Victoria’s Strict Anti-Scalping Laws While on Tour in Oz — “We Can Stop Large-Scale Scalping In Its Tracks” (Artist Rights Watch/Digital Music News)

NMPA/Spotify Video Deal

GUEST POST: SHOW US THE TERMS: IMPLICATIONS OF THE SPOTIFY/NMPA DIRECT AUDIOVISUAL LICENSE FOR INDEPENDENT SONGWRITERS (Gwen Seale/MusicTechPolicy)

WHAT WE KNOW—AND DON’T KNOW—ABOUT SPOTIFY AND NMPA’S “OPT-IN” AUDIOVISUAL DEAL (Chris Castle/MusicTechPolicy)

What We Know—and Don’t Know—About Spotify and NMPA’s “Opt-In” Audiovisual Deal

When Spotify and the National Music Publishers’ Association (NMPA) announced an “opt-in” audiovisual licensing portal this month, the headlines made it sound like a breakthrough for independent songwriters. In reality, what we have is a bare-bones description of a direct-license program whose key financial and legal terms remain hidden from view.

Here’s what we do know. The portal (likely an HFA extravaganza) opened on November 11, 2025 and will accept opt-ins through December 19. Participation is limited to NMPA member publishers, and the license covers U.S. audiovisual uses—that is, music videos and other visual elements Spotify is beginning to integrate into its platform. It smacks of the side deal on pending and unmatched royalties tied to frozen mechanicals that the CRB rejected in Phonorecords IV.

Indeed, one explanation for the gun-decked opt-in period comes from The Desk:

Spotify is preparing to launch music videos in the United States, expanding a feature that has been in beta in nearly 100 international markets since January, the company quietly confirmed this week.

The new feature, rolling out to Spotify subscribers in the next few weeks, will allow streaming audio fans to watch official music videos directly within the Spotify app, setting the streaming platform in more direct competition with YouTube.

The company calls it a way for indies to share in “higher royalties,” but no rates, formulas, or minimum guarantees have been disclosed, so it’s hard to know: “higher” compared to what? Yes, it’s true that if you even made another 1¢ that would be “higher,” and in streaming-speak 1¢ is big progress, but remember that it’s still a number to the right of the decimal point preceded by a zero.

The deal sits alongside Spotify’s major-publisher audiovisual agreements, which are widely believed to include large advances and broader protections—none of which apply here. There’s also an open question of whether the majors granted public performance rights as an end run around the PROs, which I fully expect. There’s no MFN clause, no public rate schedule, and no audit details. I would be surprised if Spotify agreed to be audited by an independent publisher, and even more surprised if the announced publishers with direct deals did not have an audit right. So that’s one way we can be pretty confident these are not anything like MFN terms, aside from the scrupulous avoidance of mentioning the dirty word: MONEY.

But it would be a good guess that Spotify is interested in this arrangement because it signs up some of the most likely plaintiffs, protecting Spotify when it launches its product with unlicensed songs or user-generated videos and no Content ID clone. (That UGC piece is kind of Schrödinger’s UGC: not expressly included in the deal, but not expressly excluded either, and a UGC product would compete with TikTok or Spotify nemesis YouTube.)

But here’s what else we don’t know: how much these rights are worth, how royalties will be calculated, whether they include public performances to block PRO licensing of Spotify A/V (which could trigger MFN problems with YouTube or other UGC services), and whether the December 19 date marks the end of onboarding—or the eve of a US product launch. And perhaps most importantly, how is it that NMPA is involved, the same NMPA that has trashed Spotify far and wide over finally taking advantage of the bundling rates negotiated at the CRB (indeed in some version since 2009)? Shocked, shocked that there’s bundling going on.

It would be one thing if the license covered only “official” music videos and expressly stated that it will not be used to cover UGC, no way, no how. But given Spotify’s repeated hints that full-length music videos are coming to the U.S. and the test marketing reported by The Desk and disclosed by Spotify itself, the absolute silence of the public statements on royalty rates and UGC, as well as the rush to get publishers to opt in before year-end, all suggest that rollout is imminent. Until Spotify and the NMPA release the actual deal terms, though, we’re all flying blind—sheep being herded toward an agreement cliff we can’t fully see.

[A version of this post first appeared on MusicTechPolicy]

Why Artists Are Striking Spotify Over Daniel Ek’s AI-Offensive Weapons Bet—and Why It Matters for AI Deals

Over the summer, a growing group of artists began pulling their catalogs from Spotify—not over miserable and Dickensian-level royalties alone, but over Spotify CEO Daniel Ek’s vast investment in Helsing, a European weapons company. Helsing builds AI-enabled offensive weapons systems that skirt international humanitarian law, specifically Article 36 of Additional Protocol I to the Geneva Conventions. Deerhoof helped kick off the current wave; other artists (including Xiu Xiu, King Gizzard & the Lizard Wizard, Hotline TNT, The Mynabirds, WU LYF, Kadhja Bonet, and Young Widows) have followed or announced plans to do so.

What is Helsing—and what does it build?

Helsing is a Munich-based defense-tech firm founded in 2021. It began with AI software for perception, decision-support, and electronic warfare, and has expanded into hardware. The company markets the HX‑2 “AI strike drone,” described as a software‑defined loitering munition intended to engage artillery and armored targets at significant range—and kill people. It emphasizes resilience to electronic warfare, swarm/networked tactics via its Altra recon‑strike platform, and a human in/on the loop for critical decisions, and that limited role for humans in killing other humans is where it runs into Geneva Convention issues.   Trust me, they know this.

The HX-2 Strike Drone

Beyond drones, Helsing provides AI electronic‑warfare upgrades for Germany’s Eurofighter EK (with Saab), and has been contracted to supply AI software for Europe’s Future Combat Air System (FCAS). Public briefings and reporting indicate an active role supporting Ukraine since 2022, and a growing UK footprint linked to defense modernization initiatives. In 2025, Ek’s investment firm led a major funding round that valued Helsing in the multibillion‑euro range alongside contracts in the UK, Germany, and Sweden.

So let’s be clear—Helsing is not making some super tourniquet or AI medical device that has a dual use in civilian and military applications.  This is Masters of War stuff.  Which, for Mr. Ek’s benefit, is a song.

Why artists care

For these artists, the issue isn’t abstract: they see a direct line between Spotify‑generated wealth and AI‑enabled lethality, especially as Helsing moves from software into weaponized autonomy at scale. That ethical conflict is why exit statements explicitly connect Dickensian streaming economics and streamshare thresholds to military investment choices.  In fact, it remains to be seen whether Spotify itself is using its AI products and the tech and data behind them for Helsing’s weapons applications.

How many artists have left?

There’s no official tally. Reporting describes a wave of departures and names specific acts. The list continues to evolve as more artists reassess their positions.

The financial impact—on Spotify vs. on artists

For Spotify, a handful of indie exits barely moves the needle. The reason is the pro‑rata or “streamshare” payout model: each rightsholder’s share is proportional to total streams, not a fixed per‑stream rate except if you’re “lucky” enough to get a “greater of” formula. Remove a small catalog and its share simply reallocates to others. For artists, leaving can be meaningful—some replace streams with direct sales (Bandcamp, vinyl, fan campaigns) and often report higher revenue per fan. But at platform scale, the macro‑economics barely budge.  
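Since the pro-rata mechanics do the heavy lifting in this argument, here is a minimal sketch of how a fixed “streamshare” pool reallocates when one catalog exits. All figures are hypothetical, and this ignores real-world wrinkles like “greater of” formulas and minimum guarantees:

```python
def prorata_payouts(pool, streams):
    """Split a fixed revenue pool in proportion to each catalog's streams."""
    total = sum(streams.values())
    return {artist: pool * n / total for artist, n in streams.items()}

pool = 1_000_000  # hypothetical monthly royalty pool

before = prorata_payouts(pool, {
    "major": 900_000_000,    # big catalog
    "indie_a": 50_000_000,   # the artist who leaves
    "indie_b": 50_000_000,
})

# indie_a pulls its catalog: the pool does not shrink, it reallocates
after = prorata_payouts(pool, {
    "major": 900_000_000,
    "indie_b": 50_000_000,
})

print(round(before["major"]), round(after["major"]))  # remaining shares grow
```

Note that the pool is identical before and after the exit; only the split changes, which is why individual departures are financially quiet at platform scale even when they are symbolically loud.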

Of course, because of Spotify’s tying relationships with talent buyers for venues (explicit or implicit), not being on Spotify can be the kiss of death for a new artist competing for a Wednesday night at a local venue when the venue checks your Spotify stats.

Why this is a cautionary tale for AI labs

Three practices make artist exits feel symbolically loud but structurally quiet—and they’re exactly what frontier AI should avoid:

1) Revenue‑share pools with opaque rules. Pro‑rata “streamshare” pushes smaller players toward zero; any exit just enriches whoever remains. AI platforms contemplating rev‑share training or retrieval deals should learn from this: user‑centric or usage‑metered deals with transparent accounting are more legible than giant, shifting pools.

2) NDA‑sealed terms. The streaming era normalized NDAs that bury rates and conditions. If AI deals copy that playbook—confidential blacklists, secret style‑prompt fees, unpublished audit rights—contributors will see protest as the only lever. Transparency beats backlash.

3) Weapons-related use cases for AI. We all know that frontier labs like Google, Amazon, Microsoft and others are competing like trained seals for contracts from the Department of War. They use the same technology, trained on culture ripped off from artists, to kill people for money.

A clearer picture of Helsing’s products and customers

• HX‑2 AI Strike Drone: beyond‑line‑of‑sight strike profile, on‑board target re‑identification, EW‑resilient, swarm‑capable via Altra; multiple payload options; human in/on the loop.
• Eurofighter EK (Germany): with Saab, AI‑enabled electronic‑warfare upgrade for Luftwaffe Eurofighters oriented to SEAD/DEAD roles.
• FCAS AI Backbone (Europe): software/AI layer for the next‑generation air combat system under European procurement.
• UK footprint: framework contracting in the UK defense ecosystem, tied to strike/targeting modernization efforts.
• Ukraine: public reporting indicates delivery of strike drones; company statements reference activity supporting Ukraine since 2022.

The bigger cultural point

Whether you applaud or oppose war tech, the ethical through‑line in these protests is consistent: creators don’t want their work—or the wealth it generates—financing AI (especially autonomous) weaponry. Because the platform’s pro‑rata economics make individual exits financially quiet, the conflict migrates into public signaling and brand pressure.

What would a better model look like for AI?

• Opt‑in, auditable deals for creative inputs to AI models (training and RAG) with clear unit economics and published baseline terms.
• User‑centric or usage‑metered payouts (by contributor, by model, by retrieval) instead of a single, shifting revenue pool.
• Public registries and audit logs so participants can verify where money comes from and where it goes.
• No gag clauses on baseline rates or audit rights.

The strike against Spotify is about values as much as value. Ek’s bet on Helsing—drones, electronic warfare, autonomous weapons—makes those values impossible for some artists to ignore. Thanks to the pro‑rata royalty machine, the exits won’t dent Spotify’s bottom line—but they should warn AI platforms against repeating the same opaque rev‑shares and NDAs that leave creators feeling voiceless in streaming.

If you got one of these emails from Spotify, you might be interested

Spotify failed to consult any of the people who drive fans to the data abattoir: the musicians, artists, podcasters and authors.

Spotify has quietly tightened the screws on AI this summer—while simultaneously clarifying how it uses your data to power its own machine‑learning features. For artists, rightsholders, developers, and policy folks, the combination matters: Spotify is making it harder for outsiders to train models on Spotify data, even as it codifies its own first‑party uses like AI DJ and personalized playlists.

Spotify is drawing a bright line: no training models on Spotify; yes to Spotify training its own. If you’re an artist or developer, that means stronger contractual leverage against third‑party scrapers—but also a need to sharpen your own data‑governance and licensing posture. Expect other platforms in music and podcasting to follow suit—and for regulators to ask tougher questions about how platform ML features are audited, licensed, and accounted for.

Below is a plain‑English (hopefully) breakdown of what changed, what’s new or newly explicit, and the practical implications for different stakeholders.

Explicit ban on using Spotify to train AI models (third parties). 

Spotify’s User Guidelines now flatly prohibit “crawling” or “scraping” the service and, crucially, “using any part of the Services or Content to train a machine learning or AI model.” That’s a categorical no for bots and bulk data slurps. The Developer Policy mirrors this: apps using the Web API may not “use the Spotify Platform or any Spotify Content to train a machine learning or AI model.” In short: if your product ingests Spotify data, you’re in violation of the rules and risk enforcement and access revocation.

Spotify’s own AI/ML uses are clearer—and broad. 

The Privacy Policy (effective August 27, 2025) spells out that Spotify uses personal data to “develop and train” algorithmic and machine‑learning models to improve recommendations, build AI features (like AI DJ and AI playlists), and enforce rules. That legal basis is framed largely as Spotify’s “legitimate interests.” Translation: your usage, voice, and other data can feed Spotify’s own models.

The user content license is very broad. 

If you post “User Content” (messages, playlist titles, descriptions, images, comments, etc.), you grant Spotify a worldwide, sublicensable, transferable, royalty‑free, irrevocable license to reproduce, modify, create derivative works from, distribute, perform, and display that content in any medium. That’s standard platform drafting these days, but the scope—including derivative works—has AI‑era consequences for anything you upload to or create within Spotify’s ecosystem (e.g., playlist titles, cover images, comments).

Anti‑manipulation and anti‑automation rules are baked in. 

The User Guidelines and Developer Policy double down on bans against bots, artificial streaming, and traffic manipulation. If you’re building tools that touch the Spotify graph, treat “no automated collection, no metric‑gaming, no derived profiling” as table stakes—or risk enforcement, up to termination of access.

Data‑sharing signals to rightsholders continue. 

Spotify says it can provide pseudonymized listening data to rightsholders under existing deals. That’s not new, but in the ML context it underscores why parallel data flows to third parties are tightly controlled: Spotify wants to be the gateway for data, not the faucet you can plumb yourself.

What this means by role:

• Artists & labels: The AI‑training ban gives you a clear contractual hook against services that scrape Spotify to build recommenders, clones, or vocal/style models. Document violations (timestamps, IPs, payloads) and send notices citing the User Guidelines and Developer Policy. Meanwhile, assume your own usage and voice interactions can be used to improve Spotify’s models—something to consider for privacy reviews and internal policies.

• Publishers and collecting societies: The combination of “no third‑party training” + “first‑party ML training” is a policy trend to watch across platforms. It raises familiar questions about derivative data, model outputs, and whether platform machine learning features create new accounting categories—or require new audit rights—in future licenses.

• Policymakers: Read this as another brick in the “closed data/open model risk” wall. Platforms restrict external extraction while expanding internal model claims. That asymmetry will shape future debates over data‑access mandates, competition remedies, and model‑audit rights—especially where platform ML features may substitute for third‑party discovery tools.

Practical to‑dos

1) For rights owners: Add explicit “no platform‑sourced training” language in your vendor, distributor, or analytics contracts. Track and log known scrapers and third‑party tools that might be training off Spotify. Consider notice letters that cite the specific clauses.

2) For privacy and legal teams: Update DPIAs and data maps. Spotify’s Privacy Policy identifies “User Data,” “Usage Data,” “Voice Data,” “Message Data,” and more as inputs for ML features under legitimate interest. If you rely on Spotify data for compliance reports, make sure you’re only using permitted, properly aggregated outputs—not raw exports.

3) For users: I will be posting a guide to how to claw back your data. I may not hit everything, so I’m always open to suggestions about anything else that others spot.

Spotify’s terms give it very broad rights to collect, combine, and use your data (listening history, device/ads data, voice features, third-party signals) for personalization, ads, and product R&D. They also take a broad license to user content you upload (e.g., playlist art). 

Key cites

• User Guidelines: prohibition on scraping and on “using any part of the Services or Content to train a machine learning or AI model.”

• Developer Policy (effective May 15, 2025): “Do not use the Spotify Platform or any Spotify Content to train a machine learning or AI model…” Also bans analyzing Spotify content to create new/derived listenership metrics or user profiles for ad targeting.

• Privacy Policy (effective Aug. 27, 2025): Spotify uses personal data to “develop and train” ML models for recommendations, AI DJ/AI playlists, and rule‑enforcement, primarily under “legitimate interests.”

• Terms & Conditions of Use: very broad license to Spotify for any “User Content” you post, including the right to “create derivative works” and to use content by any means and media worldwide, irrevocably.

[A version of this post first appeared on MusicTechPolicy]

Spotify Makes Kate Nash’s Argument With the Usual Blame Game

Daniel Ek is indifferent to whether the economics of streaming cause artists to give up or actually starve to death. He’s already got the tracks and he’ll keep selling them forever, like an evil self-licking ice cream cone.

Kate Nash is the latest artist to slam Spotify’s pathetic royalty payments, even after the payola and streaming manipulation of the Orwellian “Discovery Mode” documented by Liz Pelly. According to Digital Music News, Kate Nash says:

“‘Foundations’ has over 100 million plays on Spotify — and I’m shocked I’m not a millionaire when I hear that! I’m shocked at the state of the music industry and how the industry has allowed this to happen,” said Nash. “We’re paid very, very, very poorly and unethically for our recorded music: it’s like 0.003 of a penny per stream. I think we should not only be paid fairly, but we should be paid very well. People love music and it’s a growing economy and there are plenty of millionaires in the industry because of that, and our music.”

But then she said the quiet part out loud that will get them right in their Portlandia hearts:

She added: “And what they’re saying to artists from non-rich privileged backgrounds, which is you’re not welcome here, you can’t do this, we don’t want to hear from you. Because it’s not possible to even imagine having a career if you don’t have a privileged background or a privileged situation right now.”

This, of course, comes at the same time that Spotify board members have cashed out over $1 billion in stock, including hundreds of millions to Daniel Ek personally—speaking of privilege.

Using forks and knives to eat their bacon

Spotify responds with the same old whine that starts with the usual condescending drivel, deflection and distraction:

“We’re huge fans of Kate Nash. For streams of her track ‘Foundations’ alone — which was released before Spotify existed — Spotify has paid out around half a million pounds in revenue to Kate Nash’s rights holders,” reads Spotify’s statement.

“Her most streamed songs were released via Universal Music Group. Spotify has no visibility over the deals that Kate signed with her rights holders. Therefore, we have no knowledge of the payment terms that were agreed upon between her and her partners.”

This is a very carefully worded statement–notice that they switch from the specific to the general and start talking about “her rights holders.” That no doubt means they are including the songwriters and publishers of the compositions, so that’s bullshit for starters. But notice how they are making Kate’s own argument here by trying to get you to focus on the “big check” they wrote to Universal.

Well, last time I checked in the world of arithmetic, “around half a million pounds” (which means less than, but OK) divided by 100,000,000 streams is…wait for it…shite. £0.005 per stream–at the Universal level but all-in by the sound of it, i.e., artist share, label share, songwriters and publishers. This is why Spotify is making Kate’s argument at the same time they are trying to deflect attention onto Universal.
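The back-of-envelope math is worth seeing on one line. The figures below are the approximate numbers quoted above, not audited statements:

```python
# Spotify's "around half a million pounds" paid out across roughly
# 100 million streams of "Foundations" (both figures approximate).
payout_gbp = 500_000        # all-in payout to rights holders, per Spotify
streams = 100_000_000       # stream count, per Kate Nash
per_stream = payout_gbp / streams
print(f"£{per_stream:.4f} per stream")  # prints £0.0050 per stream, all-in
```

Half a penny per stream, all-in, before the label, publishers, and songwriters each take their cut of it.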

Then–always with an eye on the DCMS authorities in the UK and the UK Parliament, Spotify says:

“We do know that British artists generated revenues of over £750 million on Spotify alone in 2023 — a number that is on the rise year on year — so it’s disappointing to hear that Spotify’s payments are not making it through to Kate herself,” the company concluded.

Oh, so “disappointed.” Please spare us. What’s disappointing is that the streaming services participate in this charade where their executives make more in one day of stock trading than the company’s entire payments to UK artists and songwriters.

This race to the bottom is not lost on artists. “Weird Al” Yankovic, a card-carrying member of the pantheon of music parodists from Tom Lehrer to Spinal Tap to The Rutles, released a hysterical video about his “Spotify Wrapped” account.

Al said he’d had 80 million streams and received enough cash from Spotify to buy a $12 sandwich.  This was from an artist who made a decades-long career from—parody.  Remember that–parody.

Do you think he really meant he actually got $12 for 80 million streams?  Or could that have been part of the gallows humor of calling out Spotify Wrapped as a propaganda tool for…Spotify?  Poking fun at the massive camouflage around the Malthusian algebra of streaming royalties gradually choking the life out of artists and songwriters? Gallows humor, indeed, because a lot of artists and especially songwriters are gradually collapsing as the algebra predicted.

The services took the bait Al dangled: they seized upon Al’s video poking fun at how ridiculously low Spotify payments are to make a point about how Al’s sandwich price couldn’t possibly reflect 80 million streams—and that if it did, it’s his label’s fault. Just like Spotify blaming Universal rather than taking responsibility for once in their lives.

Nothing if not on message, right? As Daniel Ek told MusicAlly, “There is a narrative fallacy here, combined with the fact that, obviously, some artists that used to do well in the past may not do well in this future landscape, where you can’t record music once every three to four years and think that’s going to be enough.” This is kind of like TikTok bragging about how few children hanged themselves in the latest blackout challenge compared to the number of all children using the platform. Pretty Malthusian. It’s not a fallacy; it’s all too true.

I’d suggest that Al and Kate Nash were each making the same point: if you think of everyday goods, like bacon for example, in terms of how many streams you would need in order to buy a pound of bacon, a dozen eggs, a gallon of gasoline, Internet access, or a sandwich in a nice restaurant, you start to understand that the joke really is on us. The best way to make a small fortune in the streaming business is to start with a large one. Unless you’re a Spotify executive, of course.