@ArtistRights Institute Newsletter 11/17/25: Highlights from a fast-moving week in music policy, AI oversight, and artist advocacy.

American Music Fairness Act

Don’t Let Congress Reward the Stations That Don’t Pay Artists (Editor Charlie/Artist Rights Watch)

Trump AI Executive Order

White House drafts order directing Justice Department to sue states that pass AI regulations (Gerrit De Vynck and Nitasha Tiku/Washington Post)

DOJ Authority and the “Because China” Trump AI Executive Order (Chris Castle/MusicTech.Solutions)

THE @DAVIDSACKS/ADAM THIERER EXECUTIVE ORDER CRUSHING PROTECTIVE STATE LAWS ON AI—AND WHY NO ONE SHOULD BE SURPRISED THAT TRUMP TOOK THE BAIT

Bartz Settlement

WHAT $1.5 BILLION GETS YOU: AN OBJECTOR’S GUIDE TO THE BARTZ SETTLEMENT (Chris Castle/MusicTechPolicy)

Ticketing

StubHub’s First Earnings Faceplant: Why the Ticket Reseller Probably Should Have Stayed Private (Chris Castle/ArtistRightsWatch)

The UK Finally Moves to Ban Above-Face-Value Ticket Resale (Chris Castle/MusicTech.Solutions)

Ashley King: Oasis Praises Victoria’s Strict Anti-Scalping Laws While on Tour in Oz — “We Can Stop Large-Scale Scalping In Its Tracks” (Artist Rights Watch/Digital Music News)

NMPA/Spotify Video Deal

GUEST POST: SHOW US THE TERMS: IMPLICATIONS OF THE SPOTIFY/NMPA DIRECT AUDIOVISUAL LICENSE FOR INDEPENDENT SONGWRITERS (Gwen Seale/MusicTechPolicy)

WHAT WE KNOW—AND DON’T KNOW—ABOUT SPOTIFY AND NMPA’S “OPT-IN” AUDIOVISUAL DEAL (Chris Castle/MusicTechPolicy)

What We Know—and Don’t Know—About Spotify and NMPA’s “Opt-In” Audiovisual Deal

When Spotify and the National Music Publishers’ Association (NMPA) announced an “opt-in” audiovisual licensing portal this month, the headlines made it sound like a breakthrough for independent songwriters. In reality, what we have is a bare-bones description of a direct-license program whose key financial and legal terms remain hidden from view.

Here’s what we do know. The portal (likely an HFA extravaganza) opened on November 11, 2025, and will accept opt-ins through December 19. Participation is limited to NMPA member publishers, and the license covers U.S. audiovisual uses—that is, music videos and other visual elements Spotify is beginning to integrate into its platform. It smacks of the side deal on pending and unmatched royalties tied to frozen mechanicals that the CRB rejected in Phonorecords IV.

Indeed, one explanation for the gundecked opt-in period appears in The Desk:

Spotify is preparing to launch music videos in the United States, expanding a feature that has been in beta in nearly 100 international markets since January, the company quietly confirmed this week.

The new feature, rolling out to Spotify subscribers in the next few weeks, will allow streaming audio fans to watch official music videos directly within the Spotify app, setting the streaming platform in more direct competition with YouTube.

The company calls it a way for indies to share in “higher royalties,” but no rates, formulas, or minimum guarantees have been disclosed, so it’s hard to know “higher” compared to what. Yes, it’s true that if you even made another 1¢ that would be “higher”—and in streaming-speak, 1¢ is big progress—but remember that it’s still a positive number to the right of the decimal point, preceded by a zero.

The deal sits alongside Spotify’s major-publisher audiovisual agreements, which are widely believed to include large advances and broader protections—none of which apply here. There’s also an open question of whether the majors granted public performance rights as an end run around PROs, which I fully expect. There’s no MFN clause, no public schedule, and no audit details. I would be surprised if Spotify agreed to be audited by an independent publisher and even more surprised if the announced publishers with direct deals did not have an audit right. So there’s one way we can be pretty confident this is not anything like MFN terms aside from the scrupulous avoidance of mentioning the dirty word: MONEY.

But it would be a good guess that Spotify is interested in this arrangement because it signs up some of the most likely plaintiffs, protecting Spotify when it launches its product with unlicensed songs or user-generated videos and no Content ID clone (which is kind of Schrödinger’s UGC—not expressly included in the deal but not expressly excluded either, and would be competitive with TikTok or Spotify nemesis YouTube).

But here’s what else we don’t know: how much these rights are worth, how royalties will be calculated, whether they include public performance rights that would block PRO licensing of Spotify A/V (which could trigger MFN problems with YouTube or other UGC services), and whether the December 19 date marks the end of onboarding—or the eve of a US product launch. And perhaps most importantly, how is it that the NMPA is involved, the same NMPA that has trashed Spotify far and wide over finally taking advantage of the bundling rates negotiated at the CRB (in some version since 2009)? Shocked, shocked that there’s bundling going on.

It’s one thing to talk about an audiovisual license covering “official” music videos; it’s another to expressly state that the same license will not be used to cover UGC, no way, no how. Spotify’s repeated hints that full-length music videos are coming to the U.S., the test marketing reported by The Desk and disclosed by Spotify itself, the absolute silence of the public statements on royalty rates and UGC, and the rush to get publishers to opt in before year-end all suggest that rollout is imminent. Until Spotify and the NMPA release the actual deal terms, though, we’re all flying blind—sheep being herded toward an agreement cliff we can’t fully see.

[A version of this post first appeared on MusicTechPolicy]

Why Artists Are Striking Spotify Over Daniel Ek’s AI-Offensive Weapons Bet—and Why It Matters for AI Deals

Over the summer, a growing group of artists began pulling their catalogs from Spotify—not over miserable, Dickensian-level royalties alone, but over Spotify CEO Daniel Ek’s vast investment in Helsing, a European weapons company. Helsing builds AI-enabled offensive weapons systems that skirt international humanitarian law, specifically Article 36 of Additional Protocol I to the Geneva Conventions. Deerhoof helped kick off the current wave; other artists (including Xiu Xiu, King Gizzard & the Lizard Wizard, Hotline TNT, The Mynabirds, WU LYF, Kadhja Bonet, and Young Widows) have followed or announced plans to do so.

What is Helsing—and what does it build?

Helsing is a Munich-based defense-tech firm founded in 2021. It began with AI software for perception, decision-support, and electronic warfare, and has expanded into hardware. The company markets the HX‑2 “AI strike drone,” described as a software‑defined loitering munition intended to engage artillery and armored targets at significant range—and kill people. It emphasizes resilience to electronic warfare, swarm/networked tactics via its Altra recon‑strike platform, and a human in or on the loop for critical decisions. That limited role for humans in killing other humans is where it runs into Geneva Convention issues. Trust me, they know this.

The HX-2 Strike Drone

Beyond drones, Helsing provides AI electronic‑warfare upgrades for Germany’s Eurofighter EK (with Saab), and has been contracted to supply AI software for Europe’s Future Combat Air System (FCAS). Public briefings and reporting indicate an active role supporting Ukraine since 2022, and a growing UK footprint linked to defense modernization initiatives. In 2025, Ek’s investment firm led a major funding round that valued Helsing in the multibillion‑euro range alongside contracts in the UK, Germany, and Sweden.

So let’s be clear—Helsing is not making some super tourniquet or AI medical device that has a dual use in civilian and military applications. This is “Masters of War” stuff. Which, for Mr. Ek’s benefit, is a Bob Dylan song.

Why artists care

For these artists, the issue isn’t abstract: they see a direct line between Spotify‑generated wealth and AI‑enabled lethality, especially as Helsing moves from software into weaponized autonomy at scale. That ethical conflict is why exit statements explicitly connect Dickensian streaming economics and streamshare thresholds to military investment choices.  In fact, it remains to be seen whether Spotify itself is using its AI products and the tech and data behind them for Helsing’s weapons applications.

How many artists have left?

There’s no official tally. Reporting describes a wave of departures and names specific acts. The list continues to evolve as more artists reassess their positions.

The financial impact—on Spotify vs. on artists

For Spotify, a handful of indie exits barely moves the needle. The reason is the pro‑rata or “streamshare” payout model: each rightsholder is paid a proportional share of a fixed revenue pool based on its share of total streams, not a fixed per‑stream rate (unless you’re “lucky” enough to get a “greater of” formula). Remove a small catalog and its share simply reallocates to the others. For artists, leaving can be meaningful—some replace streams with direct sales (Bandcamp, vinyl, fan campaigns) and often report higher revenue per fan. But at platform scale, the macro‑economics barely budge.
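
To make the mechanic concrete, here is a minimal sketch of the pro‑rata arithmetic in Python. The pool size and stream counts are hypothetical numbers chosen only for illustration, not Spotify’s actual figures.

```python
# Minimal sketch of pro-rata ("streamshare") arithmetic.
# Pool size and stream counts are hypothetical, chosen only to show the mechanic.

def prorata_payouts(pool: float, streams: dict[str, int]) -> dict[str, float]:
    """Each rightsholder gets pool * (its streams / total streams)."""
    total = sum(streams.values())
    return {name: pool * n / total for name, n in streams.items()}

pool = 1_000_000.00  # hypothetical monthly royalty pool in dollars
streams = {"Major A": 600_000_000, "Major B": 390_000_000, "Indie": 10_000_000}

before = prorata_payouts(pool, streams)   # Indie's share is 1% of the pool
streams.pop("Indie")                      # the indie catalog leaves the service
after = prorata_payouts(pool, streams)    # the pool itself is unchanged

print(round(before["Major A"], 2), round(after["Major A"], 2))
# 600000.0 606060.61 -- the departing catalog's share reallocates to the majors
```

Because the pool is fixed and the shares are proportional, the departing catalog’s money never leaves the system; it just flows to whoever remains.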

Of course, because of Spotify’s tying relationships with talent buyers for venues (explicit or implicit), not being on Spotify can be the kiss of death for a new artist competing for a Wednesday night at a local venue when the venue checks your Spotify stats.

Why this is a cautionary tale for AI labs

Three practices make artist exits feel symbolically loud but structurally quiet—and they’re exactly what frontier AI labs should avoid:

1) Revenue‑share pools with opaque rules. Pro‑rata “streamshare” pushes smaller players toward zero; any exit just enriches whoever remains. AI platforms contemplating rev‑share training or retrieval deals should learn from this: user‑centric or usage‑metered deals with transparent accounting are more legible than giant, shifting pools.

2) NDA‑sealed terms. The streaming era normalized NDAs that bury rates and conditions. If AI deals copy that playbook—confidential blacklists, secret style‑prompt fees, unpublished audit rights—contributors will see protest as the only lever. Transparency beats backlash.

3) Weapons‑related use cases for AI. We all know that frontier labs like Google, Amazon, Microsoft, and others are competing like trained seals for contracts from the Department of War. They use the same technology, trained on culture ripped off from artists, to kill people for money.

A clearer picture of Helsing’s products and customers

• HX‑2 AI Strike Drone: beyond‑line‑of‑sight strike profile, on‑board target re‑identification, EW‑resilient, swarm‑capable via Altra; multiple payload options; human in/on the loop.
• Eurofighter EK (Germany): with Saab, AI‑enabled electronic‑warfare upgrade for Luftwaffe Eurofighters oriented to SEAD/DEAD roles.
• FCAS AI Backbone (Europe): software/AI layer for the next‑generation air combat system under European procurement.
• UK footprint: framework contracting in the UK defense ecosystem, tied to strike/targeting modernization efforts.
• Ukraine: public reporting indicates delivery of strike drones; company statements reference activity supporting Ukraine since 2022.

The bigger cultural point

Whether you applaud or oppose war tech, the ethical through‑line in these protests is consistent: creators don’t want their work—or the wealth it generates—financing AI (especially autonomous) weaponry. Because the platform’s pro‑rata economics make individual exits financially quiet, the conflict migrates into public signaling and brand pressure.

What would a better model look like for AI?

• Opt‑in, auditable deals for creative inputs to AI models (training and RAG) with clear unit economics and published baseline terms.
• User‑centric or usage‑metered payouts (by contributor, by model, by retrieval) instead of a single, shifting revenue pool; a minimal sketch follows this list.
• Public registries and audit logs so participants can verify where money comes from and where it goes.
• No gag clauses on baseline rates or audit rights.
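
For contrast with the pro‑rata sketch above, here is a hypothetical sketch of a usage‑metered settlement with a per‑use audit log. The rates, categories, and names are assumptions for illustration, not any platform’s actual terms.

```python
# Hypothetical sketch of a usage-metered payout with a per-use audit log.
# All rates, categories, and names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Use:
    contributor: str   # whose work was used
    kind: str          # e.g., "training" or "retrieval"
    units: int         # metered units of use

RATES = {"training": 0.002, "retrieval": 0.01}  # published per-unit baseline rates (assumed)

def settle(log: list[Use]) -> dict[str, float]:
    """Pay each contributor rate * units. There is no shared pool, so one
    contributor's exit never changes what the others are owed."""
    owed: dict[str, float] = {}
    for u in log:
        owed[u.contributor] = owed.get(u.contributor, 0.0) + RATES[u.kind] * u.units
    return owed

log = [
    Use("Indie Songwriter", "retrieval", 1_200),
    Use("Indie Songwriter", "training", 50_000),
    Use("Catalog Co.", "retrieval", 900),
]

payouts = settle(log)  # each entry in `log` doubles as an auditable record
print({name: round(amt, 2) for name, amt in payouts.items()})
# {'Indie Songwriter': 112.0, 'Catalog Co.': 9.0}
```

The point of the design is that each contributor’s payout depends only on its own metered uses and a published rate, so the accounting is legible line by line and no one’s exit changes what anyone else is owed.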

The strike against Spotify is about values as much as value. Ek’s bet on Helsing—drones, electronic warfare, autonomous weapons—makes those values impossible for some artists to ignore. Thanks to the pro‑rata royalty machine, the exits won’t dent Spotify’s bottom line—but they should warn AI platforms against repeating the same opaque rev‑shares and NDAs that leave creators feeling voiceless in streaming.

If you got one of these emails from Spotify, you might be interested

Spotify failed to consult any of the people who drive fans to the data abattoir: the musicians, artists, podcasters and authors.

Spotify has quietly tightened the screws on AI this summer—while simultaneously clarifying how it uses your data to power its own machine‑learning features. For artists, rightsholders, developers, and policy folks, the combination matters: Spotify is making it harder for outsiders to train models on Spotify data, even as it codifies its own first‑party uses like AI DJ and personalized playlists.

Spotify is drawing a bright line: no training models on Spotify; yes to Spotify training its own. If you’re an artist or developer, that means stronger contractual leverage against third‑party scrapers—but also a need to sharpen your own data‑governance and licensing posture. Expect other platforms in music and podcasting to follow suit—and for regulators to ask tougher questions about how platform ML features are audited, licensed, and accounted for.

Below is a plain‑English (hopefully) breakdown of what changed, what’s new or newly explicit, and the practical implications for different stakeholders.

Explicit ban on using Spotify to train AI models (third parties). 

Spotify’s User Guidelines now flatly prohibit “crawling” or “scraping” the service and, crucially, “using any part of the Services or Content to train a machine learning or AI model.” That’s a categorical no for bots and bulk data slurps. The Developer Policy mirrors this: apps using the Web API may not “use the Spotify Platform or any Spotify Content to train a machine learning or AI model.” In short: if your product ingests Spotify data, you’re in violation of the rules and risk enforcement and access revocation.

Spotify’s own AI/ML uses are clearer—and broad. 

The Privacy Policy (effective August 27, 2025) spells out that Spotify uses personal data to “develop and train” algorithmic and machine‑learning models to improve recommendations, build AI features (like AI DJ and AI playlists), and enforce rules. That legal basis is framed largely as Spotify’s “legitimate interests.” Translation: your usage, voice, and other data can feed Spotify’s own models.

The user content license is very broad. 

If you post “User Content” (messages, playlist titles, descriptions, images, comments, etc.), you grant Spotify a worldwide, sublicensable, transferable, royalty‑free, irrevocable license to reproduce, modify, create derivative works from, distribute, perform, and display that content in any medium. That’s standard platform drafting these days, but the scope—including derivative works—has AI‑era consequences for anything you upload to or create within Spotify’s ecosystem (e.g., playlist titles, cover images, comments).

Anti‑manipulation and anti‑automation rules are baked in. 

The User Guidelines and Developer Policy double down on bans against bots, artificial streaming, and traffic manipulation. If you’re building tools that touch the Spotify graph, treat “no automated collection, no metric‑gaming, no derived profiling” as table stakes—or risk enforcement, up to termination of access.

Data‑sharing signals to rightsholders continue. 

Spotify says it can provide pseudonymized listening data to rightsholders under existing deals. That’s not new, but in the ML context it underscores why parallel data flows to third parties are tightly controlled: Spotify wants to be the gateway for data, not the faucet you can plumb yourself.

What this means by role:

• Artists & labels: The AI‑training ban gives you a clear contractual hook against services that scrape Spotify to build recommenders, clones, or vocal/style models. Document violations (timestamps, IPs, payloads) and send notices citing the User Guidelines and Developer Policy. Meanwhile, assume your own usage and voice interactions can be used to improve Spotify’s models—something to consider for privacy reviews and internal policies.

• Publishers and collecting societies: The combination of “no third‑party training” + “first‑party ML training” is a policy trend to watch across platforms. It raises familiar questions about derivative data, model outputs, and whether platform machine learning features create new accounting categories—or require new audit rights—in future licenses.

• Policymakers: Read this as another brick in the “closed data/open model risk” wall. Platforms restrict external extraction while expanding internal model claims. That asymmetry will shape future debates over data‑access mandates, competition remedies, and model‑audit rights—especially where platform ML features may substitute for third‑party discovery tools.

Practical to‑dos

1) For rights owners: Add explicit “no platform‑sourced training” language in your vendor, distributor, or analytics contracts. Track and log known scrapers and third‑party tools that might be training off Spotify. Consider notice letters that cite the specific clauses.

2) For privacy and legal teams: Update DPIAs and data maps. Spotify’s Privacy Policy identifies “User Data,” “Usage Data,” “Voice Data,” “Message Data,” and more as inputs for ML features under legitimate interest. If you rely on Spotify data for compliance reports, make sure you’re only using permitted, properly aggregated outputs—not raw exports.

3) For users: I will be posting a guide to clawing back your data. I may not hit everything, so I’m always open to suggestions about anything else that others spot.

Spotify’s terms give it very broad rights to collect, combine, and use your data (listening history, device/ads data, voice features, third-party signals) for personalization, ads, and product R&D. They also take a broad license to user content you upload (e.g., playlist art). 

Key cites

• User Guidelines: prohibition on scraping and on “using any part of the Services or Content to train a machine learning or AI model.”

• Developer Policy (effective May 15, 2025): “Do not use the Spotify Platform or any Spotify Content to train a machine learning or AI model…” Also bans analyzing Spotify content to create new/derived listenership metrics or user profiles for ad targeting.

• Privacy Policy (effective Aug. 27, 2025): Spotify uses personal data to “develop and train” ML models for recommendations, AI DJ/AI playlists, and rule‑enforcement, primarily under “legitimate interests.”

• Terms & Conditions of Use: very broad license to Spotify for any “User Content” you post, including the right to “create derivative works” and to use content by any means and media worldwide, irrevocably.

[A version of this post first appeared on MusicTechPolicy]