What We Know—and Don’t Know—About Spotify and NMPA’s “Opt-In” Audiovisual Deal

When Spotify and the National Music Publishers’ Association (NMPA) announced an “opt-in” audiovisual licensing portal this month, the headlines made it sound like a breakthrough for independent songwriters. In reality, what we have is a bare-bones description of a direct-license program whose key financial and legal terms remain hidden from view.

Here’s what we do know. The portal (likely an HFA extravaganza) opened on November 11, 2025 and will accept opt-ins through December 19. Participation is limited to NMPA member publishers, and the license covers U.S. audiovisual uses—that is, music videos and other visual elements Spotify is beginning to integrate into its platform. It smacks of the side deal on pending and unmatched royalties tied to frozen mechanicals that the CRB rejected in Phonorecords IV.

Indeed, one explanation for the gun-decked opt-in period is in The Desk:

Spotify is preparing to launch music videos in the United States, expanding a feature that has been in beta in nearly 100 international markets since January, the company quietly confirmed this week.

The new feature, rolling out to Spotify subscribers in the next few weeks, will allow streaming audio fans to watch official music videos directly within the Spotify app, setting the streaming platform in more direct competition with YouTube.

The company calls it a way for indies to share in “higher royalties,” but no rates, formulas, or minimum guarantees have been disclosed, so it’s hard to know “higher” compared to what. Yes, it’s true that if you even made another 1¢ that would be “higher”—and in streaming-speak, 1¢ is big progress—but remember that it’s still a positive number to the right of the decimal point preceded by a zero.

The deal sits alongside Spotify’s major-publisher audiovisual agreements, which are widely believed to include large advances and broader protections—none of which apply here. There’s also an open question of whether the majors granted public performance rights as an end run around PROs, which I fully expect. There’s no MFN clause, no public schedule, and no audit details. I would be surprised if Spotify agreed to be audited by an independent publisher and even more surprised if the announced publishers with direct deals did not have an audit right. So there’s one way we can be pretty confident this is not anything like MFN terms aside from the scrupulous avoidance of mentioning the dirty word: MONEY.

But it would be a good guess that Spotify is interested in this arrangement because it signs up some of the most likely plaintiffs to protect itself when it launches its product with unlicensed songs or user-generated videos and no Content ID clone (which is kind of Schrödinger’s UGC—not expressly included in the deal but not expressly excluded either—and would be competitive with TikTok or Spotify nemesis YouTube).

But here’s what else we don’t know: how much these rights are worth, how royalties will be calculated, whether they include public performances to block PRO licensing of Spotify A/V (which could trigger MFN problems with YouTube or other UGC services), and whether the December 19 date marks the end of onboarding—or the eve of a US product launch. And perhaps most importantly, how is it that NMPA is involved—the same NMPA that has trashed Spotify far and wide over finally taking advantage of the bundling rates negotiated at the CRB (indeed in some version since 2009)? Shocked, shocked that there’s bundling going on.

It’s one thing to talk about audiovisual covering “official” music videos while expressly stating that the same license will not be used to cover UGC—no way, no how. Given Spotify’s repeated hints that full-length music videos are coming to the U.S. and the test marketing reported by The Desk and disclosed by Spotify itself, the absolute silence of the public statements about royalty rates and UGC, as well as the rush to get publishers to opt in before year-end, all suggest that a rollout is imminent. Until Spotify and the NMPA release the actual deal terms, though, we’re all flying blind—sheep being herded toward an agreement cliff we can’t fully see.

[A version of this post first appeared on MusicTechPolicy]

Why Artists Are Striking Spotify Over Daniel Ek’s AI-Offensive Weapons Bet—and Why It Matters for AI Deals

Over the summer, a growing group of artists began pulling their catalogs from Spotify—not over miserable and Dickensian-level royalties alone, but over Spotify CEO Daniel Ek’s vast investment in Helsing, a European weapons company. Helsing builds AI-enabled offensive weapons systems that skirt international humanitarian law, specifically Article 36 of Additional Protocol I to the Geneva Conventions. Deerhoof helped kick off the current wave; other artists (including Xiu Xiu, King Gizzard & the Lizard Wizard, Hotline TNT, The Mynabirds, WU LYF, Kadhja Bonet, and Young Widows) have followed or announced plans to do so.

What is Helsing—and what does it build?

Helsing is a Munich-based defense-tech firm founded in 2021. It began with AI software for perception, decision-support, and electronic warfare, and has expanded into hardware. The company markets the HX‑2 “AI strike drone,” described as a software‑defined loitering munition intended to engage artillery and armored targets at significant range—and kill people. It emphasizes resilience to electronic warfare, swarm/networked tactics via its Altra recon‑strike platform, and a human in/on the loop for critical decisions, and that limited role for humans in killing other humans is where it runs into Geneva Convention issues.   Trust me, they know this.

The HX‑2 Strike Drone

Beyond drones, Helsing provides AI electronic‑warfare upgrades for Germany’s Eurofighter EK (with Saab), and has been contracted to supply AI software for Europe’s Future Combat Air System (FCAS). Public briefings and reporting indicate an active role supporting Ukraine since 2022, and a growing UK footprint linked to defense modernization initiatives. In 2025, Ek’s investment firm led a major funding round that valued Helsing in the multibillion‑euro range alongside contracts in the UK, Germany, and Sweden.

So let’s be clear—Helsing is not making some super tourniquet or AI medical device that has a dual use in civilian and military applications.  This is Masters of War stuff.  Which, for Mr. Ek’s benefit, is a song.

Why artists care

For these artists, the issue isn’t abstract: they see a direct line between Spotify‑generated wealth and AI‑enabled lethality, especially as Helsing moves from software into weaponized autonomy at scale. That ethical conflict is why exit statements explicitly connect Dickensian streaming economics and streamshare thresholds to military investment choices.  In fact, it remains to be seen whether Spotify itself is using its AI products and the tech and data behind them for Helsing’s weapons applications.

How many artists have left?

There’s no official tally. Reporting describes a wave of departures and names specific acts. The list continues to evolve as more artists reassess their positions.

The financial impact—on Spotify vs. on artists

For Spotify, a handful of indie exits barely moves the needle. The reason is the pro‑rata or “streamshare” payout model: each rightsholder’s share is proportional to total streams, not a fixed per‑stream rate, unless you’re “lucky” enough to get a “greater of” formula. Remove a small catalog and its share simply reallocates to others. For artists, leaving can be meaningful—some replace streams with direct sales (Bandcamp, vinyl, fan campaigns) and often report higher revenue per fan. But at platform scale, the macro‑economics barely budge.
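The reallocation effect is easy to see with toy numbers. Here is a minimal Python sketch of a pro-rata pool; all figures are hypothetical, not Spotify’s actual pool or rates:

```python
# Illustrative sketch (hypothetical numbers): how a pro-rata
# "streamshare" pool behaves when a small catalog leaves.

def prorata_payouts(streams_by_artist, pool):
    """Each artist's payout is pool * (their streams / total streams)."""
    total = sum(streams_by_artist.values())
    return {a: pool * s / total for a, s in streams_by_artist.items()}

pool = 1_000_000.0  # hypothetical monthly royalty pool
streams = {"major_act": 90_000_000, "mid_act": 9_000_000, "indie_act": 1_000_000}

before = prorata_payouts(streams, pool)

# The indie act pulls its catalog. The pool is unchanged, so the
# departing share is simply redistributed to whoever remains.
streams.pop("indie_act")
after = prorata_payouts(streams, pool)

print(before["major_act"], after["major_act"])  # the major act's payout goes UP
```

Note that the pool itself never shrinks when an artist leaves; the exit is paid for, in effect, by no one, and captured by everyone who stays.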

Of course, because of Spotify’s tying relationships with talent buyers for venues (explicit or implicit), not being on Spotify can be the kiss of death for a new artist competing for a Wednesday night at a local venue when the venue checks your Spotify stats.

Why this is a cautionary tale for AI labs

Two practices make artist exits feel symbolically loud but structurally quiet—and, along with a third trend, they’re exactly what frontier AI should avoid:

1) Revenue‑share pools with opaque rules. Pro‑rata “streamshare” pushes smaller players toward zero; any exit just enriches whoever remains. AI platforms contemplating rev‑share training or retrieval deals should learn from this: user‑centric or usage‑metered deals with transparent accounting are more legible than giant, shifting pools.

2) NDA‑sealed terms. The streaming era normalized NDAs that bury rates and conditions. If AI deals copy that playbook—confidential blacklists, secret style‑prompt fees, unpublished audit rights—contributors will see protest as the only lever. Transparency beats backlash.

3) Weapons‑related use cases for AI. We all know that frontier labs like Google, Amazon, Microsoft and others are competing like trained seals for contracts from the Department of War. They use the same technology, trained on culture ripped off from artists, to kill people for money.

A clearer picture of Helsing’s products and customers

• HX‑2 AI Strike Drone: beyond‑line‑of‑sight strike profile, on‑board target re‑identification, EW‑resilient, swarm‑capable via Altra; multiple payload options; human in/on the loop.
• Eurofighter EK (Germany): with Saab, AI‑enabled electronic‑warfare upgrade for Luftwaffe Eurofighters oriented to SEAD/DEAD roles.
• FCAS AI Backbone (Europe): software/AI layer for the next‑generation air combat system under European procurement.
• UK footprint: framework contracting in the UK defense ecosystem, tied to strike/targeting modernization efforts.
• Ukraine: public reporting indicates delivery of strike drones; company statements reference activity supporting Ukraine since 2022.

The bigger cultural point

Whether you applaud or oppose war tech, the ethical through‑line in these protests is consistent: creators don’t want their work—or the wealth it generates—financing AI (especially autonomous) weaponry. Because the platform’s pro‑rata economics make individual exits financially quiet, the conflict migrates into public signaling and brand pressure.

What would a better model look like for AI?

• Opt‑in, auditable deals for creative inputs to AI models (training and RAG) with clear unit economics and published baseline terms.
• User‑centric or usage‑metered payouts (by contributor, by model, by retrieval) instead of a single, shifting revenue pool.
• Public registries and audit logs so participants can verify where money comes from and where it goes.
• No gag clauses on baseline rates or audit rights.
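To make the first bullet concrete, here is a toy comparison of a single shifting revenue pool versus a user-centric split. The subscribers, fees, and stream counts are hypothetical; the point is only the allocation logic:

```python
# Sketch (hypothetical numbers): user-centric payouts split each
# subscriber's fee across only the artists that subscriber played,
# instead of pooling all fees into one "streamshare" pot.

def user_centric(users):
    """users: list of (fee, {artist: streams}). Each fee follows its listener."""
    payouts = {}
    for fee, plays in users:
        total = sum(plays.values())
        for artist, s in plays.items():
            payouts[artist] = payouts.get(artist, 0.0) + fee * s / total
    return payouts

def pro_rata(users):
    """Same data, but all fees and all streams go into one big pool."""
    pool = sum(fee for fee, _ in users)
    totals = {}
    for _, plays in users:
        for artist, s in plays.items():
            totals[artist] = totals.get(artist, 0) + s
    grand = sum(totals.values())
    return {a: pool * s / grand for a, s in totals.items()}

# A heavy listener of a major act and a light listener of an indie act,
# each paying the same hypothetical $10 subscription.
users = [(10.0, {"major_act": 1000}), (10.0, {"indie_act": 10})]
print(pro_rata(users))      # the indie act is swamped by the heavy user's streams
print(user_centric(users))  # the indie act keeps its own fan's full fee
```

Under pro-rata, the light listener’s $10 is mostly redirected to the major act; under user-centric accounting, each fee stays with the artists that subscriber actually played.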

The strike against Spotify is about values as much as value. Ek’s bet on Helsing—drones, electronic warfare, autonomous weapons—makes those values impossible for some artists to ignore. Thanks to the pro‑rata royalty machine, the exits won’t dent Spotify’s bottom line—but they should warn AI platforms against repeating the same opaque rev‑shares and NDAs that leave creators feeling voiceless in streaming.

If you got one of these emails from Spotify, you might be interested

Spotify failed to consult any of the people who drive fans to the data abattoir: the musicians, artists, podcasters and authors.

Spotify has quietly tightened the screws on AI this summer—while simultaneously clarifying how it uses your data to power its own machine‑learning features. For artists, rightsholders, developers, and policy folks, the combination matters: Spotify is making it harder for outsiders to train models on Spotify data, even as it codifies its own first‑party uses like AI DJ and personalized playlists.

Spotify is drawing a bright line: no training models on Spotify; yes to Spotify training its own. If you’re an artist or developer, that means stronger contractual leverage against third‑party scrapers—but also a need to sharpen your own data‑governance and licensing posture. Expect other platforms in music and podcasting to follow suit—and for regulators to ask tougher questions about how platform ML features are audited, licensed, and accounted for.

Below is a plain‑English (hopefully) breakdown of what changed, what’s new or newly explicit, and the practical implications for different stakeholders.

Explicit ban on using Spotify to train AI models (third parties). 

Spotify’s User Guidelines now flatly prohibit “crawling” or “scraping” the service and, crucially, “using any part of the Services or Content to train a machine learning or AI model.” That’s a categorical no for bots and bulk data slurps. The Developer Policy mirrors this: apps using the Web API may not “use the Spotify Platform or any Spotify Content to train a machine learning or AI model.” In short: if your product ingests Spotify data, you’re in violation of the rules and risk enforcement and access revocation.

Spotify’s own AI/ML uses are clearer—and broad. 

The Privacy Policy (effective August 27, 2025) spells out that Spotify uses personal data to “develop and train” algorithmic and machine‑learning models to improve recommendations, build AI features (like AI DJ and AI playlists), and enforce rules. That legal basis is framed largely as Spotify’s “legitimate interests.” Translation: your usage, voice, and other data can feed Spotify’s own models.

The user content license is very broad. 

If you post “User Content” (messages, playlist titles, descriptions, images, comments, etc.), you grant Spotify a worldwide, sublicensable, transferable, royalty‑free, irrevocable license to reproduce, modify, create derivative works from, distribute, perform, and display that content in any medium. That’s standard platform drafting these days, but the scope—including derivative works—has AI‑era consequences for anything you upload to or create within Spotify’s ecosystem (e.g., playlist titles, cover images, comments).

Anti‑manipulation and anti‑automation rules are baked in. 

The User Guidelines and Developer Policy double down on bans against bots, artificial streaming, and traffic manipulation. If you’re building tools that touch the Spotify graph, treat “no automated collection, no metric‑gaming, no derived profiling” as table stakes—or risk enforcement, up to termination of access.

Data‑sharing signals to rightsholders continue. 

Spotify says it can provide pseudonymized listening data to rightsholders under existing deals. That’s not new, but in the ML context it underscores why parallel data flows to third parties are tightly controlled: Spotify wants to be the gateway for data, not the faucet you can plumb yourself.

What this means by role:

• Artists & labels: The AI‑training ban gives you a clear contractual hook against services that scrape Spotify to build recommenders, clones, or vocal/style models. Document violations (timestamps, IPs, payloads) and send notices citing the User Guidelines and Developer Policy. Meanwhile, assume your own usage and voice interactions can be used to improve Spotify’s models—something to consider for privacy reviews and internal policies.

• Publishers and collecting societies: The combination of “no third‑party training” + “first‑party ML training” is a policy trend to watch across platforms. It raises familiar questions about derivative data, model outputs, and whether platform machine learning features create new accounting categories—or require new audit rights—in future licenses.

• Policymakers: Read this as another brick in the “closed data/open model risk” wall. Platforms restrict external extraction while expanding internal model claims. That asymmetry will shape future debates over data‑access mandates, competition remedies, and model‑audit rights—especially where platform ML features may substitute for third‑party discovery tools.

Practical to‑dos

1) For rights owners: Add explicit “no platform‑sourced training” language in your vendor, distributor, or analytics contracts. Track and log known scrapers and third‑party tools that might be training off Spotify. Consider notice letters that cite the specific clauses.

2) For privacy and legal teams: Update DPIAs and data maps. Spotify’s Privacy Policy identifies “User Data,” “Usage Data,” “Voice Data,” “Message Data,” and more as inputs for ML features under legitimate interest. If you rely on Spotify data for compliance reports, make sure you’re only using permitted, properly aggregated outputs—not raw exports.

3) For users: I will be posting a guide to how to claw back your data. I may not hit everything, so I’m always open to suggestions about whatever else others spot.

Spotify’s terms give it very broad rights to collect, combine, and use your data (listening history, device/ads data, voice features, third-party signals) for personalization, ads, and product R&D. They also take a broad license to user content you upload (e.g., playlist art). 

Key cites

• User Guidelines: prohibition on scraping and on “using any part of the Services or Content to train a machine learning or AI model.”

• Developer Policy (effective May 15, 2025): “Do not use the Spotify Platform or any Spotify Content to train a machine learning or AI model…” Also bans analyzing Spotify content to create new/derived listenership metrics or user profiles for ad targeting.

• Privacy Policy (effective Aug. 27, 2025): Spotify uses personal data to “develop and train” ML models for recommendations, AI DJ/AI playlists, and rule‑enforcement, primarily under “legitimate interests.”

• Terms & Conditions of Use: very broad license to Spotify for any “User Content” you post, including the right to “create derivative works” and to use content by any means and media worldwide, irrevocably.

[A version of this post first appeared on MusicTechPolicy]

Spotify Makes Kate Nash’s Argument With the Usual Blame Game

Daniel Ek is indifferent to whether the economics of streaming cause artists to give up or actually starve to death. He’s already got the tracks and he’ll keep selling them forever like an evil self-licking ice cream cone.

Kate Nash is the latest artist to slam Spotify’s pathetic royalty payments even after the payola and the streaming manipulation with the Orwellian “Discovery Mode” as discovered by Liz Pelly. According to Digital Music News, Kate Nash says: 

“‘Foundations’ has over 100 million plays on Spotify — and I’m shocked I’m not a millionaire when I hear that! I’m shocked at the state of the music industry and how the industry has allowed this to happen,” said Nash. “We’re paid very, very, very poorly and unethically for our recorded music: it’s like 0.003 of a penny per stream. I think we should not only be paid fairly, but we should be paid very well. People love music and it’s a growing economy and there are plenty of millionaires in the industry because of that, and our music.”

But then she said the quiet part out loud that will get them right in their Portlandia hearts:

She added: “And what they’re saying to artists from non-rich privileged backgrounds, which is you’re not welcome here, you can’t do this, we don’t want to hear from you. Because it’s not possible to even imagine having a career if you don’t have a privileged background or a privileged situation right now.”

This, of course, comes the same time that Spotify board members have cashed out over $1 billion in stock including hundreds of millions to Daniel Ek personally, speaking of privilege.

Using forks and knives to eat their bacon

Spotify responds with the same old whine that starts with the usual condescending drivel, deflection and distraction:

“We’re huge fans of Kate Nash. For streams of her track ‘Foundations’ alone — which was released before Spotify existed — Spotify has paid out around half a million pounds in revenue to Kate Nash’s rights holders,” reads Spotify’s statement.

“Her most streamed songs were released via Universal Music Group. Spotify has no visibility over the deals that Kate signed with her rights holders. Therefore, we have no knowledge of the payment terms that were agreed upon between her and her partners.”

This is a very carefully worded statement–notice that they switch from the specific to the general and start talking about “her rights holders”. That no doubt means they are including the songwriters and publishers of the compositions, so that’s bullshit for starters. But notice how they are making Kate’s own argument here by trying to get you to focus on the “big check” that they wrote to Universal.

Well, last time I checked in the world of arithmetic, “around half a million pounds” (which means less than, but OK) divided by 100,000,000 streams is…wait for it…shite. £0.005 per stream–at the Universal level but all-in by the sound of it, i.e., artist share, label share, songwriters and publishers. This is why Spotify is making Kate’s argument at the same time they are trying to deflect attention onto Universal.
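The arithmetic is short enough to check in a couple of lines, using the round figures from Spotify’s own statement:

```python
# Back-of-the-envelope check of Spotify's statement: "around half a
# million pounds" paid out on roughly 100 million streams of "Foundations".
payout_gbp = 500_000
streams = 100_000_000
per_stream = payout_gbp / streams
print(f"£{per_stream} per stream")  # £0.005 per stream, all-in, before any splits
```

And that £0.005 is the all-in number before the label, publisher, and songwriter splits that determine what the artist herself actually sees.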

Then–always with an eye on the DCMS authorities in the UK and the UK Parliament–Spotify says:

“We do know that British artists generated revenues of over £750 million on Spotify alone in 2023 — a number that is on the rise year on year — so it’s disappointing to hear that Spotify’s payments are not making it through to Kate herself,” the company concluded.

Oh, so “disappointed.” Please spare us. What’s disappointing is that the streaming services participate in this charade where their executives make more in one day of stock trading than the company’s entire payments to UK artists and songwriters.

This race to the bottom is not lost on artists. Al Yankovic, a card-carrying member of the pantheon of music parodists from Tom Lehrer to Spinal Tap to The Rutles, released a hysterical video about his “Spotify Wrapped” account.

Al said he’d had 80 million streams and received enough cash from Spotify to buy a $12 sandwich.  This was from an artist who made a decades-long career from—parody.  Remember that–parody.

Do you think he really meant he actually got $12 for 80 million streams?  Or could that have been part of the gallows humor of calling out Spotify Wrapped as a propaganda tool for…Spotify?  Poking fun at the massive camouflage around the Malthusian algebra of streaming royalties gradually choking the life out of artists and songwriters? Gallows humor, indeed, because a lot of artists and especially songwriters are gradually collapsing as the algebra predicted.

The services took the bait Al dangled, seizing upon his video poking fun at how ridiculously low Spotify payments are to make the point that Al’s sandwich price couldn’t possibly reflect 80 million streams—and that if it did, it’s his label’s fault. Just like Spotify blaming Universal rather than taking responsibility for once in their lives.

Nothing if not on message, right? As Daniel Ek told MusicAlly, “There is a narrative fallacy here, combined with the fact that, obviously, some artists that used to do well in the past may not do well in this future landscape, where you can’t record music once every three to four years and think that’s going to be enough.” This is kind of like TikTok bragging about how few children hanged themselves in the latest blackout challenge compared to the number of all children using the platform. Pretty Malthusian. It’s not a fallacy; it’s all too true.

I’d suggest that Al and Kate Nash were each making the point–if you think of everyday goods, like bacon for example, in terms of how many streams you would have to sell in order to buy a pound of bacon, a dozen eggs, a gallon of gasoline, Internet access, or a sandwich in a nice restaurant, you start to understand that the joke really is on us. The best way to make a small fortune in the streaming business is to start with a large one. Unless you’re a Spotify executive, of course.

Does it have an index? @LizPelly’s Must-Read Investigation in “Mood Machine” Raises Deep Questions About Spotify’s Financial Integrity

Spotify Playlist Editors

By Chris Castle

If you don’t know of Liz Pelly, I predict you soon will. I’ve been a fan for years, but I really think that her latest work, Mood Machine: The Rise of Spotify and the Costs of the Perfect Playlist, coming in January from One Signal Publishers, an imprint of Atria Books at Simon & Schuster, will be one of those before and after books. Meaning the world you knew before reading the book was radically different from the world you know afterward. It is that insightful. And incriminating.

We are fortunate that Ms. Pelly has allowed Harper’s to excerpt Mood Machine in the current issue. I want to suggest that if you are a musician or care about musicians, or if you are at a record label or music publisher, or even if you are in the business of investing in music, you likely have nothing more important to do today than read this taste of the future. 

The essence of what Ms. Pelly has identified is the intentional and abiding manipulation of Spotify’s corporate playlists. She explains what called her to write Mood Machine:

Spotify, the rumor had it, was filling its most popular playlists with stock music attributed to pseudonymous musicians—variously called ghost or fake artists—presumably in an effort to reduce its royalty payouts. Some even speculated that Spotify might be making the tracks itself. At a time when playlists created by the company were becoming crucial sources of revenue for independent artists and labels, this was a troubling allegation.

What you will marvel at is the elaborate means Ms. Pelly has discovered–through dogged reporting worthy of the great deadline artists–that Spotify undertook to deceive users into believing that playlists were organic. And, it must be said, to deceive investors, too. As she tells us:

For years, I referred to the names that would pop up on these playlists simply as “mystery viral artists.” Such artists often had millions of streams on Spotify and pride of place on the company’s own mood-themed playlists, which were compiled by a team of in-house curators. And they often had Spotify’s verified-artist badge. But they were clearly fake. Their “labels” were frequently listed as stock-music companies like Epidemic, and their profiles included generic, possibly AI-generated imagery, often with no artist biographies or links to websites. Google searches came up empty.

You really must read Ms. Pelly’s excerpt in Harper’s for the story…and did I say the book itself is available for preorder now?

All this background manipulation–undisclosed and furtive manipulation by a global network of confederates–was happening while Spotify devoted substantial resources worthy of a state security operation to programming music in its own proprietary playlists. That programmed music was not only trivial and, to be kind, lowbrow, but also came essentially at no cost to Spotify. It’s not just that it was free, it was free in a particular way. In Silicon Valley-speak, Ms. Pelly has discovered how Spotify disaggregated the musician from the value chain.

What she has uncovered has breathtaking implications, particularly with the concomitant rise of artificial intelligence and its assault on creators. The UK Parliament’s House of Commons Digital, Culture, Media & Sport Committee’s Inquiry into the Economics of Music Streaming quoted me as saying “If a highly trained soloist views getting included on a Spotify “Sleep” playlist as a career booster, something is really wrong.” That sentiment clearly resonated with the Committee, but it was my feeble attempt at calling government’s attention to the then-only-suspected playlist grift going on at Spotify. Ms. Pelly’s book is a solid indictment–there’s that word again–of Spotify’s wild-eyed, drooling greed and public deception.

Ms. Pelly’s work raises serious questions about streaming payola and its fellow-travelers in the annals of crime. The last time this happened in the music business was with Fred Dannen’s 1991 book called Hit Men that blew the lid off of radio payola. That book also sent record executives running to unfamiliar places called “book stores” but for a particular reason. They weren’t running to read the book. They already knew the story, sometimes all too well. They were running to see if their name was in the index.

Like the misguided iHeart and Pandora “steering agreements” that nobody ever investigated which preceded mainstream streaming manipulation, it’s worth investigating whether Spotify’s fakery actually rises to the level of a kind of payola or other prosecutable offense. As the noted broadcasting lawyer David Oxenford observed before the rise of Spotify:

The payola statute, 47 USC Section 508, applies to radio stations and their employees, so by its terms it does not apply to Internet radio (at least to the extent that Internet Radio is not transmitted by radio waves – we’ll ignore questions of whether Internet radio transmitted by wi-fi, WiMax or cellular technology might be considered a “radio” service for purposes of this statute). But that does not end the inquiry. Note that neither the prosecutions brought by Eliot Spitzer in New York state a few years ago nor the prosecution of legendary disc jockey Alan Freed in the 1950s were brought under the payola statute. Instead, both were based on state law commercial bribery statutes on the theory that improper payments were being received for a commercial advantage. Such statutes are in no way limited to radio, but can apply to any business. Thus, Internet radio stations would need to be concerned.

Ms. Pelly’s investigative work raises serious questions of its own about the corrosive effects of fake playlists on the music community including musicians and songwriters. She also raises equally serious questions about Spotify’s financial reporting obligations as a public company.

For example, I suspect that if Spotify were found to be using deception to boost certain recordings on its proprietary playlists without disclosing this to the public, it could potentially raise issues under securities laws, including the Sarbanes-Oxley Act (SOX). SOX requires companies to maintain accurate financial records and disclose material information that could affect investors’ decisions.

Deceptive practices that mislead investors about the company’s performance or business practices could be considered a violation of SOX. Additionally, such actions could lead to investigations by regulatory bodies like the Securities and Exchange Commission (SEC) and potential legal consequences.

Publicly traded companies like Spotify are required to disclose “risk factors” in their public filings: potential events that could significantly impact Spotify’s business, financial condition, or operations. Ms. Pelly’s reporting raises issues that likely should be addressed in a risk factor. Imagine that risk factor in Spotify’s next SEC filing. It might read something like this:


Risk Factor: Potential Legal and Regulatory Actions

Spotify is currently under investigation for alleged deceptive practices related to the manipulation of Spotify’s proprietary playlists. If these allegations are substantiated, Spotify could face significant legal and regulatory actions, including fines, penalties, and enforcement actions by regulatory bodies such as the Securities and Exchange Commission (SEC) and the Federal Trade Commission (FTC). Such actions could result in substantial financial liabilities, damage to our reputation, and a loss of user trust, which could adversely affect our business operations and financial performance.


[A version of this post first appeared in MusicTech.Solutions]

@wordsbykristin: Legal Fights, Transparency & Neutrality: DiMA’s CEO On Improvements Streamers Suggest for the MLC

Kristin Robinson makes another important contribution to the artist rights conversation with her interview of Graham Davies, the new head of the Digital Media Association. Graham comes to DiMA from a background in the artist rights movement at our friends the Ivors Academy in the UK. We have high hopes for Graham, who brings his intellect to clean up a long, long line of mediocrity in DiMA leadership, who are from Washington and here to help.

Kristin’s interview highlights DiMA’s recent filings in The Reup–the redesignation of the MLC by the Copyright Office that we’ve highlighted on Trichordist. Graham also has some well-thought-out analysis on how the MLC is not HFA, however similar the two may seem in practice.

This is an important interview and you can find it on Billboard (subscription required).

Here’s an example of Graham’s insight:

Do you think a re-designation every five years is not enough on its own?

I think it’ll be interesting to see what the re-designation process brings forward from the Copyright Office. Maybe the Copyright Office leans in on governance and says, “We’ve heard enough, and we can come forward with ideas.” But the re-designation process is a different thing than a governance review, which would bring in a special team to actually dig into governance-related issues and bring forward recommendations and proposals that could then be implemented. It would be something more specific and something the MLC could just do. You wouldn’t need the Copyright Office to sponsor it, though they could if they wanted to.