Say No to Suno

Late last year, thieves disguised as construction workers broke into the Louvre during broad daylight, grabbed more than $100 million worth of crown jewels, and roared off on their motorbikes into the busy streets of Paris. While some of those thieves were later arrested, the jewelry they stole has yet to be recovered, and many fear those historic works of artistry have already been recut, reset, and resold.

Closer to home, but no less nefarious, is the brazen rip-off of artists enabled by irresponsible AI, whose profiteers are recutting, remixing, and reselling original works of artistry as something new.  The hijacking of the world’s entire treasure-trove of music floods platforms with AI slop and dilutes the royalty pools of legitimate artists from whose music this slop is derived. 

Meanwhile, those who are promoting this new business model are operating in broad daylight, too – minus the yellow safety vests.  Case in point: AI music company Suno, the brazen “smash and grab” platform whose “Make it Music” ad campaign suggests that the most personal and meaningful forms of music can now be fabricated by its unauthorized AI platform machinery trained on human artists’ work. 

How significant is this activity?  Publicly revealed data says Suno is used to generate 7 million tracks a day, a massive quantity that suggests a dominant market share of AI tracks.  According to recent reports, Deezer “deems 85% of streams of fully AI-generated tracks [on its service] to be fraudulent,” and that such tracks include outputs from major generative models.  As JP Morgan’s analysts said, Deezer’s data “should be indicative of the broader market.”  Suno has yet to demonstrate persuasively that its platform does not, in practice, serve as a scalable input into streaming-fraud schemes — raising a serious concern that Suno has, in effect, become a fraud-fodder factory on an industrial scale.

In a February 2 LinkedIn post, Paul Sinclair, Suno’s Chief Music Officer, claims that his company’s platform is about “empowerment” that enables “billions of fans to create and play with music.”  He argues that closed systems are “walled gardens” that deny people access to the full joy of music.

Ironically, Sinclair’s choice of analogy undermines his own argument.  Ask yourself: just why are most gardens surrounded by fences or walls?  To keep out rabbits, deer, raccoons and wild pigs seeking a free lunch.  We cultivate, nurture and protect our gardens precisely because that makes them much more productive over the long run.

While Sinclair may be loath to admit it, AI is fundamentally different from past disruptive innovations in the music industry.  The phonograph, cassettes, CDs, MP3s, downloads, streaming – all these technologies were about the reproduction and distribution of creative work.  By contrast, irresponsible AI like Suno appropriates and plunders such creative work while undermining the commercial ecosystem for artists.

Think back to the days of Napster.  What brought the music industry back from the ruinous abyss of unfettered digital piracy?  It was the very “closed systems” that Sinclair derides as exclusionary.  At least streaming platforms maintain access controls and content management systems that enable creator compensation, even if the economic outcomes for many creators remain inadequate.  Should we be against Apple Music, Spotify, Deezer, YouTube Music, and Amazon Music?  What about Netflix, Disney+ and HBO, too, while we’re at it?

At its core, Sinclair’s argument is just a tired remix of the old trope that “information wants to be free.”  What that really means is: “We want your music for free.”

Artists need to understand Suno’s game.  They are not putting technology in the service of artists; they are putting artists in the service of their technology.  Every time artists’ creations are used by the platform, those creations are unwittingly contributed to the creation of endless derivatives of artists’ own work, not to mention AI slop, with limited or no remuneration back to the human creators.  Suno built its business on our backs, scraping the world’s cultural output without permission, then competing against the very works it exploited.

It’s also important to keep in mind that using Suno to generate audio output calls into question the copyrightability of whatever Suno creates.  Copyright authorities around the world, including the US Copyright Office, have been clear that generative AI outputs are largely ineligible for copyright – meaning the economic value of the Suno creation lies solely with Suno, not with the artist using it.  The only ones gaining empowerment from Suno are Suno themselves.

Many in our community are embracing responsible AI as a tool for creation, and as a means for fans to explore and interact with our artistry.  That’s wonderful.  But it’s not the same as creating an environment where AI-generated works sourced from our music are mass distributed to dilute our royalties or, worse yet, reward those actively seeking to commit fraud.  Artists need to know the difference – not all AI platforms are the same, and Suno, which is being sued for copyright infringement, is not a platform artists should trust.

Responsible AI-generated music must evolve within a framework that respects and remunerates artists, enhances human creativity rather than supplants it, and empowers fans to engage with the music they love.  At the same time, AI services must preclude mass distribution of slop and prevent fraudsters from destroying the very ecosystem that has been built to reward and sustain artists and audiences alike.

All of us, including billions of music fans, share an urgent, deep and abiding interest in protecting and rewarding human genius, even as AI continues to change our industry and the world in unimaginable ways.  So in 2026, even as the Louvre continues to revamp its own approach to security, we in the arts must rise to confront those who would “smash-and-grab” our creativity for their own benefit.

Together, while embracing innovation, we must work to establish more effective safeguards – both legal and technological – that better promote and protect all creative artists, our intellectual property, and the spark of human genius.

Say no to Suno. Say yes to the beauty and bounty of the gardens that feed us all.

Signed: 

Ron Gubitz, Executive Director, Music Artist Coalition

Helienne Lindvall, Songwriter and President, European Composer and Songwriter Alliance

David C. Lowery, Artist and Editor, The Trichordist

Tift Merritt, Artist, Practitioner in Residence, Duke University, and Artist Rights Alliance Board Member

Blake Morgan, Artist, Producer, and President, ECR Music Group

Abby North, President, North Music Group

Chris Castle, Artist Rights Institute

Synthetic Emotion from The Music Department: Suno’s Unsettling Ad Campaign and the Return of Orwell’s Machine-Made Culture from 1984

In George Orwell’s 1984, the “versificator” was a machine designed to produce poetry, songs, and sentimental verse synthetically, without human thought or feeling. Its purpose was not artistic expression but industrial-scale cultural production—filling the air with endless, disposable content to occupy attention and shape perception. More than seventy-five years later, the comparison to modern generative music systems such as Suno is difficult to ignore. While the technologies differ dramatically, the underlying question is strikingly similar: what happens when music is produced by machines at scale rather than by human experience?

Orwell’s versificator was built for scale, not meaning (reminding you of anyone?). It generated formulaic songs for the masses, optimized for emotional familiarity rather than originality. Suno, by contrast, uses sophisticated machine learning trained on vast corpora of human-created music to generate complete recordings on demand that would be the envy of Big Brother’s Music Department. Suno can reportedly generate millions of tracks per day, a level of output impossible in any human-centered musical economy. When music becomes infinitely reproducible, the limiting factor shifts from creation to distribution and attention—precisely the dynamic Orwell imagined.

Nothing captures the versificator analogy more vividly than Suno’s own dystopian-style “first kiss” advertising campaign. In one widely circulated spot, the product is promoted through a stylized, synthetic emotional narrative that emphasizes instant, machine-generated musical clichés untethered from human musicians, vocalists, or composers. The message is not about artistic struggle, collaboration, or lived expression—it is about mediocre, frictionless production. The ad unintentionally echoes Orwell’s warning: when culture can be manufactured instantly, expression becomes simulation. And on top of it, those ads are just downright creepy.

The versificator also blurred authorship. In 1984, no individual poet existed behind the machine’s output; creativity was subsumed into a system. Suno raises a comparable question. If a system trained on thousands or millions of human performances produces a new track, where does authorship reside? With the user who typed a prompt? With the engineers who built the model? With the countless musicians whose expressive choices shaped the training data? Or nowhere at all? This diffusion of authorship challenges long-standing cultural and legal assumptions about what it means to “create” music.

Another parallel lies in standardization. The versificator produced content that was emotionally predictable—pleasant, familiar, subservient and safe. Generative music systems often display a similar gravitational pull toward the stylistic averages embedded in their training data, averaging expression into pablum. The result can be competent, even polished output that nevertheless lacks the unpredictability, risk, and individual voice associated with human artistry. Orwell’s concern was not that machine-generated culture would be bad, but that it would be flattened—replacing lived expression with algorithmic imitation. Substitutional, not substantial.

There is also a structural similarity in scale and economics. The versificator’s value to The Party lay in its ability to replace human labor in cultural production and to force the creation of projects that humans would find too creepy. Suno and similar systems raise analogous questions for modern musicians, particularly session players and composers whose work historically formed the backbone of recorded music. When a single system can generate instrumental tracks, arrangements, and stylistic variations instantly, the economic pressure on human contributors becomes obvious. Orwell imagined machines replacing poets; today the substitution pressure may fall first on instrumental performance, arrangement, sound design, and production roles.

Yet the comparison has limits, and those limits matter. The versificator was a tool of centralized control in a dystopian state, designed to narrow human thought. Suno operates in a pluralistic technological environment where many artists themselves experiment with AI as a creative instrument. Unlike Orwell’s machine, generative music systems can be used collaboratively, interactively, and sometimes in ways that expand rather than suppress creative exploration. The technology is not inherently dystopian; its impact depends on how institutions, markets, and creators choose to shape it.

A deeper difference lies in intention. Orwell’s versificator was never meant to create art; it was meant to simulate it. Modern generative music systems are often framed as tools that can assist, augment, or inspire human creativity. Some artists use AI to prototype ideas, explore unfamiliar styles, or generate textures that would be difficult to produce otherwise. In these contexts, the machine functions less like a replacement and more like a new instrument—one whose cultural role is still evolving.

Still, Orwell’s versificator is highly relevant to understanding Suno’s corporate direction. When cultural production becomes industrialized, quantity can overwhelm meaning. The risk is not merely that machine-generated music exists, but that its scale reshapes attention, value, and recognition. If millions of synthetic tracks flood listening environments, as is happening with some large DSPs, the signal of individual human expression may become harder to perceive—even if human creativity continues to exist beneath the surface.

The comparison between Suno and the versificator symbolizes the moment when technology challenges the boundaries of authorship, creativity, and cultural labor. Orwell warned of a world where machines produced endless culture without human voice. Today’s question is subtler: can society integrate generative systems in ways that preserve the distinctiveness of human expression rather than dissolving it into algorithmic slop?

The answer will not come from technology alone. It will depend on choices—legal, cultural, and economic—about how machine-generated music is labeled, valued, and integrated into the broader creative ecosystem. Orwell imagined a future where the machine replaced the poet. The task now is to ensure that, even in an age of generative AI, the human voice remains audible.

Don’t Sell What You Don’t Have: Why AB 1349’s Crackdown on Speculative Event Tickets Matters to Touring Artists and Fans

Update: AB 1349 passed the California Assembly and now moves on to the Senate.

I rely on ticket revenue to pay my band and crew, and I depend on trust—between me and my fans—for my career to work at all. That’s why I support California’s AB 1349. At its core, this bill confronts one of the most corrosive practices in touring: speculative ticketing.

Speculative ticketing isn’t normal resale. It’s when sellers list tickets they don’t actually own and may never acquire. These listings often appear at inflated prices on reseller markets before tickets even go on sale, with no guarantee the seller can deliver the seat. In other words, it’s selling a promise, not a ticket. Fans may think they bought a ticket, but what they’ve really bought is a gamble that the reseller can later obtain the seat—usually at a lower price—and flip it to them while the reseller marketplace looks the other way.

Here’s how it works in practice. A reseller posts a listing, sometimes even a specific section, row, and seat, before they possess anything. The marketplace presents that listing like real inventory: seat maps, countdown timers, “only a few left” banners. That creates artificial scarcity before a single legitimate ticket has even been sold. Once tickets go on sale, the reseller tries to “cover” the sale—buying tickets during the onsale (often using bots or multiple accounts), buying from other resellers who did secure inventory, or substituting some “comparable” seat at an arbitrage price if the promised one doesn’t exist. If they can source lower than what they sold to the fan, they pocket the difference.

When that gamble fails, the risk gets dumped on the fan. Prices jump. Inventory really sells out. The reseller can’t deliver. What follows is a last-minute cancellation, a refund that arrives too late to help, a downgrade to worse seats, or a customer-service maze between the seller and the platform. Fans blame artists even if the artists had nothing to do with the arbitrage. I’ve seen fans get priced out because listings appeared online that had nothing to do with the actual onsale.  The reseller and the marketplace profit while the fan, artist and venue suffer.

AB 1349 draws a bright-line rule that should have existed years ago: if you don’t actually have the ticket—or a contractual right to sell it—you can’t list it. That single principle collapses the speculative model. You can’t post phantom seats or inflate prices using imaginary inventory. It doesn’t ban resale. It doesn’t cap prices. It does stop a major source of fraud.

The bill also tackles the deception that makes speculative ticketing profitable. Fake “sold out” claims, copycat websites that look like official artist or venue pages, and listings that bury or hide face value all push fans into rushed, fear-based decisions. AB 1349 requires transparency about whether a ticket is a resale, what the original face price was, and what seat is actually being offered. That information lets fans make rational choices—and it reduces the backlash that inevitably lands on performers and venues when fans feel tricked.

Bots and circumvention tools are another part of the speculative pipeline. Artists and venues spend time and money designing fair onsales, presales for fan clubs, and purchase limits meant to spread tickets across real people. Automated systems that evade those limits defeat the entire purpose, feeding inventory into speculative listings within seconds. AB 1349 doesn’t outlaw resale; it targets the deliberate technological abuse that turns live music into a high-speed extraction game.

I also support the bill’s enforcement structure. This isn’t about turning fans into litigants or flooding courts. It’s about giving public enforcers real tools to police a market that has repeatedly shown it won’t self-regulate.

AB 1349 won’t fix everything overnight. But by stopping people from selling what they don’t have, it moves ticketing back toward a system built on possession, truth, and accountability. If every state prohibited speculative ticketing, it would largely disappear because resale would finally be backed by real inventory. For fans who just want to see the music they love—that’s not radical. It’s essential.

[This post first appeared on Hypebot]

Stealing Isn’t Innovation!

Don’t let the so-called “AI czar” sell you the idea that changing the law to legalize taking artists’ work without consent is innovation. It isn’t.

Innovation creates new value. The AI boondoggle takes existing value from creators and communities and hands it to a small number of tech companies—without permission, without payment, and without accountability but with a nuclear reactor next to your house.

Artists aren’t raw material. They’re rights-holders under U.S. law. Rewriting those rights to subsidize AI business models isn’t progress—it’s a policy choice to reward theft at scale.

AI can thrive without gutting creative rights. But that requires consent, licensing, and fair compensation—not retroactive immunity dressed up as innovation.

Stealing isn’t innovation. It’s just stealing, with a press strategy.

Find out more at Stealing Isn’t Innovation and @human_artistry

Victory in the Vetter v. Resnik Lawsuit: Artist Rights, Songwriter Advocacy, and the Power of Termination

At the center of Vetter v. Resnik are songwriters reclaiming what Congress promised them in a detailed and lengthy legislative negotiation over the 1976 revision to the Copyright Act—a meaningful second chance to terminate what the courts call “unremunerative transfers,” aka crappy deals. That principle comes into sharp focus through Cyril Vetter, whose perseverance brought this case to the Fifth Circuit, and Cyril’s attorney Tim Kappel, whose decades-long advocacy for songwriter rights helped frame the issues not as abstractions, but as lived realities.

Cyril won his case against his publisher at trial in a landmark judicial ruling by Chief Judge Shelly Dick. His publisher appealed Judge Dick’s ruling to the Fifth Circuit. As readers will remember, oral arguments in the case were heard earlier this year. A bunch of songwriter and author groups including the Artist Rights Institute filed “friend of the court” briefs in the case in favor of Cyril.

In a unanimous opinion, the United States Court of Appeals for the Fifth Circuit affirmed Judge Shelly Dick’s carefully reasoned trial-court ruling, holding that when an author terminates a worldwide grant, the recapture is also worldwide. It is not artificially limited to U.S. territory, which had been the industry practice. The court understood that anything less would hollow out Congress’s intent.

It is often said that the whole point of the termination law is to give authors (including songwriters) a “second bite at the apple,” which is why the Artist Rights Institute wrote (and was quoted by the Fifth Circuit) that limiting the reversion to US rights only is a “second bite at half the apple,” the opposite of Congressional intent.

Fifth Circuit opinion, 25-30108 (2026-01-12) [Download]



What made this Fifth Circuit decision especially meaningful for the creative community is that the court did not reach it in a vacuum. Writing for the panel, Judge Carl Stewart expressly quoted the Artist Rights Institute amicus brief, observing:

“Denying terminating authors the full return of a worldwide grant leaves them with only half of the apple—the opposite of congressional intent.”

That sentence—simple, vivid, and unmistakably human—captured what this case has always been about.

ARI Amicus Brief in Vetter (Final) [Download]



The Artist Rights Institute’s amicus brief did not appear overnight. It grew out of a longstanding relationship between songwriter advocate Tim Kappel and Chris Castle, a collaboration shaped over many years by shared concern for how statutory rights actually function—or fail to function—for creators in the real world.

When the Vetter appeal crystallized the stakes, that history mattered. It allowed ARI to move quickly, confidently, and with credibility—translating dense statutory language into a narrative to help courts understand that termination rights are supposed to restore leverage, not preserve a publisher’s foreign control veto through technicalities.

Crucially, the brief was inspired and strengthened by the voices of songwriter advocates and heirs, including Abby North (heir of composer Alex North), Blake Morgan (godson of songwriter Lesley Gore), and Angela Rose White (heir of legendary music director David Rose) and of course David Lowery and Nikki Rowling. The involvement of these heirs ensured the court understood context—termination is not merely about renegotiating deals for living authors. It is often about families, estates, and heirs—people for whom Congress explicitly preserved termination rights as a matter of intergenerational fairness.

The Fifth Circuit’s opinion reflects that understanding. By rejecting a cramped territorial reading of termination, the court avoided a result that would have undermined heirs’ rights just as surely as authors’ rights.

Vetter v. Resnik represents a rare and welcome alignment: an author willing to press his statutory rights all the way, advocates who understood the lived experience behind those rights, a district judge who took Congress at its word, and an appellate court willing to say plainly that “half of the apple” is not enough.

For the Artist Rights Institute, it was an honor to participate—to stand alongside Cyril Vetter, Tim Kappel, and the community of songwriter advocates and heirs whose experiences shaped a brief that helped the court see the full picture.

And for artists, songwriters, and their families, the decision stands as a reminder that termination rights mean what Congress said they mean—a real chance to reclaim ownership, not an illusion bounded by geography.

@ArtistRights Institute Newsletter 01/05/26: Grok Can’t Control Itself, CRB V Starts, Data Center Rebellion, Sarah Wynn-Williams Senate Testimony, Copyright Review


Phonorecords V Commencement Notice: Government setting song mechanical royalty rates

The Copyright Royalty Judges announce the commencement of a proceeding to determine reasonable rates and terms for making and distributing phonorecords for the period beginning January 1, 2028, and ending December 31, 2032. Parties wishing to participate in the rate determination proceeding must file their Petition to Participate and the accompanying $150 filing fee no later than 11:59 p.m. eastern time on January 30, 2026. Deets here.

US Mechanical Rate Increase

Songwriters Will Get Paid More for Streaming Royalties Starting Today (Erinn Callahan/AmericanSongwriter)

CRB Sets 2026 Mechanical Rate at 13.1¢ (Chris Castle/MusicTechPolicy)

Spotify’s Hack by Anna’s Archive

No news. Biggest music hack in history still stolen.

MLC Redesignation

The MMA’s Unconstitutional Unclaimed Property Preemption: How Congress Handed Protections to Privatize Escheatment (Chris Castle/MusicTechPolicy)

Under the Radar: Data Center Grass Roots Rebellion

Data Center Rebellion (Chris Castle/MusicTechSolutions)

The Data Center Rebellion is Here and It’s Reshaping the Political Landscape (Washington Post)

Residents protest high-voltage power lines that could skirt Dinosaur Valley State Park (ALEJANDRA MARTINEZ AND PAUL COBLER/Texas Tribune)

US Communities Halt $64B Data Center Expansions Amid Backlash (Lucas Greene/WebProNews)

Big Tech’s fast-expanding plans for data centers are running into stiff community opposition (Marc Levy/Associated Press)

Data center ‘gold rush’ pits local officials’ hunt for new revenue against residents’ concerns (Alander Rocha/Georgia Record)

AI Policy

Meet the New AI Boss, Worse Than the Old Internet Boss (Chris Castle/MusicTechPolicy)

Deloitte’s AI Nightmare: Top Global Firm Caught Using AI-Fabricated Sources to Support its Policy Recommendations (Hugh Stephens/Hugh Stephens Blog)

Grok Can’t Stop AI Exploitation of Women

Facebook/Meta Whistleblower Testifies at US Senate

Copyright Case 2025 Review

Year in Review: The U.S. Copyright Office (George Thuronyi/Library of Congress)

Copyright Cases: 2025 Year in Review (Rachel Kim/Copyright Alliance)

AI copyright battles enter pivotal year as US courts weigh fair use (Blake Brittain/Reuters)

2026 Music Predictions: The Legal and Policy Fault Lines Ahead

By Chris Castle

I was grateful to Hypebot for publishing my 2026 music‑industry predictions, which focused on the legal and structural pressures already reshaping the business. For regular readers, I’m reposting those predictions here—and adding a few more that follow directly from the policy work, regulatory engagement, and royalty‑system scrutiny we’ve been immersed in over the past year with the Artist Rights Institute. These additional observations are less about trend‑spotting and more about where the underlying legal and institutional logic appears to be heading next.

1. AI Copyright Litigation Will Move From Abstract Theory to Operational Discovery

In 2026, the center of gravity in AI‑copyright cases will shift toward discovery that exposes how models are trained, weighted, filtered, and monetized. Courts will increasingly treat AI systems as commercial products rather than research experiments for the good of humanity…ahem…and the real fights will be over discovery rather than summary judgment rhetoric. The result will be pressure on platforms to settle, license, or restructure before full disclosure occurs, particularly since it’s becoming increasingly likely that every frontier AI lab has ripped off the world’s culture the old fashioned way—they stole it off the Internet.

The next round of AI copyright litigation will come from fans: As more deals are done with AI companies like the Disney/Sora deal, fans who use Sora or other AI to create separable rights (like new characters or new story lines) or even new universes with old story lines (like maybe new versions of the Luke/Darth/Han/Leia arc in the Old West) will start to get the idea that their IP is…well…their IP. If it’s used without compensating them or getting their permission, that whole copyright thing is going to start to get real for them.

2. Streaming Platforms Will Face Structural Payola Scrutiny, Not Just Royalty Complaints

Minimum‑payment thresholds, bundled offerings, and “greater‑of” formulas will no longer be treated as isolated business choices. Regulators and courts will begin to examine how these mechanisms function together to shift risk onto artists while preserving platform margins. Antitrust, consumer‑protection, and unfair‑competition theories will increasingly converge around the same conduct. Due to Spotify’s market dominance and intimidation factor for majors and big to medium-sized independent labels, these cases will have to come from independent artists.

3. The Copyright Office Will Approve a Conditional Redesignation of the MLC

Rather than granting an unconditional redesignation of the Mechanical Licensing Collective, the Copyright Office is likely to impose conditions tied to governance, transparency, and financial stewardship. This approach allows continuity for licensees while asserting supervisory authority grounded in the statute. The message will be clear: designation is provisional, not permanent.

Digital-Licensing-Coordinator-to-USCO-2-Sept-22-2025Download

4. The MLC’s Gundecked Investment Policy Will Be Unwound or Materially Rewritten

The practice of investing unmatched royalties as a pooled asset is becoming legally and politically indefensible. In 2026, expect the investment policy to be unwound or rewritten by new regulations requiring pass‑through of gains or strict capital‑preservation limits. Once framed as a fiduciary issue rather than a finance strategy, the current model cannot survive intact.

It’s also worth noting that the MLC’s investment portfolio has grown so large ($1.212 billion) that its investment income reported on its 2023 tax return has also grown to an amount in excess of its operating costs as measured by the administrative assessment paid by licensees.

5. An MLC Independent Royalty‑Accounting and Systems Review Will Become Inevitable

As part of a conditional redesignation, the Copyright Office may require an end‑to‑end operational review of the MLC by a top‑tier royalty‑accounting firm. Unlike a SOC report, such a review would examine whether matching, data logic, and distributions actually produce correct outcomes. Once completed, that analysis would shape litigation, policy reform, and future oversight.

6. Foreign CMOs Will Push Toward Licensee‑Pays Models

Outside the U.S., collective management organizations face rising technology costs and political scrutiny over compensation. In response, many will explore shifting more costs to licensees rather than members, reframing CMOs as infrastructure providers. Ironically, the U.S. MLC experiment may accelerate this trend abroad given the MLC’s rich salaries and vast resources for developing poorly implemented tech.

These developments are not speculative in the abstract. They follow from incentives already in motion, records already being built, and institutions increasingly unable to rely on deference alone.

7.  Environmental Harms of AI Become a Core Climate Issue

We will start to see the AI labs normalize the concept of private energy generation on a massive scale to support data centers built in current green spaces.  They will build or buy electric plants they do not intend to share.  This whole story about how they will build small nuclear reactors and sell excess power back to the local grid is crazy—there won’t be any excess, and what about their behavior over the last 25 years makes you think they’ll share a thing?

So some time after Los Angeles rezones Griffith Park commercial and sells the Greek Theater to Google for a new data center and private nuclear reactor and Facebook buys the Diablo Canyon reactor, the Music Industry Climate Collective will formally integrate AI’s ecological footprint into their national and international policy agendas. After mounting evidence of data‑center water depletion, aquifer stress, and grid destabilization — particularly in drought‑prone regions — climate coalitions will conceptually reclassify AI infrastructure as a high‑impact industrial activity.

This will become acute after people realize they cannot expect the state or federal government to require new state permitting regimes because of the overwhelming political influence of Big Tech in the form of AI Viceroy-for-Life David Sacks. (He’s not going anywhere in a post-Trump era.) This will lead to environmental‑justice litigation over siting decisions and pressure to require reporting of AI‑related energy, water, and land use.

8.  Criminal RICO Case Against StubHub and Affiliated Resale Networks

By late 2026, the Department of Justice brings a landmark criminal RICO indictment targeting StubHub‑linked reseller networks and individual reseller financiers for systemic ticketing fraud and money laundering. The enterprise theory alleges that major resellers, platform intermediaries, lenders, and bot‑operators coordinated to engage in wire fraud, market manipulation, speculative ticketing, and deceptive consumer practices at international scale. Prosecutors present evidence of an organized structure that used bots, fabricated scarcity, misrepresentation of seat availability, and price‑fixing algorithms to inflate profits.

This becomes the first major criminal RICO prosecution in the secondary‑ticketing economy and triggers parallel state‑level investigations and civil RICO suits. Public resellers like StubHub will face shareholder lawsuits and securities fraud allegations.

Just another bright sunshiny day.

[A version of this post first appeared on MusicTechPolicy]




What Don Draper Knew That AI Forgot: Authorship, Ownership, and Advertising

David is pointing to a quiet but serious problem hiding behind the rush to use generative AI in advertising, film, and television: copyright law protects authorship, not outputs. In some cases, AI muddies or even erases authorship altogether.

Under current U.S. Copyright Office guidance, works generated primarily by AI are often not registrable with the Copyright Office because they lack a human author exercising creative control. That means a brand that relies on AI to generate a commercial may not actually own exclusive rights in the finished work. If someone copies, remixes, or repurposes that ad, even in a way that damages the brand, the company may have little or no legal recourse under copyright law.

The Copyright Office guidance says:

In the Office’s view, it is well-established that copyright can protect only material that is the product of human creativity. Most fundamentally, the term “author,” which is used in both the Constitution and the Copyright Act, excludes non-humans. The Office’s registration policies and regulations reflect statutory and judicial guidance on this issue…. If a work’s traditional elements of authorship were produced by a machine, the work lacks human authorship and the Office will not register it. For example, when an AI technology receives solely a prompt from a human and produces complex written, visual, or musical works in response, the “traditional elements of authorship” are determined and executed by the technology—not the human user.

The risk David has identified is not theoretical. Copyright is the backbone of brand control in media. It’s what allows companies to stop misuse, dilution, parody-turned-weapon, or hostile appropriation. In the US, a copyright registration is required to enforce those rights. Remove that protection, and brands are left relying on weaker tools like trademark or unfair competition law, which are narrower, slower, and often ill-suited to digital remix culture.

David’s warning extends beyond ads. Film and TV studios experimenting with AI-generated scripts, scenes, music, or visuals may be undermining their own ability to control, license, or defend those works. In trying to save money upfront, they may be giving up the legal leverage that protects their brand, reputation, and long-term value.