@ArtistRights Institute Newsletter 01/05/26: Grok Can’t Control Itself, CRB V Starts, Data Center Rebellion, Sarah Wynn-Williams Senate Testimony, Copyright Review


Phonorecords V Commencement Notice: Government setting song mechanical royalty rates

The Copyright Royalty Judges announce the commencement of a proceeding to determine reasonable rates and terms for making and distributing phonorecords for the period beginning January 1, 2028, and ending December 31, 2032. Parties wishing to participate in the rate determination proceeding must file their Petition to Participate and the accompanying $150 filing fee no later than 11:59 p.m. eastern time on January 30, 2026. Deets here.

US Mechanical Rate Increase

Songwriters Will Get Paid More for Streaming Royalties Starting Today (Erinn Callahan/AmericanSongwriter)

CRB Sets 2026 Mechanical Rate at 13.1¢ (Chris Castle/MusicTechPolicy)

Spotify’s Hack by Anna’s Archive

No news. Biggest music hack in history still stolen.

MLC Redesignation

The MMA’s Unconstitutional Unclaimed Property Preemption: How Congress Handed Protections to Privatize Escheatment (Chris Castle/MusicTechPolicy)

Under the Radar: Data Center Grass Roots Rebellion

Data Center Rebellion (Chris Castle/MusicTechSolutions)

The Data Center Rebellion is Here and It’s Reshaping the Political Landscape (Washington Post)

Residents protest high-voltage power lines that could skirt Dinosaur Valley State Park (Alejandra Martinez and Paul Cobler/Texas Tribune)

US Communities Halt $64B Data Center Expansions Amid Backlash (Lucas Greene/WebProNews)

Big Tech’s fast-expanding plans for data centers are running into stiff community opposition (Marc Levy/Associated Press)

Data center ‘gold rush’ pits local officials’ hunt for new revenue against residents’ concerns (Alander Rocha/Georgia Record)

AI Policy

Meet the New AI Boss, Worse Than the Old Internet Boss (Chris Castle/MusicTechPolicy)

Deloitte’s AI Nightmare: Top Global Firm Caught Using AI-Fabricated Sources to Support its Policy Recommendations (Hugh Stephens/Hugh Stephens Blog)

Grok Can’t Stop AI Exploitation of Women

Facebook/Meta Whistleblower Testifies at US Senate

Copyright Case 2025 Review

Year in Review: The U.S. Copyright Office (George Thuronyi/Library of Congress)

Copyright Cases: 2025 Year in Review (Rachel Kim/Copyright Alliance)

AI copyright battles enter pivotal year as US courts weigh fair use (Blake Brittain/Reuters)

NYT: Silicon Valley’s Man in the White House Is Benefiting Himself and His Friends


The New York Times published a sprawling investigation into David Sacks’s role as Trump’s A.I. and crypto czar. We’ve talked about David Sacks a few times on these pages. The Times’ piece is remarkable in scope and reporting: a venture capitalist inside the White House, steering chip policy, promoting deregulation, raising money for Trump, hosting administration events through his own podcast brand, and retaining hundreds of A.I. and crypto investments that stand to benefit from his policy work.

But for all its detail, the Times buried the lede.

The bigger story isn’t just ethics violations or outright financial corruption. It’s that Sacks is simultaneously shaping and shielding the largest regulatory power grab in history: the A.I. moratorium and its preemption structure.

Of all the corrupt anecdotes in the New York Times’ must-read article regarding Viceroy and leading Presidential pardon candidate David Sacks, they left out the whole AI moratorium scam, focusing instead on the more garden-variety self-dealing and outright conflicts of interest, which are legion. My bet is that Mr. Sacks reeks so badly that it is hard to know what to leave out. Here are a couple of examples:


There is a deeper danger that the Times story never addresses: the long-term damage that will outlive David Sacks himself. Even if Sacks eventually faces investigations or prosecution for unrelated financial or securities matters — if he does — the real threat isn’t what happens to him. It’s what happens to the legal architecture he is building right now.


If he succeeds in blocking state-law prosecutions and freezing A.I. liability for a decade, the harms won’t stop when he leaves office. They will metastasize.

Without state enforcement, A.I. companies will face no meaningful accountability for:

  • child suicide induced by unregulated synthetic content
  • mass copyright theft embedded into permanent model weights
  • biometric and voiceprint extraction without consent
  • data-center sprawl that overwhelms local water, energy, and zoning systems
  • surveillance architectures exported globally
  • algorithmic harms that cannot be litigated under preempted state laws

These harms don’t sunset when an administration ends. They calcify. It must also be said that Sacks could face state securities-law liability — including fraud, undisclosed self-dealing, and market-manipulative conflicts tied to his A.I. portfolio — because state blue-sky statutes impose duties that can be stricter than federal law. The A.I. moratorium’s preemption would vaporize these claims, shielding exactly the conduct state regulators are best positioned to police. No wonder he’s so committed to sneaking it into federal law.

The moratorium Sacks is pushing would prevent states from acting at the very moment when they are the only entities with the political will and proximity to regulate A.I. on the ground. If he succeeds, the damage will last long after Sacks has left his government role — long after his podcast fades, long after his investment portfolio exits, long after any legal consequences he might face.

The public will be living inside the system he designed.

There is one final point the public needs to understand. David Sacks is not an anomaly. Sacks is to Trump what Eric Schmidt was to Biden: the industry’s designated emissary, embedded inside the White House to shape federal technology policy from the inside out. Swap the party labels and the personnel change, but the structural function remains the same. Remember, Schmidt bragged about writing the Biden AI executive order.

“Of all the CEOs Google interviewed, Eric Schmidt was the only one that had been to Burning Man, which was a major plus.”

So don’t think that if Sacks is pushed out, investigated, discredited, or even prosecuted one day — if he is — that the problem disappears. You don’t eliminate regulatory capture by removing the latest avatar of it. The next administration will simply install a different billionaire with a different portfolio and the same incentives: protect industry, weaken oversight, preempt the states, and expand the commercial reach of the companies they came in with.

The danger is not David Sacks the individual. The danger is the revolving door that lets tech titans write national A.I. policy while holding the assets that benefit from it. As much as Trump complains of the “deep state,” he’s doing his best to create the deepest of deep states.

Until that underlying structure changes, it won’t matter whether it’s Sacks, Schmidt, Thiel, Musk, Palihapitiya, or the next “technocratic savior.”

The system will keep producing them — and the public will keep paying the price. For as Sophocles taught us, it is not in our power to escape the curse.

It’s Back: The National Defense Authorization Act Is No Place for a Backroom AI Moratorium

David Sacks Is Bringing Back the AI Moratorium

WHAT’S AT STAKE

The moratorium would block states from enforcing their own laws on AI accountability, deepfakes, consumer protection, energy policy, discrimination, and data rights. Tennessee’s ELVIS Act is a prime example. For ten years — or five years in the “softened” version — the federal government would force states to stand down while some of the richest and most powerful monopolies in commercial history continue deploying models trained on unlicensed works, scraped data, personal information, and everything in between. Regardless of whether it is ten years or five years, either may as well be an eternity in Tech World. Particularly since they don’t plan on following the law anyway with their “move fast and skip things” mentality.

Ted Turns Texas Glowing

99-1/2 just won’t do—Remember the AI moratorium that was defeated 99-1 in the Senate during the heady days of the One Big Beautiful Bill Act? We said it would come back in the must-pass National Defense Authorization Act, and sure enough that’s exactly where it is, courtesy of Senator and 2028 Presidential hopeful Ted Cruz (no doubt fundraising off of the Moratorium for his “Make Texas California Again” campaign) and other Big Tech sycophants, according to a number of sources including Politico and the Tech Policy Press:

It…remains to be seen when exactly the moratorium issue may be taken up, though a final decision could still be a few weeks away.

Congressional leaders may either look to include the moratorium language in their initial NDAA agreement, set to be struck soon between the two chambers, or take it up as a separate amendment when it hits the floor in the House and Senate next month.

Either way, they likely will need to craft a version narrow enough to overcome the significant opposition to its initial iterations. While House lawmakers are typically able to advance measures with a simple majority or party-line vote, in the Senate, most bills require 60 votes to pass, meaning lawmakers must secure bipartisan support.

The pushback from Democrats is already underway. Sen. Brian Schatz (D-HI), an influential figure in tech policy debates and a member of the Senate Commerce Committee, called the provision “a poison pill” in a social media post late Monday, adding, “we will block it.”

Still, the effort has the support of several top congressional Republicans, who have repeatedly expressed their desire to try again to tuck the bill into the next available legislative package.

In Washington, must-pass bills invite mischief. And right now, House leadership is flirting with the worst kind: slipping a sweeping federal moratorium on state AI laws into the National Defense Authorization Act (NDAA).

This idea was buried once already — the Senate voted 99–1 to strike it from Trump’s earlier “One Big Beautiful Bill.” But instead of accepting that outcome, Big Tech is trying to resurrect it quietly, through a bill that is supposed to fund national defense, not rewrite America’s entire AI legal structure.

The NDAA is the wrong vehicle, the wrong process, and the wrong moment to hand Big Tech blanket immunity from state oversight. As we discussed many times the first time around, the concept is probably unconstitutional for a host of reasons and will no doubt be challenged immediately.

AI Moratorium Lobbying Explainer for Your Electric Bill

Here are the key shilleries pushing the federal AI moratorium and their backers:

INCOMPAS / AI Competition Center (AICC)
  • Supporters/Funders: Amazon, Google, Meta, Microsoft, telecom/cloud companies
  • Role: Leads the push for 10-year state-law preemption; argues the moratorium prevents a “patchwork” of laws
  • Notes: Identified as the central industry driver

Consumer Technology Association (CTA)
  • Supporters/Funders: Big Tech, electronics & platform-economy firms
  • Role: Lobbying for federal preemption; opposed aggressive state AI laws
  • Notes: High influence with Commerce/Appropriations staff

American Edge Project
  • Supporters/Funders: Meta-backed advocacy organization
  • Role: Frames preemption as necessary for U.S. competitiveness vs. China; backed the moratorium
  • Notes: Used as an indirect political vehicle for Meta

Abundance Institute
  • Supporters/Funders: Tech investors, deregulatory donors
  • Role: Argues the moratorium is necessary for innovation; publicly predicts its return
  • Notes: Messaging aligns with Silicon Valley VCs

R Street Institute
  • Supporters/Funders: Market-oriented donors; tech-aligned funders
  • Role: Originated the “learning period” moratorium concept in 2024 papers by Adam Thierer
  • Notes: Not a lobby shop but provides the intellectual framework

Corporate Lobbyists (Amazon/Google/Microsoft/Meta/OpenAI/etc.)
  • Supporters/Funders: Internal lobbying shops + outside firms
  • Role: Promote “uniform national standards” in Congressional meetings
  • Notes: Operate through and alongside trade groups

PARASITES GROW IN THE DARK: WHY THE NDAA IS THE ABSOLUTE WRONG PLACE FOR THIS

The National Defense Authorization Act is one of the few bills that must pass every year. That makes it a magnet for unrelated policy riders — but it doesn’t make those riders legitimate.

An AI policy that touches free speech, energy policy and electricity rates, civil rights, state sovereignty, copyright, election integrity, and consumer safety deserves open hearings, transparent markups, expert testimony, and a real public debate. And that’s the last thing the Big Tech shills want.

THE TIMING COULD NOT BE MORE INSULTING

Big Tech is simultaneously lobbying for massive federal subsidies for compute, federal preemption of state AI rules, and multi-billion-dollar 765-kV transmission corridors to feed their exploding data-center footprints.

And who pays for those high-voltage lines? Ratepayers do. Utilities that qualify as political subdivisions in the language of the moratorium—such as municipal utilities, public power districts, and cooperative systems—set rates through their governing boards rather than state regulators. These boards must recover the full cost of service, including new infrastructure needed to meet rising demand. Under the moratorium’s carve-outs, these entities could be required to accept massive AI-driven load increases, even when those loads trigger expensive upgrades. Because cost-of-service rules forbid charging AI labs above their allocated share, the utility may have no choice but to spread those costs across all ratepayers. Residents, not the AI companies, would absorb the rate hikes.
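To make the cost-shifting mechanics concrete, here is a minimal sketch with invented numbers. The load figures, upgrade cost, and the simple load-ratio allocation are all assumptions for illustration; real rate cases allocate costs in far more complicated ways. But the arithmetic shows the basic point: when an upgrade triggered entirely by new data-center load is allocated by average usage share, most of the bill lands on everyone else.

```python
# Hypothetical sketch: how a load-ratio cost allocation can shift a
# data-center-driven grid upgrade onto residential ratepayers.
# All figures below are invented for illustration only.

upgrade_cost = 30_000_000        # $/yr, annualized cost of new transmission
residential_kwh = 1_000_000_000  # existing residential/commercial load (kWh/yr)
datacenter_kwh = 300_000_000     # new data-center load that triggered the upgrade

total_kwh = residential_kwh + datacenter_kwh

# Under a simple load-ratio allocation, each class pays in proportion to
# its usage share, even though the upgrade was caused by the new load.
dc_share = upgrade_cost * datacenter_kwh / total_kwh
res_share = upgrade_cost * residential_kwh / total_kwh

# Rate impact on everyone else, in cents per kWh
rate_increase = res_share / residential_kwh * 100

print(f"Data center pays ${dc_share:,.0f}; other ratepayers pay ${res_share:,.0f}")
print(f"Residential rate impact: +{rate_increase:.2f} cents/kWh")
```

In this invented scenario, the data center covers only its 23% usage share of the upgrade, and the remaining 77% of a cost it caused is spread across everyone else’s bills.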

States must retain the power to protect their citizens. Congress has every right to legislate on AI. But it does not have the right to erase state authority in secret to save Big Tech from public accountability.

A CALL TO ACTION

Tell your Members of Congress:
No AI moratorium in the NDAA.
No backroom preemption.
No Big Tech giveaways in the defense budget.

Say Goodbye to Net Zero: The AI Data Center Corporate Welfare Scam

We’ve reported for years about how data centers are a good explanation for why Senators like Ron Wyden always seem to inject themselves into copyright legislation for the sole purpose of slowing it down, killing it, watering it down, or turning it on its head. Why would a senator from Oregon–a state that gave us Courtney Love, Esperanza Spalding, The Decemberists, Sleater-Kinney and the Dandy Warhols–be such an incredibly basic, no-vibe cosplayer?

Easy answer–he does the bidding of the Big Tech data center operators sucking down that good taxpayer subsidized Oregon hydroelectric power–literally and figuratively. Big Tech loves them some weak copyright and expanded loopholes that let them get away with some hard core damage to artists. Almost as much as they love flexing political muscle.

Senator Wyden with his hand in his own pocket.

This is coming up again in the various public comments on artificial intelligence, which is the data hog of data hogs. For example, the Artist Rights Institute made this point using Oregon as an example in the recent UK Intellectual Property Office call for public comments that produced a huge push back on the plans of UK Prime Minister Sir Keir Starmer to turn Britain into a Google lake for AI, especially the build out of AI data centers.

Google Data Center at The Dalles, Oregon

The thrust of the Oregon discussion in the ARI comment is that Oregon’s experience with data centers should be food for thought in other places (like the UK) as what seems to happen is electricity prices for local rate payers increase while data centers have negotiated taxpayer subsidized discounts. Yes, that old corporate welfare strikes again.

Oregon Taxpayers’ Experience with Crowding Out by Data Centres is a Cautionary Tale for UK

We call the IPO’s attention to the real-world example of the U.S. State of Oregon, a state that is roughly the geographical size of the UK.  Google built the first Oregon data centre in The Dalles, Oregon in 2006.  Oregon now has 125 of the very data centres that Big Tech will necessarily need to build in the UK to implement AI.  In other words, Oregon was sold much the same story that Big Tech is selling you today.

The rapid growth of Oregon data centres, driven by the same tech giants (Amazon, Apple, Google, Oracle, and Meta), has significantly increased Oregon’s demand for electricity. This surge in demand has led to higher power costs, which are often passed on to local rate payers while data centre owners receive tax benefits.  This increase in price foreshadows the market effect of crowding out local rate payers in the rush for electricity to run AI—demand will only increase and increase substantially as we enter what the International Energy Agency has called “the age of electricity”.[1]

Portland General Electric, a local power operator, has faced increasing criticism for raising rates to accommodate the encroaching electrical power needs of these data centers. Local residents argue that they unfairly bear the increased electrical costs while data centers benefit from tax incentives and other advantages granted by government.[2] 

This is particularly galling in that the hydroelectric power in Oregon is largely produced by massive taxpayer-funded hydroelectric and other power projects built long ago.[3]  The relatively recent 125 Oregon data centres received significant tax incentives during their construction to be offset by a promise of future jobs.  While there were new temporary jobs created during the construction phase of the data centres, there are relatively few permanent jobs required to operate them long term as one would expect from digitized assets owned by AI platforms.

Of course, the UK has approximately 16 times the population of Oregon.  Given this disparity, it seems plausible that whatever problems Oregon has with the concentration of data centres, the UK will have those same problems many times over due to the concentration of populations.


[1] International Energy Agency, Electricity 2025 (Revised Edition Feb. 2025) available at https://iea.blob.core.windows.net/assets/0f028d5f-26b1-47ca-ad2a-5ca3103d070a/Electricity2025.pdf.

 [2] Jamie Parfitt, As PGE closes in on another rate increase, are the costs of growing demand for power being borne fairly? KGW8 News (Dec. 13, 2024) available at https://www.kgw.com/article/news/local/the-story/pge-rate-increase-data-centers-power-cost-demand-growth/283-399b079b-cbf5-41cf-8190-4c5f204d2d90 (“Like utilities nationwide, PGE is experiencing a surge in requests for new, substantial amounts of electricity load, including from advanced manufacturing, data centers and AI-related companies.”) 

[3] See, e.g., State of Oregon, Facilities Under Energy Facility Siting Council available at https://www.oregon.gov/energy/facilities-safety/facilities/Pages/Facilities-Under-EFSC.aspx

Will AI Produce the Oregon Effect Internationally?

So let’s look at a quick and dirty comparison of the prices that local residents and businesses pay for electricity compared to what data centers in the same states pay. We’re posting this chart because y’all love numbers, but mostly to start discussion and research into just how much of an impact all these data centers might have on supply-and-demand price setting in a few representative states and countries. But remember this–our experience with Senator Wyden should tell you that all these data centers will give Big Tech even more political clout than they already have.

The chart shows the percentage difference between the residential rate and the data center rate for energy in each state measured. The percentage difference is calculated as: ((Residential Rate – Data Center Rate) ÷ Residential Rate) × 100. When we say “~X% lower” we mean that the data center price per kilowatt hour (¢/kWh) is approximately X% lower than the residential rate, all based on data from Choose Energy or Electricity Plans. We don’t pretend to be energy analysts, so if we got this wrong, someone will let us know.
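The chart’s formula can be written out in a few lines. The sample rates below are made up for illustration; the real figures come from the Choose Energy and Electricity Plans data mentioned above.

```python
# The chart's percentage-difference formula, as defined in the text:
# ((Residential Rate - Data Center Rate) / Residential Rate) x 100

def pct_lower(residential_rate: float, datacenter_rate: float) -> float:
    """How much lower, in percent, the data-center price per kWh is
    than the residential rate in the same state."""
    return (residential_rate - datacenter_rate) / residential_rate * 100

# Illustrative (invented) rates in cents/kWh:
diff = pct_lower(13.0, 7.8)
print(f"Data center rate is ~{diff:.0f}% lower than the residential rate")
```

So when the chart says “~40% lower,” it means the data center is paying roughly 40% less per kilowatt hour than households in the same state.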

On a country by country comparison, here’s some more food for thought:

Data is from (1) here and (2) here

As you can see, most of the G7 countries have significantly higher electricity prices (and therefore potentially higher data center costs) than the US and Canada. This suggests that Big Tech data centers will produce the Oregon Effect in those countries with higher residential energy costs in a pre-AI world. That in turn suggests that Big Tech is going to be coming around with their tin cup for corporate welfare to keep their data center electric bills low, or maybe they’ll just buy the electric plants. For themselves.

Either way, it’s unlikely that this data center thumb on the scale and the corporate welfare that goes with it will cause energy prices to decline. And you can just forget that whole Net Zero thing.

If you don’t like where this is going, call your elected representative!

Search is Ripping Us Off Again, this time for AI

By Chris Castle

In case you think that infringement of copyrights in AI training happens only to rock stars, photographers, record companies or movie studios, let me give you a personal anecdote that might cause you to reconsider. I asked Bing’s Copilot AI tool to list the deal points in a record producer agreement, and this is what I got back:

Copilot

Certainly! When negotiating a producer agreement, several key points should be considered. Here are some common negotiation points from the artist’s perspective:

  1. Scope of Engagement:
    1. The producer is generally engaged for one album project at a time.
    2. Clarify the scope of work, including the number of songs or recordings to be produced.
  2. Budget and Compensation:
    1. Determine the budget for the project (e.g., $10,000 in your case).
    2. Agree on the producer’s compensation, which may include an upfront fee, points on sales, or a combination.
  3. Ownership and Rights:
    1. Address ownership of the master recordings:
      1. Will the artist own and control the masters?
      2. Does the producer retain any rights?
  4. Credits and Royalties:
    1. Specify how the producer will be credited on releases.
    2. Discuss any royalties or backend points for the producer.
  5. Recording Schedule:
    1. Set a timeline for recording sessions.
    2. Define the two-week period for your project.
  6. Logistics:
    1. Determine where the recordings will take place.
    2. Outline the artist’s approval process.
    3. Define the producer’s deliverables.
  7. Termination Clause:
    1. Include provisions for termination if either party breaches the agreement.
    2. Address payment for work completed up to that point.

Remember that each negotiation is unique, and it’s essential to tailor the agreement to your specific needs and circumstances. Legal advice is recommended to ensure compliance with local laws and industry standards. [1][3][4]

Source(s)

1. An Artist’s Guide to Negotiating Record Producer Agreements – Medium

2. Negotiating a Music Producer Agreement: 7 Key Issues

3. 5 Things to Look for in a Music Producer Agreement – Lawyer Drummer

4. How to Negotiate: Steps, Phrases & Tips to Win the Deal – Selling Signals

Notice that Copilot lists “Source(s)” and that source #1 is an article from the Medium website.

My article.

I’ve written a number of articles about negotiating producer agreements and I can categorically say I haven’t given permission or been asked to give permission for any of those articles to be scraped and used to train AI. So how did it come to be in Copilot’s training materials? 

My guess is that Medium (and probably a lot of other sites) were crawled by Bing’s spider and that copies of Medium pages were stored in Bing’s (or Microsoft’s) data centers. When Microsoft launched its Copilot AI tool, one could imagine that the temptation would have been strong to use those copies as part of the training tools for Copilot. In fact, the Verge reports that “companies around the web are using your site and its data to build massive sets of training data, in order to build models and products that may not acknowledge your existence at all.” Can you say Section 230 or DMCA? Maybe there will be a question at the April 11 House Energy & Commerce hearing on Section 230.

So Google-style “innovation” has a self-destructive flavor consistent with their oikophobic capitalism. Greater Silicon Valley’s view of culture in general and copyright in particular is as sources of wealth extracted by destruction–you know, creative destruction, permissionless innovation, etc. (It’s no wonder Google was inexorably attracted to defense contracting despite all the “don’t be evil” hoorah. After all, what creates massive wealth faster than convincing governments to pay big money to blow things up that must be replaced by ever more big money to blow even more things up?)

Are you surprised then that two of the biggest operators in the AI space are the search engine operators Google and Microsoft? This is another example of how Big Tech helps itself to your data and work product without you even knowing it’s happening. So now what? I now know I’m being ripped off, and I’m wondering if Medium is in on it.

The Verge tells us:

The ability to download, store, organize, and query the modern internet gives any company or developer something like the world’s accumulated knowledge to work with. In the last year or so, the rise of AI products like ChatGPT, and the large language models underlying them, have made high-quality training data one of the internet’s most valuable commodities. That has caused internet providers of all sorts to reconsider the value of the data on their servers, and rethink who gets access to what. 

Ya think?