@ArtistRights Newsletter 4/14/25

The Artist Rights Watch podcast returns for another season! This week’s episode, “AI Legislation, A View from Europe,” features Helienne Lindvall, President of the European Composer and Songwriter Alliance (ECSA), in conversation with ARI Director Chris Castle about current issues for creators under the EU AI Act and the UK text and data mining legislation. Download it here or subscribe wherever you get your podcasts.

New Survey for Songwriters: We are surveying songwriters about whether they want to form a certified union. Please fill out our short, confidential SurveyMonkey survey here! Thanks!

AI Litigation: Kadrey v. Meta

Law Professors Reject Meta’s Fair Use Defense in Friend of the Court Brief

Ticketing
Viagogo failing to prevent potentially unlawful practices, listings on resale site suggest that scalpers are speculatively selling tickets they do not yet have (Rob Davies/The Guardian)

ALEC Astroturf Ticketing Bill Surfaces in North Carolina Legislation

ALEC Ticketing Bill Surfaces in Texas to Rip Off Texas Artists (Chris Castle/MusicTechPolicy)

International AI Legislation

Brazil’s AI Act: A New Era of AI Regulation (Daniela Atanasovska and Lejla Robeli/GDPR Local)

Why robots.txt won’t get it done for AI Opt Outs (Chris Castle/MusicTechPolicy)

Feature Translation: How has the West’s misjudgment of China’s AI ecosystem distorted the global technology competition landscape? (Jeffrey Ding/ChinAI)

Unethical AI Training Harms Creators and Society, Argues AI Pioneer (Ed Nawotka/Publishers Weekly) 

AI Ethics

Céline Dion Calls Out AI-Generated Music Claiming to Feature the Iconic Singer Without Her Permission (Marina Watts/People)

Splice CEO Discusses Ethical Boundaries of AI in Music​ (Nilay Patel/The Verge)

Spotify’s Bold AI Gamble Could Disrupt The Entire Music Industry (Bernard Marr/Forbes)

Books

Apple in China: The Capture of the World’s Greatest Company by Patrick McGee (Coming May 13)

PRESS RELEASE: @Human_Artistry Campaign Endorses NO FAKES Act to Protect Personhood from AI

For Immediate Release

HUMAN ARTISTRY CAMPAIGN ENDORSES NO FAKES ACT

Bipartisan Bill Reintroduced by Senators Blackburn, Coons, Tillis, & Klobuchar and Representatives Salazar, Dean, Moran, Balint and Colleagues

Creates New Federal Right for Use of Voice and Visual Likeness
in Digital Replicas

Empowers Artists, Voice Actors, and Individual Victims to Fight Back Against
AI Deepfakes and Voice Clones

WASHINGTON, DC (April 9, 2025) – Amid global debate over guardrails needed for AI, the Human Artistry Campaign today announced its support for the reintroduced “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2025” (“NO FAKES Act”) – landmark legislation giving every person an enforceable new federal intellectual property right in their image and voice. 

Building off the original NO FAKES legislation introduced last Congress, the updated bill was reintroduced today by Senators Marsha Blackburn (R-TN), Chris Coons (D-DE), Thom Tillis (R-NC), and Amy Klobuchar (D-MN), alongside Representatives María Elvira Salazar (R-FL-27), Madeleine Dean (D-PA-4), Nathaniel Moran (R-TX-1), and Becca Balint (D-VT-At Large) and bipartisan colleagues.

The legislation sets a strong federal baseline protecting all Americans from invasive AI-generated deepfakes flooding digital platforms today. From young students bullied by non-consensual sexually explicit deepfakes to families scammed by voice clones to recording artists and performers replicated to sing or perform in ways they never did, the NO FAKES Act provides powerful remedies requiring platforms to quickly take down unconsented deepfakes and voice clones and allowing rightsholders to seek damages from creators and distributors of AI models designed specifically to create harmful digital replicas.

The legislation’s thoughtful, measured approach preserves existing state causes of action and rights of publicity, including Tennessee’s groundbreaking ELVIS Act. It also contains carefully calibrated exceptions to protect free speech, open discourse and creative storytelling – without trampling the underlying need for real, enforceable protection against the vast range of invasive and harmful deepfakes and voice clones.

Human Artistry Campaign Senior Advisor Dr. Moiya McTier released the following statement in support of the legislation:

“The Human Artistry Campaign stands for preserving essential qualities of all individuals – beginning with a right to their own voice and image. The NO FAKES Act is an important step towards necessary protections that also support free speech and AI development. The Human Artistry Campaign commends Senators Blackburn, Coons, Tillis, and Klobuchar and Representatives Salazar, Dean, Moran, Balint, and their colleagues for shepherding bipartisan support for this landmark legislation, a necessity for every American to have a right to their own identity as highly realistic voice clones and deepfakes become more pervasive.”

Dr. Moiya McTier, Human Artistry Campaign Senior Advisor

By establishing clear rules for the new federal voice and image right, the NO FAKES Act will power innovation and responsible, pro-human uses of powerful AI technologies while providing strong protections for artists, minors and others. This important bill has cross-sector support from Human Artistry Campaign members and companies such as OpenAI, Google, Amazon, Adobe and IBM. The NO FAKES Act is a strong step forward for American leadership that erects clear guardrails for AI and real accountability for those who reject the path of responsibility and consent.

Learn more & let your representatives know Congress should pass the NO FAKES Act here.

# # #

ABOUT THE HUMAN ARTISTRY CAMPAIGN: The Human Artistry Campaign is the global initiative for the advancement of responsible AI – working to ensure it develops in ways that strengthen the creative ecosystem, while also respecting and furthering the indispensable value of human artistry to culture. Across 34 countries, more than 180 organizations have united to protect every form of human expression and creative endeavor they represent – journalists, recording artists, photographers, actors, songwriters, composers, publishers, independent record labels, athletes and more. The growing coalition champions seven core principles for keeping human creativity at the center of technological innovation. For further information, please visit humanartistrycampaign.com

@human_artistry Calls Out AI Voice Cloning

Here’s just one reason why we can’t trust Big Tech with opt-outs (or really any other safeguard that stops them from doing what they want to do)

@ArtistRights Institute’s UK Government Comment on AI and Copyright: Why Can’t Creators Call 911?

We will be posting excerpts from the Artist Rights Institute’s comment in the UK Intellectual Property Office’s proceeding on AI and copyright. That proceeding is called a “consultation,” in which the Office solicits comments from the public (wherever located) on a proposed policy.

In this case, the UK government proposed to require creators to “opt out” of AI data scraping by expanding the UK’s “text and data mining” law, which is what Silicon Valley wants in a big way. The proposal produced an enormous backlash from the creative community that we’ll be covering in the coming weeks; it’s important that Trichordist readers be up to speed on the latest skulduggery by Big Tech in snarfing down all the world’s culture to train their AI (which has already happened and now has to be undone). For a backgrounder on the “text and data mining” controversy, watch this video of George York of the Digital Creators Coalition speaking at the Artist Rights Institute in DC.

In this section of the comment we offer a simple rule of thumb or policy guideline by which to measure the Government’s rules (which could equally apply in America): Can an artist file a criminal complaint against someone like Sam Altman?

If an artist is more likely to be able to get the police to stop their car from being stolen off the street than to get the police to stop the artist’s life’s work from being stolen online by a heavily capitalized AI platform, the policy will fail.

Why Can’t Creators Call 999 [or 911]?

We suggest a very simple policy guideline—if an artist is more likely to be able to get the police to stop their car from being stolen off the street than to get the police to stop the artist’s life’s work from being stolen online by a heavily capitalized AI platform, the policy will fail.  Alternatively, if an artist can call the police and file a criminal complaint against a Sam Altman or a Sergey Brin for criminal copyright infringement, now we are getting somewhere.

This requires that there be a clear “red light/green light” instruction that can easily be understood and applied by a beat copper.  This may seem harsh, but in our experience with the trillion-dollar market cap club, the only thing that gets their attention is a legal action that affects behavior rather than damages.  Our experience suggests that what gets their attention most quickly is either an injunction to stop the madness or prison to punish the wrongdoing. 

As a threshold matter, it is clear that AI platforms intend to continue scraping all the world’s culture for their purposes without obtaining consent or notifying rightsholders.  It is likely that the bigger platforms already have.  For example, we have found our own writings included in Copilot outputs.  Not only did we not consent to that use, but we were also never asked.  Moreover, Copilot’s use of these works clearly violates our terms of service.  This level of content scraping is hardly what was contemplated with the “data mining” exceptions.
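To see why after-the-fact “opt outs” are such a weak safeguard, consider the mechanism platforms usually point to: the voluntary robots.txt convention. Below is an illustrative sketch of our own (not language from the ARI comment); GPTBot and CCBot are real crawler user-agent tokens published by OpenAI and Common Crawl, respectively:

```
# robots.txt — a site asking AI crawlers not to scrape its pages.
# Compliance is entirely voluntary: nothing here stops a crawler
# that chooses to ignore it, and it does nothing about copies
# already scraped and baked into a trained model.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Ordinary search crawlers remain welcome.
User-agent: *
Allow: /
```

In other words, the burden falls on every site owner to discover each new crawler’s token and hope the crawler honors the request, which is the opposite of consent obtained in advance.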

@human_artistry Press Release: Senators Introduce COPIED Act to Combat AI Deepfakes

Senators Cantwell, Blackburn, and Heinrich introduce the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), Giving Artists New Tools to Protect Against Deepfakes

“Deepfakes pose an existential threat to our culture and society, making it hard to believe what we see and hear and leaving individual creators vulnerable as tech companies use our art without consent while AI-generated content leads to confusion about what is real. Requiring transparency is a meaningful step that will help protect us all – ensuring that nonconsensual, harmful content can be removed quickly and providing a clear origin when our life’s work has been used.”

– Dr. Moiya McTier, Human Artistry Campaign Senior Advisor

With widespread creative community support from organizations including the Artist Rights Alliance, SAG-AFTRA, the Recording Academy, RIAA, NMPA, NSAI, and more, the bill would set new federal transparency guidelines for marking, authenticating and detecting AI-generated content; protect journalists, actors and artists against AI-driven theft; and hold violators accountable for abuses.

Creates Transparency Standards: Requires the National Institute of Standards and Technology (NIST) to develop guidelines and standards for content provenance information, watermarking and synthetic content detection. These standards will promote transparency to identify if content has been generated or manipulated by AI, as well as where AI content originated. The bill also directs NIST to develop cybersecurity measures to prevent tampering with provenance and watermarking on AI content. 

Puts Journalists, Artists and Musicians in Control of Their Content: Requires providers of AI tools used to generate creative or journalistic content to allow owners of that content to attach provenance information to it and prohibits its removal. The bill prohibits the unauthorized use of content with provenance information to train AI models or generate AI content. These measures give content owners—journalists, newspapers, artists, songwriters, and others—the ability to protect their work and set the terms of use for their content, including compensation. 

Gives Individuals a Right to Sue Violators: Authorizes the Federal Trade Commission (FTC) and state attorneys general to enforce the bill’s requirements.  It also gives newspapers, broadcasters, artists, and other content owners the right to bring suit in court against platforms or others who use their content without permission.

Prohibits Tampering with or Disabling AI Provenance Information: Currently, there is no law that prohibits removing, disabling, or tampering with content provenance information. The bill prohibits anyone, including internet platforms, search engines and social media companies, from interfering with content provenance information in these ways.  

Search is Ripping Us Off Again, this time for AI

By Chris Castle

In case you think that infringement of copyrights in AI training happens only to rock stars, photographers, record companies or movie studios, let me give you a personal anecdote that might cause you to reconsider. I asked Bing’s Copilot AI tool to list the deal points in a record producer agreement, and this is what I got back:

Copilot

Certainly! When negotiating a producer agreement, several key points should be considered. Here are some common negotiation points from the artist’s perspective:

  1. Scope of Engagement:
    1. The producer is generally engaged for one album project at a time.
    2. Clarify the scope of work, including the number of songs or recordings to be produced.
  2. Budget and Compensation:
    1. Determine the budget for the project (e.g., $10,000 in your case).
    2. Agree on the producer’s compensation, which may include an upfront fee, points on sales, or a combination.
  3. Ownership and Rights:
    1. Address ownership of the master recordings:
      1. Will the artist own and control the masters?
      2. Does the producer retain any rights?
  4. Credits and Royalties:
    1. Specify how the producer will be credited on releases.
    2. Discuss any royalties or backend points for the producer.
  5. Recording Schedule:
    1. Set a timeline for recording sessions.
    2. Define the two-week period for your project.
  6. Logistics:
    1. Determine where the recordings will take place.
    2. Outline the artist’s approval process.
    3. Define the producer’s deliverables.
  7. Termination Clause:
    1. Include provisions for termination if either party breaches the agreement.
    2. Address payment for work completed up to that point.

Remember that each negotiation is unique, and it’s essential to tailor the agreement to your specific needs and circumstances. Legal advice is recommended to ensure compliance with local laws and industry standards [1][3][4].

Source(s)

1. An Artist’s Guide to Negotiating Record Producer Agreements – Medium

2. Negotiating a Music Producer Agreement: 7 Key Issues

3. 5 Things to Look for in a Music Producer Agreement – Lawyer Drummer

4. How to Negotiate: Steps, Phrases & Tips to Win the Deal – Selling Signals

Notice that Copilot lists “Source(s)” and that source #1 is an article from the Medium website.

My article.

I’ve written a number of articles about negotiating producer agreements and I can categorically say I haven’t given permission or been asked to give permission for any of those articles to be scraped and used to train AI. So how did it come to be in Copilot’s training materials? 

My guess is that Medium (and probably a lot of other sites) were crawled by Bing’s spider and that copies of Medium pages were stored in Bing’s (or Microsoft’s) data centers. When Microsoft launched its Copilot AI tool, one could imagine that the temptation would have been strong to use those copies as part of the training tools for Copilot. In fact, the Verge reports that “companies around the web are using your site and its data to build massive sets of training data, in order to build models and products that may not acknowledge your existence at all.” Can you say Section 230 or DMCA? Maybe there will be a question at the April 11 House Energy & Commerce hearing on Section 230.

So Google-style “innovation” has a self-destructive flavor consistent with their oikophobic capitalism. Greater Silicon Valley treats culture in general and copyright in particular as sources of wealth to be extracted through destruction–you know, creative destruction, permissionless innovation, etc. (It’s no wonder Google was inexorably attracted to defense contracting despite all the “don’t be evil” hoorah. After all, what creates massive wealth faster than convincing governments to pay big money to blow things up that must then be replaced by ever more big money to blow even more things up?)

Are you surprised then that two of the biggest operators in the AI space are the search engine operators Google and Microsoft? This is another example of how Big Tech helps itself to your data and work product without you even knowing it’s happening. So now what? I now know I’m being ripped off, and I’m wondering if Medium is in on it.

The Verge tells us:

The ability to download, store, organize, and query the modern internet gives any company or developer something like the world’s accumulated knowledge to work with. In the last year or so, the rise of AI products like ChatGPT, and the large language models underlying them, have made high-quality training data one of the internet’s most valuable commodities. That has caused internet providers of all sorts to reconsider the value of the data on their servers, and rethink who gets access to what. 

Ya think?

What would Lars say? Artificial Intelligence: Nobel or RICO?

All the true promise of AI can be realized without violating the copyrights and rights of publicity of writers, artists, photographers, voice actors, and others. You know, stuff like reading MRIs and X-rays, developing pharmaceuticals, advanced compounds, new industrial processes, etc.

All the shitty aspects of AI DO require intentional mass copyright infringement (a RICO predicate, BTW). You know, stuff like bots, deepfakes, autogenerated “yoga mat” music, SEO manipulation, autogenerated sports coverage, commercial chatbots, fake student papers, graphic-artist knockoffs, robot voice actors, etc. But that’s where the no-value-add-parasitic-free-rider-easy-money is to be made. That’s why the parasitic free-riding VCs and private equity want a “fair use” copyright exemption.

Policy makers should understand that if they want to reduce the potential harms of AI, they need to protect and reinforce the intellectual property rights of individuals. It is a natural (and already existing) brake on harmful AI. What we don’t need is legislative intervention that makes it easier to infringe IP rights and then tries to mitigate the (easily predictable and obvious) harms with additional regulation.

This is what happened with Napster and internet 1.0. The DMCA copyright infringement safe harbor for platforms unleashed all sorts of negative externalities that were never fairly mitigated by subsequent regulation.

Why do songwriters get $0.0009 a stream on streaming platforms? Because the platforms used the threat of the DMCA copyright safe harbor by “bad actors” (often connected to the “good actors” via shared board members and investors*) to create a market failure that destroyed the value of songs. To “fix” the problem, federal legislation tasked the Copyright Royalty Board at the Library of Congress with setting royalty rates and forced songwriters to license to the digital platforms (songwriters cannot opt out). The rate-setting process was inevitably captured by the tech companies, and that’s how you end up with $0.0009 per stream.
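To see what that rate means in practice, here is a back-of-the-envelope sketch in Python. The $0.0009 figure is the per-stream rate cited above; actual statutory rates vary by service, rate period, and how the payout is split among co-writers and publishers, so treat this as an illustration, not an accounting:

```python
# Back-of-the-envelope songwriter streaming income at ~$0.0009/stream.
# The rate is the figure cited above; real payouts vary by service,
# rate period, and publishing splits.
RATE_PER_STREAM = 0.0009  # USD per stream

def songwriter_earnings(streams: int, writers: int = 1) -> float:
    """Gross payout at the cited rate, split equally among co-writers."""
    return streams * RATE_PER_STREAM / writers

# A million streams for a sole writer comes to roughly $900 before
# any publisher or co-writer splits:
print(round(songwriter_earnings(1_000_000), 2))
# Three co-writers on the same song each see about a third of that:
print(round(songwriter_earnings(1_000_000, writers=3), 2))
```

A million streams is a substantial hit for most working songwriters, and at this rate it grosses less than a month’s rent in most American cities.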

To be fair, the DMCA safe harbor requires the platforms to accommodate “standard technical measures” to prevent unlicensed use of copyrighted works, but this part of the safe harbor was never implemented and the federal government never bothered to enforce it. This is the Napster playbook all over again.

1. Unleash a technology that you know will be exploited by bad actors**.

2. Ask for federal intervention that essentially legalizes the infringing behavior.

3. The federal legislation effectively creates private monopoly or duopoly.

4. Trillions of dollars in wealth transferred from creators to a tiny cabal of no-value-add-parasitic-free-rider-easy-money VCs in Silicon Valley.

5. Lots of handwringing about the plight of creators.

6. Bullshit legislation that claims to help creators but actually mandates below-market rates for creators.

The funny thing is, Lars Ulrich was right about Napster. [See our 2012 post Lars Was First and Lars Was Right.] At the time he was vilified by what was in reality a coordinated campaign by a DC communications firm (working for Silicon Valley VCs) that masqueraded as a grassroots operation.

But go back and watch the Charlie Rose debate between Lars Ulrich and Chuck D: everything Lars Ulrich said was gonna happen, happened.

If Lars Ulrich hadn’t been cowed by a coordinated campaign by no-value-add-parasitic-free-rider-easy-money Silicon Valley VCs, he’d probably say the same thing about AI.

And he’d be right again.

@justinebateman wants to see AI contract terms in actor agreements and music folk should be checking, too

Chris Castle says: If you are seeing contract language that allows the other side to use your name, image, likeness, voice, etc., take a close look at the rights granted. It may not be that obvious. Like all other configuration changes in the past, artists, producers and songwriters need to look at their existing agreements and see how old language will be interpreted to cover AI.

Example: Label will have (i) exclusive record artwork merchandising rights; and (ii) exclusive rights to sell merchandise embodying three (3) exclusive designs per contract period, alone and in conjunction with Artist’s names and approved pictures, likenesses and other identifications, subject to Artist’s approval with respect to such matters as product design and manufacturing.

“in conjunction with”, “other identifications” and “approval with respect to such matters as” each take on new meaning.

Example: Artist acknowledges that Label is the exclusive owner of all rights of copyright in Masters and Records embodying the results and proceeds of Artist’s recording services made pursuant to the Recording Agreement or during its term, including the exclusive right to copyright same as “sound recordings” in the name of Label, to renew and extend such copyrights (and all rights in and thereto are hereby assigned to Label), and to exercise all rights of the copyright proprietor thereunder as provided in the Recording Agreement.

Recordings “made pursuant to the Recording Agreement or during its term” could mean AI works.

Example: As used in this agreement, “Other Entertainment Services” shall mean any and all entertainment industry activity that are not otherwise provided for in this recording agreement, including, without limitation, the following: (a) the exploitation in any and all media of the name(s) likeness(es), visual representations, biographical material and/or logo(s) of or relating to Artist or any member of Artist (all of the intellectual properties relating to Artist referred to above are sometimes referred to herein collectively and individually as “Artist Properties”), either alone or in conjunction with other elements, including without limitation merchandise for sale at the site(s) of any and all live concert engagements performed by Artist or any member of Artist, premiums such as products which bear a third party’s trademarks or logos together with Artist Properties, tie-ins, “bounceback” merchandising, and fan club merchandise, whether or not in connection with Master Recordings, including, without limitation, exploitation by any Person other than Label of any rights granted in this recording agreement; (b) endorsements, special marketing arrangements, sponsorships (including tour sponsorships), strategic partnerships or other business relationships with third parties; (c) live performance engagements as a musician, vocalist and/or performer by the Artist or any member of Artist in all media, including but not limited to musical performances on tour, in concerts, on television broadcast or cable casts (including pay-per-view telecasts), radio, “webcast” and all other means.