@SAGAFTRA and Major Labels Reach a New Sound Recording Agreement With AI Protections

Looks like both sides listened to and respected each other, so no strike was necessary. And yet there are groundbreaking first-time AI protections for musicians and vocalists. Amazing what can happen when you actually respect the workers (looking at you, YouTube). And guess what? The workers get to vote on the deal. Imagine if songwriters could do the same.

PRESS RELEASE

SAG-AFTRA and leading record labels reached a tentative multiyear agreement on a successor contract to the SAG-AFTRA National Code of Fair Practice for Sound Recordings.

Covering the period starting Jan. 1, 2021, and ending Dec. 31, 2026, the agreement includes Warner Music Group, Sony Music Entertainment, Universal Music Group and Disney Music Group.

The SAG-AFTRA Executive Committee unanimously approved the tentative agreement on Wednesday, April 10. It will now be sent to members for ratification.

SAG-AFTRA National Executive Director & Chief Negotiator Duncan Crabtree-Ireland said, “SAG-AFTRA and the music industry’s largest record labels have reached a groundbreaking agreement establishing, for the first time, collective bargaining guardrails assuring singers and recording artists ethical and responsible treatment in the use of artificial intelligence in the music industry. It is a testament to our mutual unwavering commitment to work together to safeguard the rights, dignity and creative freedom of our members.

“This agreement ensures that our members are protected. SAG-AFTRA stands firm in the belief that while technology can enhance the creative process, the essence of music must always be rooted in genuine human expression and experience. We look forward to working alongside our industry partners to foster an environment where innovation serves to elevate, not diminish, the unique value of each artist’s contribution to our rich cultural tapestry.”

The Record Label Negotiating Committee said, “We are pleased to reach this agreement with SAG-AFTRA and continue our strong partnership as we enter this exciting and fast-moving new era for music and artists. Together, we’ll chart a successful course forward, embracing new opportunities and facing our common challenges, strengthened by our shared values and commitment to human artistry.”

The artificial intelligence guardrails take effect immediately upon ratification. Under this agreement, the terms “artist,” “singer,” and “royalty artist” include only humans. Clear and conspicuous consent, along with minimum compensation requirements and specific details of intended use, is required prior to the release of a sound recording that uses a digital replication of an artist’s voice.

Additional contract gains include increased minimums, health and retirement improvements, and an increase in the percentage of streaming revenue covered by contributions.

Search is Ripping Us Off Again, this time for AI

By Chris Castle

In case you think that infringement of copyrights in AI training happens only to rock stars, photographers, record companies or movie studios, let me give you a personal anecdote that might cause you to reconsider. I asked Bing’s Copilot AI tool to list the deal points in a record producer agreement, and this is what I got back:

Copilot

Certainly! When negotiating a producer agreement, several key points should be considered. Here are some common negotiation points from the artist’s perspective:

  1. Scope of Engagement:
    1. The producer is generally engaged for one album project at a time.
    2. Clarify the scope of work, including the number of songs or recordings to be produced.
  2. Budget and Compensation:
    1. Determine the budget for the project (e.g., $10,000 in your case).
    2. Agree on the producer’s compensation, which may include an upfront fee, points on sales, or a combination.
  3. Ownership and Rights:
    1. Address ownership of the master recordings:
      1. Will the artist own and control the masters?
      2. Does the producer retain any rights?
  4. Credits and Royalties:
    1. Specify how the producer will be credited on releases.
    2. Discuss any royalties or backend points for the producer.
  5. Recording Schedule:
    1. Set a timeline for recording sessions.
    2. Define the two-week period for your project.
  6. Logistics:
    1. Determine where the recordings will take place.
    2. Outline the artist’s approval process.
    3. Define the producer’s deliverables.
  7. Termination Clause:
    1. Include provisions for termination if either party breaches the agreement.
    2. Address payment for work completed up to that point.

Remember that each negotiation is unique, and it’s essential to tailor the agreement to your specific needs and circumstances. Legal advice is recommended to ensure compliance with local laws and industry standards [1][3][4].

Source(s)

1. An Artist’s Guide to Negotiating Record Producer Agreements – Medium

2. Negotiating a Music Producer Agreement: 7 Key Issues

3. 5 Things to Look for in a Music Producer Agreement – Lawyer Drummer

4. How to Negotiate: Steps, Phrases & Tips to Win the Deal – Selling Signals

Notice that Copilot lists “Source(s)” and that source #1 is an article from the Medium website.

My article.

I’ve written a number of articles about negotiating producer agreements and I can categorically say I haven’t given permission or been asked to give permission for any of those articles to be scraped and used to train AI. So how did it come to be in Copilot’s training materials? 

My guess is that Medium (and probably a lot of other sites) were crawled by Bing’s spider and that copies of Medium pages were stored in Bing’s (or Microsoft’s) data centers. When Microsoft launched its Copilot AI tool, one could imagine that the temptation would have been strong to use those copies as part of the training tools for Copilot. In fact, the Verge reports that “companies around the web are using your site and its data to build massive sets of training data, in order to build models and products that may not acknowledge your existence at all.” Can you say Section 230 or DMCA? Maybe there will be a question at the April 11 House Energy & Commerce hearing on Section 230.

So Google-style “innovation” has a self-destructive flavor consistent with their oikophobic capitalism. Greater Silicon Valley’s view of culture in general and copyright in particular is as a source of wealth extracted by destruction–you know, creative destruction, permissionless innovation, etc. (It’s no wonder Google was inexorably attracted to defense contracting despite all the “don’t be evil” hoorah. After all, what creates massive wealth faster than convincing governments to pay big money to blow things up, which must then be replaced by ever more big money to blow even more things up?)

Are you surprised then that two of the biggest operators in the AI space are the search engine operators Google and Microsoft? This is another example of how Big Tech helps itself to your data and work product without you even knowing it’s happening. So now what? I now know I’m being ripped off, and I’m wondering if Medium is in on it.

The Verge tells us:

The ability to download, store, organize, and query the modern internet gives any company or developer something like the world’s accumulated knowledge to work with. In the last year or so, the rise of AI products like ChatGPT, and the large language models underlying them, have made high-quality training data one of the internet’s most valuable commodities. That has caused internet providers of all sorts to reconsider the value of the data on their servers, and rethink who gets access to what. 

Ya think?

In the fight for AI Justice, “The music industry is the tip of the spear” @MikeHuppe #IRespectMusic

Get smart about the NO AI Fraud Act with this MTP Mini Brief.

@RepMariaSalazar and @RepDean Introduce No AI Fraud Act to protect artists against AI Fakes #irespectmusic @human_artistry

Press Release

SUPPORT THE No AI FRAUD ACT

AI-Generated Fakes Threaten All Americans

New personalized generative artificial intelligence (AI) cloning models and services have enabled human impersonation and allow users to make unauthorized fakes using the images and voices of others. The abuse of this quickly advancing technology has affected everyone from musical artists to high school students whose personal rights have been violated.

AI-generated fakes and forgeries are everywhere. While AI holds incredible promise, Americans deserve common sense rules to ensure that a person’s voice and likeness cannot be exploited without their permission.

The Threat Is Here

Protection from AI fakes is needed now. We have already seen the kinds of harm these cloning models can inflict, and the problem won’t resolve itself.

From an AI-generated Drake/The Weeknd duet, to Johnny Cash singing “Barbie Girl,” to “new” songs by Bad Bunny that he never recorded, to a false dental plan endorsement featuring Tom Hanks, unscrupulous businesses and individuals are hijacking professionals’ voices and images, undermining the legitimate works and aspirations of essential contributors to American culture and commerce.

But AI fakes aren’t limited to famous icons. Last year, nonconsensual, intimate AI fakes of high school girls shook a New Jersey town. Such lewd and abusive AI fakes can be generated and disseminated with ease. And without prompt action, confusion will continue to grow about what is real, undermining public trust and risking harm to reputations, integrity, and human wellbeing.   

Inconsistent State Laws Aren’t Enough

The existing patchwork of state laws needs bolstering with a federal solution that provides baseline protections, offering meaningful recourse nationwide.

The No AI FRAUD Act Provides Needed Protection

The No AI Fake Replicas and Unauthorized Duplications (No AI FRAUD) Act of 2024 builds on effective elements of state and federal law to:

  • Reaffirm that everyone’s likeness and voice is protected, giving individuals the right to control the use of their identifying characteristics.
  • Empower individuals to enforce this right against those who facilitate, create, and spread AI frauds without their permission.
  • Balance these rights against the First Amendment to safeguard speech and innovation.

The No AI FRAUD Act is an important and necessary step to protect our valuable and unique personal identities.

What would Lars say? Artificial Intelligence: Nobel or RICO?

None of the true promise of AI requires violating the copyrights and rights of publicity of writers, artists, photographers, voice actors, etc. You know, stuff like reading MRIs and X-rays, developing pharmaceuticals, advanced compounds, new industrial processes, etc.

All the shitty aspects of AI DO require intentional mass copyright infringement (a RICO predicate, BTW). You know, stuff like bots, deep fakes, autogenerated “yoga mat” music, SEO manipulation, autogenerated sports coverage, commercial chat bots, fake student papers, graphic artist knockoffs, robot voice actors, etc. But that’s where the no-value-add-parasitic-free-rider-easy-money is to be made. That’s why the parasitic free-riding VCs and private equity want a “fair use” copyright exemption.

Policy makers should understand that if they want to reduce the potential harms of AI, they need to protect and reinforce the intellectual property rights of individuals. Those rights are a natural (and already existing) brake on harmful AI. What we don’t need is legislative intervention that makes it easier to infringe IP rights and then tries to mitigate the (easily predictable and obvious) harms with additional regulation.

This is what happened with Napster and internet 1.0. The DMCA copyright infringement safe harbor for platforms unleashed all sorts of negative externalities that were never fairly mitigated by subsequent regulation.

Why do songwriters get $0.0009 a stream on streaming platforms? Because the platforms used the threat of the DMCA copyright safe harbor being exploited by “bad actors” (often connected to the “good actors” via shared board members and investors*) to create a market failure that destroyed the value of songs. To “fix” the problem, federal legislation tasked the Copyright Royalty Board in the Library of Congress with setting royalty rates and forced songwriters to license to the digital platforms (songwriters cannot opt out). The rate-setting process was inevitably captured by the tech companies, and that’s how you end up with $0.0009 per stream.
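To put that rate in perspective, here is a quick back-of-the-envelope sketch. The $0.0009 figure is the approximate per-stream rate discussed above; the stream counts are hypothetical round numbers chosen for illustration:

```python
# Back-of-the-envelope: what a $0.0009-per-stream royalty pays out
# at a few hypothetical stream counts.
PER_STREAM_RATE = 0.0009  # approximate per-stream songwriter royalty (USD)

for streams in (100_000, 1_000_000, 10_000_000):
    payout = streams * PER_STREAM_RATE
    print(f"{streams:>12,} streams -> ${payout:>10,.2f}")
```

At that rate, a million streams pays roughly $900 before any splits among co-writers and publishers.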

To be fair, the DMCA safe harbor requires the platforms to accommodate “standard technical measures” to prevent unlicensed use of copyrighted works, but this part of the safe harbor was never implemented and the federal government never bothered to enforce this part of the law. This is the Napster playbook all over again.

1. Unleash a technology that you know will be exploited by bad actors**.

2. Ask for federal intervention that essentially legalizes the infringing behavior.

3. The federal legislation effectively creates a private monopoly or duopoly.

4. Trillions of dollars in wealth are transferred from creators to a tiny cabal of no-value-add-parasitic-free-rider-easy-money VCs in Silicon Valley.

5. Lots of handwringing about the plight of creators.

6. Bullshit legislation that claims to help creators but actually mandates a below market rate for creators.

The funny thing is that Lars Ulrich was right about Napster. [See our 2012 post Lars Was First and Lars Was Right.] At the time he was vilified by what was in reality a coordinated campaign, run by a DC communications firm (working for Silicon Valley VCs), that masqueraded as a grassroots operation.

But go back and watch the Charlie Rose debate between Lars Ulrich and Chuck D: everything Lars Ulrich said was going to happen, happened.

If Lars Ulrich hadn’t been cowed by a coordinated campaign by no-value-add-parasitic-free-rider-easy-money Silicon Valley VCs, he’d probably say the same thing about AI.

And he’d be right again.

Must Read Post by @ednewtonrex on Why He Resigned from Stability AI Over Fake Fair Use Defense

I’ve resigned from my role leading the Audio team at Stability AI, because I don’t agree with the company’s opinion that training generative AI models on copyrighted works is ‘fair use’. 

First off, I want to say that there are lots of people at Stability who are deeply thoughtful about these issues. I’m proud that we were able to launch a state-of-the-art AI music generation product trained on licensed training data, sharing the revenue from the model with rights-holders. I’m grateful to my many colleagues who worked on this with me and who supported our team, and particularly to Emad for giving us the opportunity to build and ship it. I’m thankful for my time at Stability, and in many ways I think they take a more nuanced view on this topic than some of their competitors. 

But, despite this, I wasn’t able to change the prevailing opinion on fair use at the company. 

This was made clear when the US Copyright Office recently invited public comments on generative AI and copyright, and Stability was one of many AI companies to respond. Stability’s 23-page submission included this on its opening page: 

“We believe that Al development is an acceptable, transformative, and socially-beneficial use of existing content that is protected by fair use”. 

For those unfamiliar with ‘fair use’, this claims that training an AI model on copyrighted works doesn’t infringe the copyright in those works, so it can be done without permission, and without payment. This is a position that is fairly standard across many of the large generative AI companies, and other big tech companies building these models — it’s far from a view that is unique to Stability. But it’s a position I disagree with. 

I disagree because one of the factors affecting whether the act of copying is fair use, according to Congress, is “the effect of the use upon the potential market for or value of the copyrighted work”. Today’s generative AI models can clearly be used to create works that compete with the copyrighted works they are trained on. So I don’t see how using copyrighted works to train generative AI models of this nature can be considered fair use. 

But setting aside the fair use argument for a moment — since ‘fair use’ wasn’t designed with generative AI in mind — training generative AI models in this way is, to me, wrong. Companies worth billions of dollars are, without permission, training generative AI models on creators’ works, which are then being used to create new content that in many cases can compete with the original works. I don’t see how this can be acceptable in a society that has set up the economics of the creative arts such that creators rely on copyright. 

To be clear, I’m a supporter of generative AI. It will have many benefits — that’s why I’ve worked on it for 13 years. But I can only support generative AI that doesn’t exploit creators by training models — which may replace them — on their work without permission. 

I’m sure I’m not the only person inside these generative AI companies who doesn’t think the claim of ‘fair use’ is fair to creators. I hope others will speak up, either internally or in public, so that companies realise that exp

Five Points for Potential AI Framework Agreements

By Chris Castle

This post first appeared on MusicTechPolicy

When you see Big Tech start to make Newspeak noises about wanting to license creative works for artificial intelligence, it’s well to remember a couple facts about how they treat people, business practices that they don’t talk about at parties. Or to Congress.

Take their supply chain, particularly their manufacturing supply chain in China, where some or all of their products use slave labor. And the cobalt that goes into every battery-powered device like your smartphone is obtained through the equally Newspeak “artisanal mining,” otherwise known as impossibly poor children mining cobalt by clawing it out of the dirt with their bare hands. You know, “artisanal.” (Read Cobalt Red by Siddharth Kara for that story.) Not to mention the grotesque and parasitic waste of electricity and the resources that provide it, whether they are crowding out public investment in renewables or driving coal-powered generators. They don’t talk about it because they feel entitled to all of it, which is to be expected from that feeder school for the Silicon Valley elites built with blood money from the Central Pacific Railroad.

So when you sit down at the negotiating table with these people, this is who they really are. That realization tells you a few things, but it mainly tells you they simply cannot be trusted in either life choices or business choices.

Universal has taken a real leadership role in the AI negotiations, one that both respects its artists and songwriters and gives teeth to the principles of the Human Artistry Campaign. First of all, the company has made it clear that it is going to support its artists and songwriters in having a meaningful seat at the table. It will not send its artists to the charnel house. The only artists who participate will be the artists who decide to participate–opt in rather than Google’s preferred “opt out” structure, which relies on the abuse of various safe harbors at scale.

It appears that until both the artists and songwriters and Universal are comfortable with the integrity of the creative and business model of YouTube’s AI music suite of tools, there’s no deal. Negotiations presumably will continue, so there may at least be a commercial framework.

To that end, here are five points that might prove useful.

  1. Artists and songwriters need to be at the table: One takeaway from the frozen mechanicals experience is how necessary it is for the creators to be included–not through an organization but as actual individuals who speak for themselves and are not influenced by lobbyists. Universal has proven that this is possible. This is a huge advancement in label-artist relations and publisher-writer relations, particularly because it’s obvious from the creators who stepped forward that these are articulate independent thinkers who are not going to toe the party line. That is the whole idea. If you don’t trust your artists and writers enough to give them freedom to speak their minds, then let’s face it–there’s something wrong with your business model.
  2. All AI licenses should be opt in: Most of YouTube’s many artist relations issues arise from artists not having the right and ability to withhold their work from a given platform. This is particularly true with UGC and advertising-supported platforms. When you have poured out your soul in a recording that ends up with ads for drugs or miracle hair replacement treatments, it’s deflating, and if anyone had asked for approval, you’d probably have declined. Which is why you negotiate marketing restrictions that prevent your music from being used in advertising.
  3. No blind check deals and no “big pool” royalties: We haven’t gotten to the royalty rates yet, but there will be riots in the streets if anyone tries to perpetuate YouTube-style accountings, the grotesquely unfair TikTok blind check deals, or “big pool” market-centric royalties. AI gives us all a chance to get it right and build a new system that is artist centric. It’s encouraging that Lucian Grainge’s blog post announcing the relationship with YouTube is entitled “An artist centric approach to AI innovation,” which is consistent with his prior statements about making streaming royalties more fair.
  4. Ability to track and account is a precondition: It should go without saying that in order to have meaningful royalty accounting, the service must have the ability to track and account. This is especially challenging in AI given the “training” issues. I will be pleasantly shocked if the Google engineers designing the music AI tools have not entirely ignored tracking and accounting, which they typically have viewed as a bug, not a feature. This is what gives rise to the blind check deals and other unworkable approaches, which are most definitely not “artist centric.” Accordingly, the ability to issue per-work reports is essential.
  5. Audits should be much more frequent: This new product is a chance to revisit the standard approaches to auditing which have unfortunately become perpetuated in digital deals and most prominently in the Music Modernization Act (Title I). There is not much difference between the MMA audit rights and the audit clause from a 30 year old record deal notwithstanding the vast difference in commerce between the two. With AI, not only have the DSPs blown up the album to a commercial singles world, they are now trying to blow up the single to mind-numbing fragmentation. Potentially, this world will be like selling stems. This ushers in a whole new need for minimum viable data laws and enforcement for using standard identifiers.

There will be many other issues to address, but I think if we don’t address these key points we’ll find ourselves to be artisanal workers scratching out a living for ChatGPT. 

Press Release: Human Artistry Campaign Launches, Announces Artificial Intelligence Principles to Sustain Artists

40+ groups representing artists, performers, writers, athletes & more launch campaign for AI that supports human creativity and accomplishment

WASHINGTON, DC / AUSTIN, TX (March 16, 2023) – A broad coalition announced the launch of the Human Artistry Campaign to ensure artificial intelligence technologies are developed and used in ways that support human culture and artistry – and not ways that replace or erode it. With more than 40 members including major unions, trade associations, and policy experts representing individual creators and rightsholders from across the entire tapestry of creative endeavor, the Human Artistry Campaign is positioned to be a leading voice in the rapidly unfolding debate over the costs and benefits of different forms of AI.

The group outlined principles advocating AI best practices, emphasizing respect for artists, their work, and their personas; transparency; and adherence to existing law including copyright and intellectual property. 

The campaign urges supporters to sign a petition to advance these fundamental principles.

The launch was announced at SXSW in Austin today at an event featuring voice actor and prolific songwriter Dan Navarro, GRAMMY-nominated singer-songwriter Jessy Wilson and UT Austin professor and immersive technology expert Erin Reilly – and moderated by Rob Levine, Billboard’s Deputy Editorial Director.

Core Principles for Artificial Intelligence Applications in Support of Human Creativity and Accomplishments

  1. Technology has long empowered human expression, and AI will be no different.

For generations, various technologies have been used successfully to support human creativity. Take music, for example… From piano rolls to amplification to guitar pedals to synthesizers to drum machines to digital audio workstations, beat libraries and stems and beyond, musical creators have long used technology to express their visions through different voices, instruments, and devices. AI is already playing, and will increasingly play, that role as a tool to assist the creative process, allowing a wider range of people to express themselves creatively.

Moreover, AI has many valuable uses outside of the creative process itself, including those that amplify fan connections, hone personalized recommendations, identify content quickly and accurately, assist with scheduling, automate and enhance efficient payment systems – and more. We embrace these technological advances. 

  2. Human-created works will continue to play an essential role in our lives.

Creative works shape our identity, values, and worldview. People relate most deeply to works that embody the lived experience, perceptions, and attitudes of others. Only humans can create and fully realize works written, recorded, created, or performed with such specific meaning. Art cannot exist independent of human culture.

  3. Use of copyrighted works, and use of the voices and likenesses of professional performers, requires authorization, licensing, and compliance with all relevant state and federal laws.

We fully recognize the immense potential of AI to push the boundaries for knowledge and scientific progress. However, as with predecessor technologies, the use of copyrighted works requires permission from the copyright owner. AI must be subject to free-market licensing for the use of works in the development and training of AI models. Creators and copyright owners must retain exclusive control over determining how their content is used. AI developers must ensure any content used for training purposes is approved and licensed from the copyright owner, including content previously used by any pre-trained AIs they may adopt. Additionally, performers’ and athletes’ voices and likenesses must only be used with their consent and fair market compensation for specific uses.

  4. Governments should not create new copyright or other IP exemptions that allow AI developers to exploit creators without permission or compensation.

AI must not receive exemptions from copyright law or other intellectual property laws and must comply with core principles of fair market competition and compensation. Creating special shortcuts or legal loopholes for AI would harm creative livelihoods, damage creators’ brands, and limit incentives to create and invest in new works.

  5. Copyright should only protect the unique value of human intellectual creativity.

Copyright protection exists to help incentivize and reward human creativity, skill, labor, and judgment, not output solely created and generated by machines. Human creators, whether they use traditional tools or express their creativity using computers, are the foundation of the creative industries and we must ensure that human creators are paid for their work.

  6. Trustworthiness and transparency are essential to the success of AI and protection of creators.

Complete recordkeeping of copyrighted works, performances, and likenesses, including the way in which they were used to develop and train any AI system, is essential. Algorithmic transparency and clear identification of a work’s provenance are foundational to AI trustworthiness. Stakeholders should work collaboratively to develop standards for technologies that identify the input used to create AI-generated output. In addition to obtaining appropriate licenses, content generated solely by AI should be labeled describing all inputs and methodology used to create it — informing consumer choices, and protecting creators and rightsholders.

  7. Creators’ interests must be represented in policymaking.

Policymakers must consider the interests of human creators when crafting policy around AI. Creators live on the forefront of, and are building and inspiring, evolutions in technology, and as such need a seat at the table in any conversations regarding legislation, regulation, or government priorities regarding AI that would impact their creativity and the way it affects their industry and livelihood.

About the Human Artistry Campaign: The Human Artistry Campaign was launched at SXSW 2023 for open dialogue and guidance from the united creative community in shaping the AI debate. Visit HumanArtistryCampaign.com to join.

Members include: AFL-CIO; American Association of Independent Music; American Federation of Musicians; Americana Music Association; American Photographic Artists; Artist Rights Alliance; Artist Rights Watch; ASCAP; Association of American Publishers; Authors Guild; Black Music Action Coalition; BPI; Christian Music Trade Association; Church Music Publishers Association; Concept Art Association; Department of Professional Employees, AFL-CIO; European Composer and Songwriter Alliance; Folk Alliance International; Future of Music Coalition; Georgia Music Partners; Global Music Rights; Gospel Music Association; Graphic Artists Guild; IFPI; International Federation of Actors; #IRespectMusic; Living Legends Foundation; MLB Players Association; Music Artists Coalition; Music Managers Forum – US; Music Tech Policy; Music Workers Alliance; National Music Publishers’ Association; News Media Alliance; NFL Players Association; NHL Players’ Association; Professional Photographers of America; Recording Academy; Recording Industry Association of America; Rhythm & Blues Foundation; SAG-AFTRA; SESAC; Songwriters of North America; SoundExchange and The Trichordist.

***

www.HumanArtistryCampaign.com

contact@humanartistrycampaign.com