
Category: Human Artistry Campaign
@Abbie_Llewelyn: UK Government defeated in House of Lords over protecting copyright from AI data scraping
[Good news on the AI fight, posted by the @artistrights Institute’s ArtistRightsWatch.com]
The Government has been defeated in the Lords over measures to protect creatives from having their copyrighted work used to train AI models without permission or remuneration. [The House of Lords is the “upper chamber” of the UK Parliament, similar to the US Senate.]
Peers [Members of the House of Lords] voted 145 to 126, majority 19, in favour of a package of amendments to the Data (Use and Access) Bill aiming to tackle the unauthorised use of intellectual property by big tech companies scraping data for AI.
Proposing the amendments, digital rights campaigner Baroness Kidron said they would help enforce existing property rights by improving transparency and laying out a redress procedure.
The measures would explicitly subject AI companies to UK copyright law, regardless of where they are based, reveal the names and owners of web crawlers that currently operate anonymously and allow copyright owners to know when, where and how their work is used.
Updates for Nov. 20 @ArtistRights Symposium at @AmericanU @KogodBiz in Washington DC

We are announcing the schedule and speakers for the 4th annual Artist Rights Symposium on November 20. The symposium is supported by the Artist Rights Institute and was founded by Dr. David C. Lowery, Lecturer at the University of Georgia Terry College of Business.
This year the symposium is hosted in Washington, DC, by American University’s Kogod School of Business at American’s Constitution Hall, 4400 Massachusetts Avenue, NW, Washington, DC 20016. We are also pleased to have a Kogod student presentation on speculative ticketing as part of the speaker lineup.
Admission is free, but please reserve a spot on Eventbrite; seating is limited!
The symposium starts at 8:30 am and ends with a reception at 4:30 pm. The symposium will be recorded as an audiovisual presentation for distribution at a later date, but it will not be live-streamed. If you attend, understand that you may appear in audience shots, questions from the floor, or still images. The symposium social media hashtag is #ArtistRightsKogod.
Schedule
8:30: Doors open, networking coffee.
9:00-9:10: Welcome remarks by David Marchick, Dean, Kogod School of Business
9:10-9:15: Welcome remarks by Christian L. Castle, Esq., Director, Artist Rights Institute
9:15-10:15: THE TROUBLE WITH TICKETS: The Challenges of Ticket Resellers and Legislative Solutions:
Kevin Erickson, Director, Future of Music Coalition, Washington DC
Dr. David C. Lowery, Co-founder of Cracker and Camper Van Beethoven, University of Georgia Terry College of Business, Athens, Georgia
Stephen Parker, Executive Director, National Independent Venue Association, Washington DC
Mala Sharma, President, Georgia Music Partners, Atlanta, Georgia
Moderator: Christian L. Castle, Esq., Director, Artist Rights Institute, Austin, Texas
10:15-10:30: NIVA Speculative Ticketing Project Presentation by Kogod students
10:30-10:45: Coffee break
10:45-11:00: OVERVIEW OF CURRENT ISSUES IN ARTIFICIAL INTELLIGENCE LITIGATION: Kevin Madigan, Vice President, Legal Policy and Copyright Counsel, Copyright Alliance
11:00-12:00: SHOW ME THE CREATOR – Transparency Requirements for AI Technology:
Danielle Coffey, President & CEO, News Media Alliance, Arlington, Virginia
Dahvi Cohen, Legislative Assistant, U.S. Congressman Adam Schiff, Washington, DC
Ken Doroshow, Chief Legal Officer, Recording Industry Association of America, Washington DC
Moderator: Linda Bloss-Baum, Director of the Kogod School of Business’s Business & Entertainment Program
12:00-12:30: Lunch break
12:30-1:30: Keynote: Graham Davies, President and CEO of the Digital Media Association, Washington DC.
1:30-1:45: Coffee break
1:45-2:45: CHICKEN AND EGG SANDWICH: Bad Song Metadata, Unmatched Funds, KYC and What You Can Do About It
Richard James Burgess, MBE, President & CEO, American Association of Independent Music, New York
Helienne Lindvall, President, European Composer & Songwriter Alliance, London, England
Abby North, President, North Music Group, Los Angeles
Anjula Singh, Chief Financial Officer and Chief Operating Officer, SoundExchange, Washington DC
Moderator: Christian L. Castle, Esq., Director, Artist Rights Institute, Austin, Texas
2:45-3:15: Reconvene across the street to the International Service Founders Room for concluding speakers and reception
3:15-3:30: OVERVIEW OF INTERNATIONAL ARTIFICIAL INTELLIGENCE LEGISLATION: George York, Senior Vice President, International Policy, RIAA
3:30-4:30: NAME, IMAGE AND LIKENESS RIGHTS IN THE AGE OF AI: Current initiatives to protect creator rights and attribution
Jeffrey Bennett, General Counsel, SAG-AFTRA, Washington, DC
Jen Jacobsen, Executive Director, Artist Rights Alliance, Washington DC
Jalyce E. Mangum, Attorney-Advisor, U.S. Copyright Office, Washington DC
Moderator: John Simson, Program Director Emeritus, Business & Entertainment, Kogod School of Business, American University
4:30-5:30: Concluding remarks by Linda Bloss-Baum, Director of the Kogod School of Business’s Business & Entertainment Program, followed by a reception.
CHICKEN AND EGG SANDWICH: Bad Song Metadata, Unmatched Funds, KYC and What You Can Do About It: Speaker Update for Nov. 20 @ArtistRights Symposium at @AmericanU @KogodBiz in Washington DC
We’re pleased to announce additional speakers for the 4th annual Artist Rights Symposium on November 20, this year hosted in Washington, DC, by American University’s Kogod School of Business at American’s Constitution Hall, 4400 Massachusetts Avenue, NW, Washington, DC 20016. The symposium is also supported by the Artist Rights Institute and was founded by Dr. David Lowery, Lecturer at the University of Georgia Terry College of Business.
The Symposium has four panels and a lunchtime keynote. Panels will begin at 8:30 am and end by 5 pm, with lunch and refreshments. More details to follow. Contact the Artist Rights Institute for any questions.
Admission is free, but please reserve a spot on Eventbrite; seating is limited! (Eventbrite works best with Firefox)
Keynote: Graham Davies, President and CEO of the Digital Media Association, Washington DC. Graham will speak around lunchtime.
We have confirmed speakers for another topic!
CHICKEN AND EGG SANDWICH: Bad Song Metadata, Unmatched Funds, KYC and What You Can Do About It
Richard James Burgess, MBE, President & CEO, American Association of Independent Music, New York
Helienne Lindvall, President, European Composer & Songwriter Alliance, London, England
Abby North, President, North Music Group, Los Angeles
Anjula Singh, Chief Financial Officer and Chief Operating Officer, SoundExchange, Washington DC
Moderator: Christian L. Castle, Esq., Director, Artist Rights Institute, Austin, Texas
Previously confirmed panelists are:
SHOW ME THE CREATOR – Transparency Requirements for AI Technology:
Danielle Coffey, President & CEO, News Media Alliance, Arlington, Virginia
Dahvi Cohen, Legislative Assistant, U.S. Congressman Adam Schiff, Washington, DC
Ken Doroshow, Chief Legal Officer, Recording Industry Association of America, Washington DC
Moderator: Linda Bloss-Baum, Director of the Kogod School of Business’s Business & Entertainment Program
THE TROUBLE WITH TICKETS: The Economics and Challenges of Ticket Resellers and Legislative Solutions:
Kevin Erickson, Director, Future of Music Coalition, Washington DC
Dr. David C. Lowery, Co-founder of Cracker and Camper Van Beethoven, University of Georgia Terry College of Business, Athens, Georgia
Stephen Parker, Executive Director, National Independent Venue Association, Washington DC
Mala Sharma, President, Georgia Music Partners, Atlanta, Georgia
Moderator: Christian L. Castle, Esq., Director, Artist Rights Institute, Austin, Texas
@human_artistry Press Release: Senators Introduce COPIED Act to Combat AI Deepfakes
Senators Cantwell, Blackburn, and Heinrich introduce the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), giving artists new tools to protect against deepfakes.
“Deepfakes pose an existential threat to our culture and society, making it hard to believe what we see and hear and leaving individual creators vulnerable as tech companies use our art without consent while AI-generated content leads to confusion about what is real. Requiring transparency is a meaningful step that will help protect us all – ensuring that nonconsensual, harmful content can be removed quickly and providing a clear origin when our life’s work has been used.” – Dr. Moiya McTier, Human Artistry Campaign Senior Advisor
With widespread creative community support from organizations including the Artist Rights Alliance, SAG-AFTRA, the Recording Academy, RIAA, NMPA, NSAI, and more, the bill would set new federal transparency guidelines for marking, authenticating and detecting AI-generated content, protect journalists, actors and artists against AI-driven theft, and hold violators accountable for abuses.
- Creates Transparency Standards: Requires the National Institute of Standards and Technology (NIST) to develop guidelines and standards for content provenance information, watermarking and synthetic content detection. These standards will promote transparency to identify if content has been generated or manipulated by AI, as well as where AI content originated. The bill also directs NIST to develop cybersecurity measures to prevent tampering with provenance and watermarking on AI content.
- Puts Journalists, Artists and Musicians in Control of Their Content: Requires providers of AI tools used to generate creative or journalistic content to allow owners of that content to attach provenance information to it and prohibits its removal. The bill prohibits the unauthorized use of content with provenance information to train AI models or generate AI content. These measures give content owners—journalists, newspapers, artists, songwriters, and others—the ability to protect their work and set the terms of use for their content, including compensation.
- Gives Individuals a Right to Sue Violators: Authorizes the Federal Trade Commission (FTC) and state attorneys general to enforce the bill’s requirements. It also gives newspapers, broadcasters, artists, and other content owners the right to bring suit in court against platforms or others who use their content without permission.
- Prohibits Tampering with or Disabling AI Provenance Information: Currently, there is no law that prohibits removing, disabling, or tampering with content provenance information. The bill prohibits anyone, including internet platforms, search engines and social media companies, from interfering with content provenance information in these ways.
@human_artistry: Scarlett Johansson calls for federal protections against generative AI
The reality is that this could happen to anyone, but like other online pirates, they come for the famous people first to free-ride on their brand.
@SAGAFTRA and Major Labels Reach a New Sound Recording Agreement With AI Protections
Looks like both sides listened to and respected each other, so no strike was necessary. And yet there are groundbreaking first-time AI protections for musicians and vocalists. Amazing what can happen when you actually respect the workers (looking at you, YouTube). And guess what? The workers get to vote on the deal. Imagine if songwriters could do the same.
PRESS RELEASE
SAG-AFTRA and leading record labels reached a tentative multiyear agreement on a successor contract to the SAG-AFTRA National Code of Fair Practice for Sound Recordings.
Covering the period starting Jan. 1, 2021, and ending Dec. 31, 2026, the agreement includes Warner Music Group, Sony Music Entertainment, Universal Music Group and Disney Music Group.
The SAG-AFTRA Executive Committee unanimously approved the tentative agreement on Wednesday, April 10. It will now be sent to members for ratification.
SAG-AFTRA National Executive Director & Chief Negotiator Duncan Crabtree-Ireland said, “SAG-AFTRA and the music industry’s largest record labels have reached a groundbreaking agreement establishing, for the first time, collective bargaining guardrails assuring singers and recording artists ethical and responsible treatment in the use of artificial intelligence in the music industry. It is a testament to our mutual unwavering commitment to work together to safeguard the rights, dignity and creative freedom of our members.
“This agreement ensures that our members are protected. SAG-AFTRA stands firm in the belief that while technology can enhance the creative process, the essence of music must always be rooted in genuine human expression and experience. We look forward to working alongside our industry partners to foster an environment where innovation serves to elevate, not diminish, the unique value of each artist’s contribution to our rich cultural tapestry.”
The Record Label Negotiating Committee said, “We are pleased to reach this agreement with SAG-AFTRA and continue our strong partnership as we enter this exciting and fast-moving new era for music and artists. Together, we’ll chart a successful course forward, embracing new opportunities and facing our common challenges, strengthened by our shared values and commitment to human artistry.”
The artificial intelligence guardrails take effect immediately upon ratification. The terms “artist,” “singer,” and “royalty artist” under this agreement include only humans. In this agreement, clear and conspicuous consent, along with minimum compensation requirements and specific details of intended use, is required prior to the release of a sound recording that uses a digital replication of an artist’s voice.
Additional highlights among the contract gains include increased minimums, health and retirement improvements, and an increase in the percentage of streaming revenue to be covered by contributions, among other gains.
Search is Ripping Us Off Again, this time for AI
By Chris Castle
In case you think that infringement of copyrights in AI training happens only to rock stars, photographers, record companies or movie studios, let me give you a personal anecdote that might cause you to reconsider. I asked Bing’s Copilot AI tool to list the deal points in a record producer agreement, and this is what I got back:
Copilot
Certainly! When negotiating a producer agreement, several key points should be considered. Here are some common negotiation points from the artist’s perspective:
- Scope of Engagement:
  - The producer is generally engaged for one album project at a time.
  - Clarify the scope of work, including the number of songs or recordings to be produced.
- Budget and Compensation:
  - Determine the budget for the project (e.g., $10,000 in your case).
  - Agree on the producer’s compensation, which may include an upfront fee, points on sales, or a combination.
- Ownership and Rights:
  - Address ownership of the master recordings:
    - Will the artist own and control the masters?
    - Does the producer retain any rights?
- Credits and Royalties:
  - Specify how the producer will be credited on releases.
  - Discuss any royalties or backend points for the producer.
- Recording Schedule:
  - Set a timeline for recording sessions.
  - Define the two-week period for your project.
- Logistics:
  - Determine where the recordings will take place.
  - Outline the artist’s approval process.
  - Define the producer’s deliverables.
- Termination Clause:
  - Include provisions for termination if either party breaches the agreement.
  - Address payment for work completed up to that point.
Remember that each negotiation is unique, and it’s essential to tailor the agreement to your specific needs and circumstances. Legal advice is recommended to ensure compliance with local laws and industry standards [1][3][4].
Source(s)
1. An Artist’s Guide to Negotiating Record Producer Agreements – Medium
2. Negotiating a Music Producer Agreement: 7 Key Issues
3. 5 Things to Look for in a Music Producer Agreement – Lawyer Drummer
4. How to Negotiate: Steps, Phrases & Tips to Win the Deal – Selling Signals
Notice that Copilot lists “Source(s)” and that source #1 is an article from the Medium website.
My article.
I’ve written a number of articles about negotiating producer agreements and I can categorically say I haven’t given permission or been asked to give permission for any of those articles to be scraped and used to train AI. So how did it come to be in Copilot’s training materials?
My guess is that Medium (and probably a lot of other sites) were crawled by Bing’s spider and that copies of Medium pages were stored in Bing’s (or Microsoft’s) data centers. When Microsoft launched its Copilot AI tool, one could imagine that the temptation would have been strong to use those copies as part of the training tools for Copilot. In fact, the Verge reports that “companies around the web are using your site and its data to build massive sets of training data, in order to build models and products that may not acknowledge your existence at all.” Can you say Section 230 or DMCA? Maybe there will be a question at the April 11 House Energy & Commerce hearing on Section 230.
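As an aside, and not something the Copilot episode itself confirms, one blunt instrument publishers do have for talking to crawlers is the robots.txt convention. Below is a minimal, purely illustrative Python sketch (standard library only) that checks what a site’s robots.txt says to a few well-known AI-related crawler user agents; the article URL in it is hypothetical, and robots.txt is advisory only, so it neither enforces anything nor undoes copies a crawler has already stored.

```python
# Minimal sketch: ask a site's robots.txt whether some well-known
# AI-related crawler user agents may fetch a given page.
# robots.txt expresses the publisher's wishes; it does not enforce them.
from urllib.robotparser import RobotFileParser
from urllib.parse import urlparse

# Crawler user-agent tokens publishers commonly address: OpenAI's GPTBot,
# Common Crawl's CCBot, Google's AI-training control token, and Bing's bingbot.
AI_CRAWLERS = ["GPTBot", "CCBot", "Google-Extended", "bingbot"]

def crawler_permissions(page_url: str) -> dict:
    """Return {user_agent: allowed?} for page_url per the site's robots.txt."""
    parts = urlparse(page_url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"
    rp = RobotFileParser()
    rp.set_url(robots_url)
    rp.read()  # fetch and parse the site's robots.txt
    return {ua: rp.can_fetch(ua, page_url) for ua in AI_CRAWLERS}

if __name__ == "__main__":
    # Hypothetical article URL, used only for illustration.
    page = "https://medium.com/@example/producer-agreements"
    for agent, allowed in crawler_permissions(page).items():
        print(f"{agent}: {'allowed' if allowed else 'disallowed'}")
```

Of course, by the time a writer thinks to check any of this, the scraping may already have happened.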
So Google-style “innovation” has a self-destructive flavor consistent with their oikophobic capitalism. Greater Silicon Valley’s view of culture in general and copyright in particular is as sources of wealth extracted by destruction–you know, creative destruction, permissionless innovation, etc. (It’s no wonder Google was inexorably attracted to defense contracting despite all the “don’t be evil” hoorah. After all, what creates massive wealth faster than convincing governments to pay big money to blow things up that must be replaced by ever more big money to blow even more things up?)
Are you surprised then that two of the biggest operators in the AI space are the search engine operators Google and Microsoft? This is another example of how Big Tech helps itself to your data and work product without you even knowing it’s happening. So now what? I now know I’m being ripped off, and I’m wondering if Medium is in on it.
The Verge tells us:
The ability to download, store, organize, and query the modern internet gives any company or developer something like the world’s accumulated knowledge to work with. In the last year or so, the rise of AI products like ChatGPT, and the large language models underlying them, have made high-quality training data one of the internet’s most valuable commodities. That has caused internet providers of all sorts to reconsider the value of the data on their servers, and rethink who gets access to what.
Ya think?


