Search is Ripping Us Off Again, this time for AI

By Chris Castle

In case you think that infringement of copyrights in AI training happens only to rock stars, photographers, record companies or movie studios, let me give you a personal anecdote that might cause you to reconsider. I asked Bing’s Copilot AI tool to list the deal points in a record producer agreement, and this is what I got back:

Copilot

Certainly! When negotiating a producer agreement, several key points should be considered. Here are some common negotiation points from the artist’s perspective:

  1. Scope of Engagement:
    1. The producer is generally engaged for one album project at a time.
    2. Clarify the scope of work, including the number of songs or recordings to be produced.
  2. Budget and Compensation:
    1. Determine the budget for the project (e.g., $10,000 in your case).
    2. Agree on the producer’s compensation, which may include an upfront fee, points on sales, or a combination.
  3. Ownership and Rights:
    1. Address ownership of the master recordings:
      1. Will the artist own and control the masters?
      2. Does the producer retain any rights?
  4. Credits and Royalties:
    1. Specify how the producer will be credited on releases.
    2. Discuss any royalties or backend points for the producer.
  5. Recording Schedule:
    1. Set a timeline for recording sessions.
    2. Define the two-week period for your project.
  6. Logistics:
    1. Determine where the recordings will take place.
    2. Outline the artist’s approval process.
    3. Define the producer’s deliverables.
  7. Termination Clause:
    1. Include provisions for termination if either party breaches the agreement.
    2. Address payment for work completed up to that point.

Remember that each negotiation is unique, and it’s essential to tailor the agreement to your specific needs and circumstances. Legal advice is recommended to ensure compliance with local laws and industry standards [1][3][4].

Source(s)

1. An Artist’s Guide to Negotiating Record Producer Agreements – Medium

2. Negotiating a Music Producer Agreement: 7 Key Issues

3. 5 Things to Look for in a Music Producer Agreement – Lawyer Drummer

4. How to Negotiate: Steps, Phrases & Tips to Win the Deal – Selling Signals

Notice that Copilot lists “Source(s)” and that source #1 is an article from the Medium website.

My article.

I’ve written a number of articles about negotiating producer agreements and I can categorically say I haven’t given permission or been asked to give permission for any of those articles to be scraped and used to train AI. So how did it come to be in Copilot’s training materials? 

My guess is that Medium (and probably a lot of other sites) were crawled by Bing’s spider and that copies of Medium pages were stored in Bing’s (or Microsoft’s) data centers. When Microsoft launched its Copilot AI tool, one could imagine that the temptation would have been strong to use those copies as part of the training tools for Copilot. In fact, the Verge reports that “companies around the web are using your site and its data to build massive sets of training data, in order to build models and products that may not acknowledge your existence at all.” Can you say Section 230 or DMCA? Maybe there will be a question at the April 11 House Energy & Commerce hearing on Section 230.
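As an aside on the mechanics: the main lever a site currently has is the Robots Exclusion Protocol. A minimal sketch of a robots.txt that opts out of the publicly documented AI training crawlers (the user-agent tokens shown are the ones OpenAI and Common Crawl have published; whether any given trainer actually honors them is another matter):

```
# robots.txt — opt out of known AI training crawlers

# OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Common Crawl, a frequent source of training corpora
User-agent: CCBot
Disallow: /

# Note: a general-purpose search spider (e.g. Bingbot) is not blocked here.
# That is exactly the gap described above: search crawling and AI training
# can ride on the same stored copies.
```

The limitation, of course, is that this only governs future crawls by crawlers that choose to comply; it does nothing about pages already copied into a data center.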

So Google-style “innovation” has a self-destructive flavor consistent with their oikophobic capitalism. Greater Silicon Valley views culture in general, and copyright in particular, as a source of wealth to be extracted through destruction: you know, creative destruction, permissionless innovation, etc. (It’s no wonder Google was inexorably attracted to defense contracting despite all the “don’t be evil” hoorah. After all, what creates massive wealth faster than convincing governments to pay big money to blow things up, which must then be replaced with ever more big money to blow even more things up?)

Are you surprised then that two of the biggest operators in the AI space are the search engine operators Google and Microsoft? This is another example of how Big Tech helps itself to your data and work product without you even knowing it’s happening. So now what? I now know I’m being ripped off, and I’m wondering if Medium is in on it.

The Verge tells us:

The ability to download, store, organize, and query the modern internet gives any company or developer something like the world’s accumulated knowledge to work with. In the last year or so, the rise of AI products like ChatGPT, and the large language models underlying them, have made high-quality training data one of the internet’s most valuable commodities. That has caused internet providers of all sorts to reconsider the value of the data on their servers, and rethink who gets access to what. 

Ya think?

What would Lars say? Artificial Intelligence: Nobel or RICO?

None of the true promise of AI requires violating the copyrights and rights of publicity of writers, artists, photographers, voice actors and the rest. You know, stuff like reading MRIs and X-rays, developing pharmaceuticals, advanced compounds, new industrial processes, etc.

All the shitty aspects of AI DO require intentional mass copyright infringement (a RICO predicate, BTW). You know, stuff like bots, deep fakes, autogenerated “yoga mat” music, SEO manipulation, autogenerated sports coverage, commercial chat bots, fake student papers, graphic artist knockoffs, robot voice actors, etc. But that’s where the no-value-add-parasitic-free-rider-easy-money is to be made. That’s why the parasitic free-riding VCs and private equity want a “fair use” copyright exemption.

Policy makers should understand that if they want to reduce the potential harms of AI they need to protect and reinforce intellectual property rights of individuals. It is a natural (and already existing) brake on harmful AI. What we don’t need is legislative intervention that makes it easier to infringe IP rights and then try to mitigate (the easily predictable and obvious) harms with additional regulation.

This is what happened with Napster and internet 1.0. The DMCA copyright infringement safe harbor for platforms unleashed all sorts of negative externalities that were never fairly mitigated by subsequent regulation.

Why do songwriters get $0.0009 a stream on streaming platforms? Because the platforms used the threat of the DMCA copyright safe harbor by “bad actors” (often connected to the “good actors” via shared board members and investors*) to create a market failure that destroyed the value of songs. To “fix” the problem, federal legislation tasks the Copyright Royalty Board at the Library of Congress with setting royalty rates and forces songwriters to license to the digital platforms (songwriters cannot opt out). The rate-setting process was inevitably captured by the tech companies, and that’s how you end up with $0.0009 per stream.
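To put that $0.0009 figure in perspective, the arithmetic is worth doing once. A minimal sketch (the rate is the per-stream figure cited above; the dollar targets are hypothetical illustrations):

```python
import math

# Approximate per-stream songwriter royalty cited in the post
RATE_PER_STREAM = 0.0009

def payout(streams: int) -> float:
    """Gross songwriter royalty for a given stream count."""
    return streams * RATE_PER_STREAM

def streams_needed(target_dollars: float) -> int:
    """Streams required to gross a target dollar amount at this rate."""
    return math.ceil(target_dollars / RATE_PER_STREAM)

print(payout(1_000_000))       # a million streams grosses about $900
print(streams_needed(50_000))  # tens of millions of streams to gross $50,000
```

In other words, at that rate a million streams grosses roughly $900 before any splits, which is the scale of the market failure being described.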

TBF, the DMCA safe harbor requires the platforms to accommodate “standard technical measures” to prevent unlicensed use of copyrighted works, but this part of the DMCA safe harbor was never implemented and the federal government never bothered to enforce it. This is the Napster playbook all over again.

1. Unleash a technology that you know will be exploited by bad actors**.

2. Ask for federal intervention that essentially legalizes the infringing behavior.

3. The federal legislation effectively creates private monopoly or duopoly.

4. Trillions of dollars in wealth transferred from creators to a tiny cabal of no-value-add-parasitic-free-rider-easy-money VCs in Silicon Valley.

5. Lots of handwringing about the plight of creators.

6. Bullshit legislation that claims to help creators but actually mandates a below market rate for creators.

The funny thing is that Lars Ulrich was right about Napster. [See our 2012 post Lars Was First and Lars Was Right.] At the time he was vilified by what was in reality a coordinated campaign, run by a DC communications firm working for Silicon Valley VCs, that masqueraded as a grassroots operation.

But go back and watch the Charlie Rose debate between Lars Ulrich and Chuck D: everything Lars Ulrich said was gonna happen, happened.

If Lars Ulrich hadn’t been cowed by a coordinated campaign by no-value-add-parasitic-free-rider-easy-money Silicon Valley VCs, he’d probably say the same thing about AI.

And he’d be right again.

@justinebateman wants to see AI contract terms in actor agreements, and music folk should be checking, too

Chris Castle says: If you are seeing contract language that allows the other side to use your name, image, likeness, voice, etc., take a close look at the rights granted. It may not be that obvious. Like all other configuration changes in the past, artists, producers and songwriters need to look at their existing agreements and see how old language will be interpreted to cover AI.

Example: Label will have (i) exclusive record artwork merchandising rights; and (ii) exclusive rights to sell merchandise embodying three (3) exclusive designs per contract period, alone and in conjunction with Artist’s names and approved pictures, likenesses and other identifications, subject to Artist’s approval with respect to such matters as product design and manufacturing.

“in conjunction with”, “other identifications” and “approval with respect to such matters as” each take on new meaning.

Example: Artist acknowledges that Label is the exclusive owner of all rights of copyright in Masters and Records embodying the results and proceeds of Artist’s recording services made pursuant to the Recording Agreement or during its term, including the exclusive right to copyright same as “sound recordings” in the name of Label, to renew and extend such copyrights (and all rights in and thereto are hereby assigned to Label), and to exercise all rights of the copyright proprietor thereunder as provided in the Recording Agreement.

Recordings “made pursuant to the Recording Agreement or during its term” could mean AI works.

Example: As used in this agreement, “Other Entertainment Services” shall mean any and all entertainment industry activity that are not otherwise provided for in this recording agreement, including, without limitation, the following: (a) the exploitation in any and all media of the name(s) likeness(es), visual representations, biographical material and/or logo(s) of or relating to Artist or any member of Artist (all of the intellectual properties relating to Artist referred to above are sometimes referred to herein collectively and individually as “Artist Properties”), either alone or in conjunction with other elements, including without limitation merchandise for sale at the site(s) of any and all live concert engagements performed by Artist or any member of Artist, premiums such as products which bear a third party’s trademarks or logos together with Artist Properties, tie-ins, “bounceback” merchandising, and fan club merchandise, whether or not in connection with Master Recordings, including, without limitation, exploitation by any Person other than Label of any rights granted in this recording agreement; (b) endorsements, special marketing arrangements, sponsorships (including tour sponsorships), strategic partnerships or other business relationships with third parties; (c) live performance engagements as a musician, vocalist and/or performer by the Artist or any member of Artist in all media, including but not limited to musical performances on tour, in concerts, on television broadcast or cable casts (including pay-per-view telecasts), radio, “webcast” and all other means.

Fans and Trust and Trust by Fans Are Essential for AI to Succeed

By Chris Castle

[This post first appeared on MusicTechPolicy]

We are told that artificial intelligence is a powerful tool that may end up being either the end of humanity through automated super soldiers making autonomous decisions regarding their own AI devised rules of engagement, or life saving medical procedures and diagnostic tools like House meets HAL. As usual–both outcomes are probably equally likely if humanity doesn’t keep the deus in the machina. We really don’t want them thinking “Hell is other machines.”

The question I have is how we will keep humanity around when companies like Google are hell-bent on achieving the Singularity ASAP. This is particularly true for creators: let’s not kid ourselves that the Google Books project was some altruistic effort to build the digital library of Alexandria rather than a massive digitization project to build a large language model to train artificial intelligence through corpus machine translation. And it still is. As Kurt Sutter (showrunner for Sons of Anarchy) taught us about Google, “[t]he truth is, they don’t give a shit about free speech, and are the antithesis of their own mantra, ‘Don’t be evil.’” That was 2014, and boy was he right. And he still is. It’s not just Google, but Google is emblematic of Silicon Valley.

One of the lessons we learned from the 1990s is that the cavalry is not coming. We have to take our own steps to work both cooperatively and defensively against a tech threat. The Human Artistry Campaign and its AI Principles are a hopeful indicator that the creative community and its partners are coming together to get ahead of both the threat and the promise of AI.

Let’s not forget that it’s not just about us, it’s also about the fan, our “consumers” if you will. The biggest threat to creators in my view is destroying the relationship of trust that exists between fans and creators. If AI can allow a machine to impersonate a creator, that deception harms the creator, surely. But it also harms the fan. 

One of the AI principles from the Human Artistry Campaign jumped out at me as addressing this vital issue:

  • Trustworthiness and transparency are essential to the success of AI and protection of creators. 

Complete recordkeeping of copyrighted works, performances, and likenesses, including the way in which they were used to develop and train any AI system, is essential. Algorithmic transparency and clear identification of a work’s provenance are foundational to AI trustworthiness. Stakeholders should work collaboratively to develop standards for technologies that identify the input used to create AI-generated output. In addition to obtaining appropriate licenses, content generated solely by AI should be labeled describing all inputs and methodology used to create it — informing consumer choices, and protecting creators and rightsholders. 

Informing consumer choices. For a moment, forget the artistic integrity, forget the human intervention, forget the free riding, just for a moment, because these are all vital issues, too. At the core of the AI problem is deception, and that issue is as old as time. You simply cannot deceive fans about the origin of a work, and you certainly can’t build a machine that does this all the livelong day and pretend you didn’t.

In Book 2 of Plato’s Republic, he uses the legend of a magic ring that turns its bearer invisible to illustrate a dialogue on the nature of justice. While invisible, or anonymous, the wearer is capable of doing all manner of things that would clearly be both unjust and punishable without the ring. Plato asks whether an act is unjust only if you get caught, or whether it is unjust regardless of whether you are hidden from sight or apprehension. Yep, those Greeks were onto this early.

Deception is not genius. At the core of our concerns about AI is keeping the machines honest to protect our fans and the bedrock of the creator-fan relationship. Consumers should be able to rely on what appears to be an artist’s work actually coming from that artist.

We do this with almost any other product or service that is placed into commerce, so why not with creative works? After all, artist rights are human rights.

We were happy to endorse the AI principles and encourage you to find out more about it at the Human Artistry Campaign or Artist Rights Watch and sign the petition.

National Association of Voice Actors: AI/Synthetic Voice Rider–Don’t lose your voice forever

[Chris Castle says: It’s like the antichrist without the morals. Voice over actors are being attacked by purveyors of artificial intelligence so that their voices can be re-used without compensation, even if they never consented or at least never objected. Not only that, but voices can be used to train AI to speak in a completely different context. This is way worse than Netflix composer buyouts.

Check your name/image/likeness clauses folks–voice actors will not be the only ones caught up in the AI hellscape.]

AN OPEN LETTER FROM NAVA AND THE VOCAL VARIANTS TO THE VOICE OVER COMMUNITY

AI or synthetic voices are on the rise. We’re a group of concerned voice actors working with union and non-union performers alike to make sure we don’t lose our voices forever by signing away our rights to various companies. Long story short, any contract that allows a producer to use your voice forever in all known media (and any new media developed in the future) across the universe is one we want to avoid.

So we have put together some things we can all do to avoid the decimation of our industry.

Read the post on NAVA Voices site and stay in touch with your unions.