@human_artistry Campaign Letter Opposing AI Safe Harbor Moratorium in Big Beautiful Bill HR 1

Artist Rights Institute is pleased to support the Human Artistry Campaign’s letter to Senators Thune and Schumer opposing the AI safe harbor in the One Big Beautiful Bill Act. ARI joins the letter’s coalition of signatories.

The opposition is rooted in eminently justifiable concerns:

By wiping dozens of state laws off the books, the bill would undermine public safety, creators’ rights, and the ability of local communities to protect themselves from a fast-moving technology that is being rushed to market by tech giants. State laws protecting people from invasive AI deepfakes would be at risk, along with a range of proposals designed to eliminate discrimination and bias in AI. For artists and creators, the stakes are just as high: preempting state laws that require Big Tech to disclose the material used to train their models, often to create new products that compete with the human creators’ originals, would make it difficult or impossible to prove this theft has occurred. As the Copyright Office’s recent Fair Use Report reaffirmed, many forms of this conduct are illegal under longstanding federal law.

The moratorium is so vague that it is unclear whether it would prohibit states from addressing the construction of data centers or the vast drain that AI deployment places on state power grids. This is a safe harbor on steroids and terrible for all creators.

@JayGilbert Discusses Record Release Marketing Strategies

Our friend and longtime music marketing consultant Jay Gilbert sits down with Chris Castle to discuss release planning and strategies in Part 3 of the Artist Rights Institute’s Record Release Checklist. You may have seen Jay on podcasts like Your Morning Coffee, Behind the Setlist (with Glenn Peoples), and Michael Brandvold’s Music Biz Weekly.

Jay discusses his excellent Release Planner and has made a copy available for download on the Artist Rights Institute’s Artist Financial Education vertical. You can also listen to the episode on The Artist Rights Watch podcast.

Don’t miss Parts 1 and 2 on getting your record ready, covering legal and business issues, available on the Financial Education vertical here and here, and the checklist for YouTube videos here.

Massive State Opposition to AI Regulation Safe Harbor Moratorium in the ‘One Big Beautiful Bill Act’ (H.R. 1)

As of this morning, the loathsome AI safe harbor is still in the ‘One Big Beautiful Bill Act’ (H.R. 1) as far as we can tell. A “manager’s amendment” is supposed to be released this morning, which will include any changes made in last night’s late-night session. Watch this space at House Rules for when that manager’s amendment is released; you will be looking for Section 43201 around page 292. As far as we can tell, the safe harbor is still in there, which is not surprising given that it comes from David Sacks, the Silicon Valley Viceroy and White House Crypto Czar.

But the safe harbor—a 10-year moratorium on state and local regulation of artificial intelligence (AI)—has ignited significant opposition from a broad coalition of state officials, lawmakers, and organizations. As you would expect, opponents argue that this measure would undermine existing protections and hinder the ability of states to address AI-related harms. Despite the fact that it was snuck through in the middle of the night, opposition is increasing all the time, but we cannot relent for a moment: Silicon Valley is at it again and wants to hang an AI safe harbor in the lobbyists’ hunting lodge, right next to the DMCA, Section 230, and Title I of the Music Modernization Act.

State-Level Opposition

A bipartisan group of 40 state attorneys general has voiced strong opposition to the AI regulation moratorium through NAAG. In a letter to Congress, they emphasized that the moratorium would disrupt hundreds of measures, both those under consideration by state legislatures and those already passed in states led by Republicans and Democrats alike. They argue that, in the absence of comprehensive federal AI legislation, states have been at the forefront of protecting consumers.

Organizational Opposition

Beyond state officials, a coalition of 141 organizations—including unions, advocacy groups, non-profits, and academic institutions—has expressed alarm over the proposed safe harbor moratorium. In a letter to Congressional leaders, they warned that the provision could lead to unfettered abuse of AI technologies, undermining critical safeguards such as civil rights protections, privacy standards, and accountability for harmful AI applications.

Notable organizations opposing the moratorium include:

  • Alphabet Workers Union
  • Amazon Employees for Climate Justice
  • Mozilla
  • American Federation of Teachers
  • Center for Democracy and Technology 

We don’t often ask you to pick up the phone and call your representative in Congress, but this is one of those times. If you’re not sure who your representatives are, you can go to the House of Representatives website here and look in the upper right-hand corner for this box:

You can also go to the 5calls webpage opposing the safe harbor moratorium which is here. They have developed some collateral and talking points for you to draw on if you like.

This is a big damn deal. Let’s get it done. We’ve all done it before, let’s do it again.

A Long-Overdue Win for Artists: CRB’s Web VI Rates Mark Major Step Toward Fairer @SoundExchange Streaming Royalties

In a landmark development for recording artists, the Copyright Royalty Board (CRB) has proposed new royalty rates under the “Web VI” proceeding, covering the period 2026 through 2030. These rates govern how much commercial broadcasters must pay for streaming sound recordings under the statutory licenses set forth in Sections 112 and 114 of the U.S. Copyright Act.

The new rates reflect the culmination of years of advocacy by SoundExchange and artist-rights groups and represent another meaningful upward adjustment in royalty rates. The Copyright Royalty Judges have adopted a meaningful schedule of increases—both in per-stream royalties and in the minimum annual fees webcasters must pay—designed to better align statutory streaming compensation with market realities. (Unlike streaming mechanical rates, webcasting royalties are a penny rate per play.)

A Clear Victory in Numbers

Year | Web V Per-Performance Rate | Web VI Per-Performance Rate | % Increase Over Web V | Web V Min. Annual Fee | Web VI Min. Annual Fee / % Increase
2026 | $0.0021 | $0.0028 | +33.33% | $1,000 | $1,100 / +10.00%
2027 | $0.0021 | $0.0029 | +38.10% | $1,000 | $1,150 / +15.00%
2028 | $0.0021 | $0.0030 | +42.86% | $1,000 | $1,200 / +20.00%
2029 | $0.0021 | $0.0031 | +47.62% | $1,000 | $1,250 / +25.00%
2030 | $0.0021 | $0.0032 | +52.38% | $1,000 | $1,250 / +25.00%
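The percentage columns are straightforward arithmetic against the flat Web V baseline ($0.0021 per performance, $1,000 minimum annual fee); a quick sketch to verify them (the figures come from the table above, not from the CRB’s own worksheets):

```python
# Verify the Web VI increases against the flat Web V baseline.
WEB_V_RATE = 0.0021      # per-performance rate, all years
WEB_V_MIN_FEE = 1000     # minimum annual fee, all years

web_vi = {
    2026: (0.0028, 1100),
    2027: (0.0029, 1150),
    2028: (0.0030, 1200),
    2029: (0.0031, 1250),
    2030: (0.0032, 1250),
}

for year, (rate, min_fee) in web_vi.items():
    rate_pct = (rate - WEB_V_RATE) / WEB_V_RATE * 100
    fee_pct = (min_fee - WEB_V_MIN_FEE) / WEB_V_MIN_FEE * 100
    print(f"{year}: rate +{rate_pct:.2f}%, min fee +{fee_pct:.2f}%")
# 2026: rate +33.33%, min fee +10.00%
# ...
# 2030: rate +52.38%, min fee +25.00%
```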

These increases aren’t merely arithmetic; they represent a philosophical shift in how creators are valued in the digital economy.

Structural Adjustments

Beyond the rate hikes, the CRB has adopted operational changes proposed by SoundExchange to royalty reporting and distribution. For example:

– The late fee for audit-based underpayments is reduced from 1.5% to 1.0% per month, capped at 75% of the total underpayment.
– Starting in 2027, webcasters using third-party vendors must obtain transmission and usage data or contractually guarantee its delivery.
– If a commercial broadcaster fails to file a report of use, SoundExchange may now distribute royalties based on proxy data.

These tweaks aim to close loopholes and increase reliability in royalty tracking—critical steps toward a more transparent system.
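To make the late-fee change concrete: the fee accrues at 1.0% of the underpayment per month until it hits the 75% cap. A hypothetical sketch (the dollar figures are invented for illustration, and the flat, non-compounding reading of “per month” is an assumption):

```python
# Hypothetical illustration of the revised audit late fee:
# 1.0% of the underpayment per month, capped at 75% of the underpayment.
def late_fee(underpayment: float, months_late: int) -> float:
    uncapped = 0.01 * months_late * underpayment
    return min(uncapped, 0.75 * underpayment)

print(f"${late_fee(10_000, 12):,.2f}")   # one year late -> $1,200.00
print(f"${late_fee(10_000, 120):,.2f}")  # cap kicks in  -> $7,500.00
```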

The Road Ahead

While the Web VI proposed rule will become final after June 16, 2025, it is already being hailed as a pivotal win by artist advocates. For too long, streaming-era economics have undervalued creators in favor of platforms and intermediaries.

This ruling is a recognition—long overdue and hard-won. When finalized, the clear and easy-to-understand Web VI rates and terms will not only ensure greater compensation for featured and nonfeatured recording artists and rights holders, but also reassert the foundational principle that creators should be paid fairly when their work fuels billion-dollar platforms.

For artists and musicians navigating a shifting industry, the law is finally catching up with the market it governs, and doing so on the side of the creators who drive the business.

Of course, don’t forget that some of the same broadcasters who pay under the statutory license for streaming pay artists nothing for over-the-air terrestrial radio broadcasts of the exact same plays of the exact same records–another reason Congress must finally pass the American Music Fairness Act. That’s why we support the #IRespectMusic campaign and the MusicFirst Coalition. Ask Congress to support musicians here.

The AI Safe Harbor is an Unconstitutional Violation of State Protections for Families and Consumers

By Chris Castle

The AI safe harbor slathered onto President Trump’s “big beautiful bill” is layered with intended consequences. Not the least of these is the effect on TikTok.

One of the more debased aspects of TikTok (and that’s a long list) is its promotion, through its AI-driven algorithms, of clearly risky behavior to its pre-teen audience. Don’t forget: TikTok’s algorithm is not just any algorithm. The Chinese government claims it as a state secret. And when the CCP claims a state secret, they ain’t playing. So keep that in mind.

One particularly depraved example of this algorithmic promotion was the “Blackout Challenge.” The TikTok “blackout challenge” has been linked to the deaths of at least 20 children over an 18-month period. One of those children was Nylah Anderson. Nylah’s mom sued TikTok on behalf of her daughter, because that’s what moms do. If you’ve ever had someone you love hang themselves, you will no doubt agree that you live with that memory every day of your life. This unspeakable tragedy will haunt Nylah’s mother forever.

Even lowlifes like TikTok should have settled this case and it should never have gotten in front of a judge. But no–TikTok tried to get out of it because Section 230. Yes, that’s right–they killed a child and tried to get out of the responsibility. The District Court ruled that the loathsome Section 230 applied and Nylah’s mom could not pursue her claims. She appealed.

The Third Circuit Court of Appeals reversed and remanded, concluding that “Section 230 immunizes only information ‘provided by another’” and that “here, because the information that forms the basis of Anderson’s lawsuit—i.e., TikTok’s recommendations via its FYP algorithm—is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims.”

So…a new federal proposal threatens to slam the door on these legal efforts: the 10-year artificial intelligence (AI) safe harbor recently introduced in the House Energy and Commerce Committee. If enacted, this safe harbor would preempt state regulation of AI systems—including the very algorithms and recommendation engines that Nylah’s mom and other families are trying to challenge. 

Section 43201(c) of the “Big Beautiful Bill” includes pork, Silicon Valley style, entitled the “Artificial Intelligence and Information Technology Modernization Initiative: Moratorium,” which states:

no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.

The “Initiative” also appropriates “$500,000,000, to remain available until September 30, 2035, to modernize and secure Federal information technology systems through the deployment of commercial artificial intelligence, the deployment of automation technologies, and the replacement of antiquated business systems….” So not only did Big Tech write themselves a safe harbor for their crimes, they are also taking $500,000,000 of corporate welfare to underwrite it, courtesy of the very taxpayers they are screwing over. Step aside, Sophocles: when it comes to tragic flaws, Oedipus Rex has got nothing on these characters.

Platforms like TikTok, YouTube, and Instagram use AI-based recommendation engines to personalize and optimize content delivery. These systems decide what users see based on a combination of behavioral data, engagement metrics, and predictive algorithms. While effective for keeping users engaged, these AI systems have been implicated in promoting harmful content—ranging from pro-suicide material to dangerous ‘challenges’ that have directly resulted in injury or death.

Families across the country have sued these companies, alleging that the AI-driven algorithms knowingly promoted hazardous content to vulnerable users. In many cases, the claims are based on state consumer protection laws, negligence, or wrongful death statutes. Plaintiffs argue that the companies failed in their duty to design safe systems or to warn users about foreseeable dangers. These cases are not attacks on free speech or user-generated content; they focus specifically on the design and operation of proprietary AI systems. 

If you don’t think that these platforms are depraved enough to actually raise safe harbor defenses, just remember what they did to Nylah’s mom–raised the exceptionally depraved Section 230 as a defense to their responsibility in the death of a child.

The AI safe harbor would prohibit states from enacting or enforcing any law that regulates AI systems or automated decision-making technologies for the next 10 years. This sweeping language could easily be interpreted to cover civil liability statutes that hold platforms accountable for the harms their AI systems cause. This is actually even worse than the vile Section 230–the safe harbor would be expressly targeting actual state laws. Maybe after all the appeals, say 20 years from now, we’ll find out that the AI safe harbor is unconstitutional commandeering, but do we really want to wait to find out?

Because these wrongful death lawsuits rely on arguments that an AI algorithm caused harm—either through its design or its predictive content delivery—the companies could argue that the moratorium shields them from liability. They might claim that the state tort claims are an attempt to “regulate” AI in violation of the federal preemption clause. If courts agree, these lawsuits could be dismissed before ever reaching a jury.

This would create a stunning form of corporate immunity even beyond the many current safe harbors for Big Tech: tech companies would be free to deploy powerful, profit-driven AI systems with no accountability in state courts, even when those systems lead directly to preventable deaths. 

The safe harbor would be especially devastating for families who have already suffered tragic losses and are seeking justice. These families rely on state wrongful death laws to hold powerful platforms accountable. Removing that path to accountability would not only deny them closure, but also prevent public scrutiny of the algorithms at the center of these tragedies.

States have long held the authority to define standards of care and impose civil liability for harms caused by negligence or defective products. The moratorium undermines this traditional role by barring states from addressing the specific risks posed by AI systems, even in the context of established tort principles. It would represent one of the broadest federal preemptions of state law in modern history—in the absence of federal regulation of AI platforms.

• In Pennsylvania, the parents of a teenager who committed suicide alleged that Instagram’s algorithmic feed trapped their child in a cycle of depressive content.
• Multiple lawsuits filed under consumer protection and negligence statutes in states like New Jersey, Florida, and Texas seek to hold platforms accountable for designing algorithms that systematically prioritize engagement over safety.
• TikTok faced consolidated class action claims in multidistrict litigation alleging it illegally harvested user information from its in-app browser.

All such suits could be in jeopardy if courts interpret the AI moratorium as barring state laws that impose liability on algorithm-driven systems, and you can bet that Big Tech platforms will litigate the bejeezus out of the issue. Even if the moratorium were not intended to block wrongful death and other state law claims, its language may be broad enough to do so in practice—especially when leveraged by well-funded corporate legal teams.

Even supporters of federal AI regulation should be alarmed by the breadth of this safe harbor. It is not a thoughtful national framework based on a full record, but a shoot-from-the-hip blanket prohibition on consumer protection and civil justice. By freezing all state-level responses to AI harms, the AI safe harbor would consolidate power in the hands of federal bureaucrats and corporate lobbyists, leaving ordinary Americans with fewer options for recourse, not to mention that it clearly violates state police powers and the 10th Amendment.

To add insult to injury, the use of reconciliation to pass this policy—without full hearings, bipartisan debate, or robust public input—only underscores the cynical nature of the strategy. It has nothing to do with the budget aside from the fact that Big Tech is snarfing down $500 million of taxpayer money for no good reason just so they can argue their land grab is “germane” to shoehorn it into reconciliation under the Byrd Rule. It’s a maneuver designed to avoid scrutiny and silence dissent, not to foster a responsible or democratic conversation about how AI should be governed.

At its core, the AI safe harbor is not about fostering innovation—it is about shielding tech platforms from accountability just like the DMCA, Section 230 and Title I of the Music Modernization Act. By preempting state regulation, it could block families from using long-standing wrongful death statutes to seek justice for the loss of their children and laws protecting Americans from other harms. It undermines the sovereignty of states, the dignity of grieving families, and the public’s ability to scrutinize the AI systems that increasingly shape our lives. 

Congress must reject this overreach, and the American public must remain vigilant in demanding transparency, accountability, and justice. The Initiative must go.

[A version of this post first appeared on MusicTechPolicy]

@ArtistRights Institute Newsletter 5/5/25

The Artist Rights Watch podcast returns for another season! This week’s episode features Chris Castle on An Artist’s Guide to Record Releases Part 2. Download it here or subscribe wherever you get your audio podcasts.

New Survey for Songwriters: We are surveying songwriters about whether they want to form a certified union. Please fill out our short Survey Monkey confidential survey here! Thanks!

Texas Scalpers Bill of Rights Legislation

Can this Texas House bill help curb high ticket prices? Depends whom you ask (Marcheta Fornoff/KERA News)

Texas lawmakers target ticket fees and resale restrictions in new legislative push (Abigail Velez/CBS Austin)

@ArtistRights Institute opposes Texas Ticketing Legislation the “Scalpers’ Bill of Rights” (Chris Castle/Artist Rights Watch)

Streaming

Spotify’s Earnings Point to a “Catch Up” on Songwriter Royalties at CRB for Royalty Justice (Chris Castle/MusicTechPolicy)

Streaming Is Now Just As Crowded With Ads As Old School TV (Rick Porter/Hollywood Reporter)

Spotify Stock Falls On Music Streamer’s Mixed Q1 Report (Patrick Seitz/Investors Business Daily)

Economy

The Slowdown at Ports Is a Warning of Rough Economic Seas Ahead (Aarian Marshall/Wired)

What To Expect From Wednesday’s Federal Reserve Meeting (Diccon Hyatt/Investopedia)

Spotify Q1 2025 Earnings Call: Daniel Ek Talks Growth, Pricing, Superfan Products, And A Future Where The Platform Could Reach 1bn Subscribers (Murray Stassen/Music Business Worldwide)

Artist Rights and AI

SAG-AFTRA National Board Approves Commercials Contracts That Prevent AI, Digital Replicas Without Consent (JD Knapp/The Wrap)

Generative AI providers see first steps for EU code of practice on content labels (Luca Bertuzzi/Mlex)

A Judge Says Meta’s AI Copyright Case Is About ‘the Next Taylor Swift’ (Kate Knibbs/Wired)

Antitrust

Google faces September trial on ad tech antitrust remedies (David Shepardson and Jody Godoy/Reuters)

TikTok

Ireland fines TikTok 530 million euros for sending EU user data to China (Ryan Browne/CNBC)

@ArtistRights Institute opposes Texas Ticketing Legislation the “Scalpers’ Bill of Rights”

By Chris Castle

Coming soon to a state house near you, it looks like the StubHubs and SeatGeeks of this world are at it again. Readers will remember the “Trouble with Ticketing” panel at the Artist Rights Symposium last year and our discussion of the model “Scalpers’ Bill of Rights” that had been introduced at ALEC shortly before the panel convened.

A quick update: the “model” bill was so bad it couldn’t even get support at ALEC, which is saying something. However, the very same bill has shown up and been introduced in both the Texas and North Carolina state legislatures. I posted about it on MusicTechPolicy here.

The Texas House bill (HB 3621) is up for a hearing tomorrow. If you live in Texas you can comment and show up for public comments at the Legislature:

Submit Written Testimony (must be a Texas resident):
• Submit here: https://comments.house.texas.gov/home?c=c473
• Select HB 3621 by Bumgarner
• Keep comments under 3,000 characters

Testify In Person at the Capitol in Austin:
• Hearing Date: Wednesday, April 23 at 8:00 AM CT
• Location: Room E2.014, Texas Capitol
• Register here: https://house.texas.gov/committees/witness-registration
• You must create an account in advance: https://hwrspublicprofile.house.texas.gov/CreateAccount.aspx

ARI has submitted written comments through the Texas House comment portal, but we’re also sending the letter below to the committee so that we can add the color commentary and spin out the whole sordid tale of how this bill came to exist.

Can RICO Be Far Behind?  President Trump and Kid Rock Announce Whole of Government Enforcement of the BOTS Act

By Chris Castle

Yes, the sound you hear echoing from Silicon Valley is the sound of gnashing teeth and rending garments—some freaking guitar player did an end run around Big Tech’s brutal lobbying power and got to the President of the United States.  Don’t you just hate it when that happens?  Maybe not, but trust me, they really hate it because in those dark hours they don’t talk about at parties, they really hate us and think we are beneath them.  Remember that when you deal with YouTube and Spotify.

But to no avail.  President Trump signed an executive order yesterday that can only be described as taking a whole of government approach to enforcement of the BOTS Act.  As readers will recall, I have long said when it comes to StubHub, SeatGeek and their ilk, no bots, no billionaires.  It is hard to imagine a world where StubHub & Co.  are not basing their entire business model on the use of bots and other automated processes to snarf up tickets before the fans can get them.  This was also the subject of our ticketing panel at the 2024 Artist Rights Symposium in Washington, DC.

Remember, the BOTS Act, sponsored by Senator Marsha Blackburn and signed into law by President Obama in 2016, was designed to curb the use of automated software (bots) that purchase large quantities of event tickets, often within seconds of their release, to resell them at inflated prices through market makers like StubHub. It was so under-enforced that until the Executive Order it was entirely possible that StubHub could have sneaked out an IPO to slurp up money from the public trough before anyone knew better.

The government’s enforcement of the BOTS Act has been so poor that Senator Blackburn found it necessary to introduce even more legislation to try to get the FTC to do its job. The Mitigating Automated Internet Networks for Event Ticketing (MAIN Event Ticketing) Act, introduced in 2023 by Senators Blackburn and Ben Ray Luján, aims to give the FTC even fewer excuses not to enforce the BOTS Act. It would further the FTC’s consumer protection mission against IPO-driven ticket scalping.

There are entire business lines built around furthering illegal ticket scalping that are so blatant they actually hold trade shows.  For example, NITO (the National Independent Talent Organization) complained to the FTC that its investigators found multiple software platforms on the trade show floor at a ticket brokers conference that are illegal under the BOTS Act and possibly under other laws, such as Treasury Department regulations and statutes covering financial crimes, wire fraud, and the like.

The NITO FTC complaint details how multiple technology companies, many of whom exhibited at World Ticket Conference hosted by The National Association of Ticket Brokers in Nashville on July 24-26, 2024, provide tools that enable scalpers to circumvent ticket purchasing limits. These tools include sophisticated browser extensions, proxy services, and virtual credit card platforms designed to bypass security measures implemented by primary ticket sellers.

As mentioned in the Executive Order, the sad truth is that the FTC didn’t take its first action to enforce the 2016 law until 2021, and that is the only action it has ever taken.  That is why President Trump’s executive order is so critical to stopping these scoundrels.

Kid Rock apparently had a chance to present these issues to President Trump and was present at the signing ceremony for the Executive Order. He said:

First off thank you Mr. President because this has happened at lightning speed.  I want to make sure Alina Habba gets her credit too because I know she worked very hard in this but thank you for making this happen so quick.

Anyone who’s bought a concert ticket in the last decade, maybe 20 years, no matter what your politics are knows it is a conundrum.  You buy a ticket for $100 but by the time you check out it’s $170.  You don’t know what you were charged for, but more importantly these bots come in and get all the good tickets to your favorite shows you want to go to.  Then they’re relisted immediately for sometimes a four or five hundred percent markup—the artists don’t get that money!

Ultimately I think this is a great first step. I would love it if, down the road, there were some legislation that could actually put a cap on the resale of tickets.

Yes, folks, we may be onto something here.

The reason I say that the EO establishes a “whole of government” approach is because of what else is in the order.  The actual EO was published, and the press release on the White House site says this:

  • The Order directs the Federal Trade Commission (FTC) to:
    • Work with the Attorney General to ensure that competition laws are appropriately enforced in the concert and entertainment industry.
    • Rigorously enforce the Better Online Ticket Sales (BOTS) Act and promote its enforcement by state consumer protection authorities.
    • Ensure price transparency at all stages of the ticket-purchase process, including the secondary ticketing market.
    • Evaluate and, if appropriate, take enforcement action to prevent unfair, deceptive, and anti-competitive conduct in the secondary ticketing market.
  • The Order directs the Secretary of the Treasury and Attorney General to ensure that ticket scalpers are operating in full compliance with the Internal Revenue Code and other applicable law.
  • Treasury, the Department of Justice, and the FTC will also deliver a report within 180 days summarizing actions taken to address the issue of unfair practices in the live concert and entertainment industry and recommend additional regulations or legislation needed to protect consumers in this industry.

In other words, the EO directs other Executive Branch agencies (the DOJ, the FTC, and Treasury) to take enforcement seriously.  If the Department of Justice is involved, that could very well lead to enforcement of the BOTS Act’s criminal penalties.  And it’s kind of hard to have a StubHub IPO from prison, although President Trump may want to add the Securities and Exchange Commission to the list of agencies he is calling into action.

In addition to fines, individuals convicted under the BOTS Act could face imprisonment for up to one year for a first offense. Repeat offenders may face longer prison sentences, depending on the nature of the violation and whether aggravating factors are involved (such as fraud or large-scale operations).  And remember, wire fraud is a common predicate under the RICO racketeering laws, which is where I personally think this whole situation needs to go, and go quickly. Remember, StubHub already narrowly escaped a civil RICO claim.

So we shall see who is serious and who isn’t.  But I will say I’m hopeful. If you wanted to seriously go after actually solving the problem on the law enforcement side, this is how you would do it.

If you wanted to go after it on the property rights side, Kid Rock’s line about establishing a cap is how you would start.  The guy has clearly thought this through, and we’re lucky that he has.  We’ll get around to speculative ticketing and taking out some of the other trash down the road, if that’s even still a problem after going after bots.  But on property rights, let’s start with respecting artists’ right to set their own prices and have those prices honored, instead of the current catastrophe.

The other takeaway from this is that Marsha was right—the BOTS Act is probably enough law to handle the problem.  You just need to enforce it.

I always say you can’t get Silicon Valley to behave with fines alone because they print money due to the income transfer.  Prison, though, prison is the key that picks the lock.

[Editor Charlie sez: This post first appeared on MusicTechPolicy]

@Artist Rights Institute Newsletter 3/31/25

The Artist Rights Institute’s news digest Newsletter from Artist Rights Watch.

New Survey for Songwriters: We are surveying songwriters about whether they want to form a certified union. Please fill out our short Survey Monkey confidential survey here! Thanks!

Ticketing

Executive Order on Combating Unfair Practices in the Live Entertainment Market

Music Industry reacts to Executive Order on Ticket Scalping (Bruce Houghton/Hypebot)

What Hath Trump Wrought: The Effect of the Anti-Scalping Executive Order on StubHub’s IPO (Chris Castle/MusicTech.Solutions)

StubHub IPO Filing

Copyright Litigation

Merlin sues TikTok rival Triller for breach of contract over allegedly unpaid music licensing fees (Daniel Tencer/Music Business Worldwide)

Artificial Intelligence: Legislation

Artificial intelligence firms should pay artists and musicians for using their work amid uproar over Labour’s plans to exempt them from copyright laws, according to a new poll of Brits (Chris Pollard/Daily Mail)

European Union’s latest draft AI Code of Practice renders copyright ‘meaningless,’ rightsholders warn (Mandy Dalugdug/Music Business Worldwide)

Artificial Intelligence
The Style Returns: Some notes on ChatGPT and Studio Ghibli
 (Andres Guadamuz/TechnoLlama) 

OpenAI’s Preemption Request Highlights State Laws’ Downsides (Oliver Roberts/Bloomberg Law)

Copyright: Termination Rights

Update on Vetter v. Resnik case (Chris Castle/MusicTechPolicy)