FTC Cracks Down on Ticket Scalpers in Major BOTS Act Enforcement

The wheels of justice turn slowly, but they do turn.

In what appears to be a response to NITO’s complaint filed last year with the FTC, to pressure from Senator Marsha Blackburn, and to President Trump’s executive order on ticket scalping, Hypebot reports that the Federal Trade Commission is going after large-scale ticket resellers for violating the Better Online Ticket Sales (BOTS) Act (authored by Senators Blackburn and Richard Blumenthal).

The enforcement action seeks tens of millions of dollars in damages and signals that federal regulators are finally prepared to tackle the systemic abuse of automated tools and deceptive practices in the live event ticketing market.

According to Hypebot, the FTC alleges that the companies used bots and a web of pseudonymous accounts to bypass ticket purchasing limits—snagging prime seats to high-demand concerts and reselling them at inflated prices on platforms like StubHub and SeatGeek. The case represents one of the largest BOTS Act enforcement efforts to date. 

“The FTC is finally doing what artists, managers, and fans have been asking for: holding scalpers accountable,” said Randy Nichols, artist manager for Underoath and advocate for ticketing reform. “This sends a message to bad actors that the days of unchecked resale are numbered.”

As Hypebot reports, this enforcement may just be the beginning. The case is likely to test the limits of the BOTS Act and could set new precedent for what counts as deceptive or unfair conduct in the ticket resale market—even when bots aren’t directly involved.

Read the full story via Hypebot: FTC Goes After Ticket Scalpers, Seeks Tens of Millions in Damages

United for Artists’ Rights: Amicus Briefs Filed in Vetter v. Resnik Support Global Copyright Termination for Songwriters and Authors: Brief by the National Society of Entertainment & Arts Lawyers

In Vetter v. Resnik, songwriter Cyril Vetter won his trial case in Baton Rouge, allowing him to recover worldwide rights in his song “Double Shot of My Baby’s Love” after serving his 35-year termination notice on his former publisher, Resnik Music Group. The publisher appealed, and the Fifth Circuit Court of Appeals is now weighing whether U.S. copyright termination rights include “foreign” territories—a question that strikes at the heart of artists’ ability to reclaim their work worldwide (whatever “foreign” means).

Cyril’s attorney Tim Kappel explains the case if you need an explainer:

An astonishing number of friend-of-the-court briefs were filed by songwriter groups. We’re going to post them all, and today’s brief is by the National Society of Arts & Entertainment Lawyers. The brief argues that the Copyright Act’s plain text and legislative history support a unified, comprehensive termination right that revokes all rights granted in a prior transfer, regardless of geographic scope. It rejects the notion of a “multiverse” of national copyrights, citing international treaties like the Berne Convention and longstanding U.S. policy favoring artist protection. Limiting terminations to U.S. territory, the brief warns, would gut the statute’s purpose, harm artists, and impose impossible burdens on creators seeking to reclaim their rights.

We believe the answer on appeal must be yes: affirm the District Court’s well-reasoned decision. Congress gave creators and their heirs the right to a “second bite at the apple” to regain control of their work after decades, and that promise means little if global rights are excluded. The outcome of this case could either reaffirm that promise—or open the door for multinational publishers to sidestep it entirely.

That’s why we’re sharing friend-of-the-court briefs from across the creative communities. Each one brings a different perspective—but all defend the principle that artists deserve a real, global right to take back what’s theirs, because, as Chris said, Congress did not give authors a second bite at half the apple.

Read the brief below, and watch this space for case updates.

The AI Safe Harbor is an Unconstitutional Violation of State Protections for Families and Consumers

By Chris Castle

The AI safe harbor slathered onto President Trump’s “big beautiful bill” is layered with intended consequences. Not the least of these is the effect on TikTok.

One of the more debased aspects of TikTok (and that’s a long list) is their promotion, through their AI-driven algorithms, of clearly risky behavior to their pre-teen audience. Don’t forget: TikTok’s algorithm is not just any algorithm. The Chinese government claims it as a state secret. And when the CCP claims a state secret they ain’t playing. So keep that in mind.

One of the risky trends the algorithm promoted that was particularly depraved was the “Blackout Challenge.” The TikTok “blackout challenge” has been linked to the deaths of at least 20 children over an 18-month period. One of the dead children was Nylah Anderson. Nylah’s mom sued TikTok for her daughter because that’s what moms do. If you’ve ever had someone you love hang themselves, you will no doubt agree that you live with that memory every day of your life. This unspeakable tragedy will haunt Nylah’s mother forever.

Even lowlifes like TikTok should have settled this case and it should never have gotten in front of a judge. But no–TikTok tried to get out of it because Section 230. Yes, that’s right–they killed a child and tried to get out of the responsibility. The District Court ruled that the loathsome Section 230 applied and Nylah’s mom could not pursue her claims. She appealed.

The Third Circuit Court of Appeals reversed and remanded, concluding that “Section 230 immunizes only information ‘provided by another’” and that “here, because the information that forms the basis of Anderson’s lawsuit—i.e., TikTok’s recommendations via its FYP algorithm—is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims.”

So…a new federal proposal threatens to slam the door on these legal efforts: the 10-year artificial intelligence (AI) safe harbor recently introduced in the House Energy and Commerce Committee. If enacted, this safe harbor would preempt state regulation of AI systems—including the very algorithms and recommendation engines that Nylah’s mom and other families are trying to challenge. 

Section 43201(c) of the “Big Beautiful Bill” includes pork, Silicon Valley style, entitled the “Artificial Intelligence and Information Technology Modernization Initiative: Moratorium,” which states:

no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.

The “Initiative” also appropriates “$500,000,000, to remain available until September 30, 2035, to modernize and secure Federal information technology systems through the deployment of commercial artificial intelligence, the deployment of automation technologies, and the replacement of antiquated business systems….” So not only did Big Tech write themselves a safe harbor for their crimes, they are also taking $500,000,000 of corporate welfare to underwrite it, courtesy of the very taxpayers they are screwing over. Step aside, Sophocles: when it comes to tragic flaws, Oedipus Rex got nothing on these characters.

Platforms like TikTok, YouTube, and Instagram use AI-based recommendation engines to personalize and optimize content delivery. These systems decide what users see based on a combination of behavioral data, engagement metrics, and predictive algorithms. While effective for keeping users engaged, these AI systems have been implicated in promoting harmful content—ranging from pro-suicide material to dangerous ‘challenges’ that have directly resulted in injury or death.
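To make the design point concrete, here is a minimal, purely illustrative sketch (not any platform’s actual code; the names, signals, and weights are invented) of an engagement-only ranking function. The problem families describe is visible in the objective itself: nothing about safety ever enters the score.

```python
# Hypothetical sketch only -- not any platform's real code. It shows how a
# recommender that optimizes purely for predicted engagement will surface
# risky content if that content is predicted to hold attention.

from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_time: float   # seconds the model expects the user to watch
    predicted_shares: float       # expected shares/likes signal
    is_flagged_risky: bool        # known safety flag -- never used in the ranking below

def engagement_score(c: Candidate) -> float:
    # Engagement-only objective: watch time plus a bonus for viral signals.
    return c.predicted_watch_time + 10.0 * c.predicted_shares

def rank_feed(candidates: list[Candidate]) -> list[Candidate]:
    # Highest predicted engagement first; the safety flag never enters the sort key.
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Candidate("cooking_tutorial", 45.0, 0.2, False),
        Candidate("dangerous_challenge", 120.0, 3.5, True),
    ])
    print([c.video_id for c in feed])  # the risky clip ranks first
```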

Families across the country have sued these companies, alleging that the AI-driven algorithms knowingly promoted hazardous content to vulnerable users. In many cases, the claims are based on state consumer protection laws, negligence, or wrongful death statutes. Plaintiffs argue that the companies failed in their duty to design safe systems or to warn users about foreseeable dangers. These cases are not attacks on free speech or user-generated content; they focus specifically on the design and operation of proprietary AI systems. 

If you don’t think that these platforms are depraved enough to actually raise safe harbor defenses, just remember what they did to Nylah’s mom–raised the exceptionally depraved Section 230 as a defense to their responsibility in the death of a child.

The AI safe harbor would prohibit states from enacting or enforcing any law that regulates AI systems or automated decision-making technologies for the next 10 years. This sweeping language could easily be interpreted to cover civil liability statutes that hold platforms accountable for the harms their AI systems cause. This is actually even worse than the vile Section 230–the safe harbor would be expressly targeting actual state laws. Maybe after all the appeals, say 20 years from now, we’ll find out that the AI safe harbor is unconstitutional commandeering, but do we really want to wait to find out?

Because these wrongful death lawsuits rely on arguments that an AI algorithm caused harm—either through its design or its predictive content delivery—the companies could argue that the moratorium shields them from liability. They might claim that the state tort claims are an attempt to “regulate” AI in violation of the federal preemption clause. If courts agree, these lawsuits could be dismissed before ever reaching a jury.

This would create a stunning form of corporate immunity even beyond the many current safe harbors for Big Tech: tech companies would be free to deploy powerful, profit-driven AI systems with no accountability in state courts, even when those systems lead directly to preventable deaths. 

The safe harbor would be especially devastating for families who have already suffered tragic losses and are seeking justice. These families rely on state wrongful death laws to hold powerful platforms accountable. Removing that path to accountability would not only deny them closure, but also prevent public scrutiny of the algorithms at the center of these tragedies.

States have long held the authority to define standards of care and impose civil liability for harms caused by negligence or defective products. The moratorium undermines this traditional role by barring states from addressing the specific risks posed by AI systems, even in the context of established tort principles. It would represent one of the broadest federal preemptions of state law in modern history—in the absence of federal regulation of AI platforms.

• In Pennsylvania, the parents of a teenager who committed suicide alleged that Instagram’s algorithmic feed trapped their child in a cycle of depressive content.
• Multiple lawsuits filed under consumer protection and negligence statutes in states like New Jersey, Florida, and Texas seek to hold platforms accountable for designing algorithms that systematically prioritize engagement over safety.
• TikTok faced multidistrict class action litigation claiming that it illegally harvested user information from its in-app browser.

All such suits could be in jeopardy if courts interpret the AI moratorium as barring state laws that impose liability on algorithm-driven systems, and you can bet that Big Tech platforms will litigate the bejeezus out of the issue. Even if the moratorium were not intended to block wrongful death and other state law claims, its language may be broad enough to do so in practice—especially when leveraged by well-funded corporate legal teams.

Even supporters of federal AI regulation should be alarmed by the breadth of this safe harbor. It is not a thoughtful national framework based on a full record, but a shoot-from-the-hip blanket prohibition on consumer protection and civil justice. By freezing all state-level responses to AI harms, the AI safe harbor would consolidate power in the hands of federal bureaucrats and corporate lobbyists, leaving ordinary Americans with fewer options for recourse, not to mention that it is a clear violation of state police powers and the 10th Amendment.

To add insult to injury, the use of reconciliation to pass this policy—without full hearings, bipartisan debate, or robust public input—only underscores the cynical nature of the strategy. It has nothing to do with the budget aside from the fact that Big Tech is snarfing down $500 million of taxpayer money for no good reason just so they can argue their land grab is “germane” to shoehorn it into reconciliation under the Byrd Rule. It’s a maneuver designed to avoid scrutiny and silence dissent, not to foster a responsible or democratic conversation about how AI should be governed.

At its core, the AI safe harbor is not about fostering innovation—it is about shielding tech platforms from accountability just like the DMCA, Section 230 and Title I of the Music Modernization Act. By preempting state regulation, it could block families from using long-standing wrongful death statutes to seek justice for the loss of their children and laws protecting Americans from other harms. It undermines the sovereignty of states, the dignity of grieving families, and the public’s ability to scrutinize the AI systems that increasingly shape our lives. 

Congress must reject this overreach, and the American public must remain vigilant in demanding transparency, accountability, and justice. The Initiative must go.

[A version of this post first appeared on MusicTechPolicy]

@ArtistRights Institute opposes Texas Ticketing Legislation, the “Scalpers’ Bill of Rights”

By Chris Castle

Coming soon to a state house near you, it looks like the StubHubs and SeatGeeks of this world are at it again. Readers will remember the “Trouble with Ticketing” panel at the Artist Rights Symposium last year and our discussion of the model “Scalpers’ Bill of Rights” that had been introduced at ALEC shortly before the panel convened.

A quick update: the “model” bill was so bad it couldn’t even get support at ALEC, which is saying something. However, the very same bill has shown up and been introduced in both the Texas and North Carolina state legislatures. I posted about it on MusicTechPolicy here.

The Texas House bill (HB 3621) is up for a hearing tomorrow. If you live in Texas you can comment and show up for public comments at the Legislature:

Submit Written Testimony (must be a Texas resident):
• Submit here: https://comments.house.texas.gov/home?c=c473
• Select HB 3621 by Bumgarner
• Keep comments under 3,000 characters

Testify In Person at the Capitol in Austin:
• Hearing Date: Wednesday, April 23 at 8:00 AM CT
• Location: Room E2.014, Texas Capitol
• Register here: https://house.texas.gov/committees/witness-registration
• You must create an account in advance: https://hwrspublicprofile.house.texas.gov/CreateAccount.aspx

ARI has submitted written comments through the Texas House comment portal, but we’re also sending the letter below to the committee so that we can add the color commentary and spin out the whole sordid tale of how this bill came to exist.

@ArtistRights Newsletter 4/14/25

The Artist Rights Watch podcast returns for another season! This week’s episode, “AI Legislation: A View from Europe,” features Helienne Lindvall, President of the European Composer and Songwriter Alliance (ECSA), and ARI Director Chris Castle in conversation about current issues for creators under the EU AI Act and the UK Text and Data Mining legislation. Download it here or subscribe wherever you get your audio podcasts.

New Survey for Songwriters: We are surveying songwriters about whether they want to form a certified union. Please fill out our short Survey Monkey confidential survey here! Thanks!

AI Litigation: Kadrey v. Meta

Law Professors Reject Meta’s Fair Use Defense in Friend of the Court Brief

Ticketing
Viagogo failing to prevent potentially unlawful practices, listings on resale site suggest that scalpers are speculatively selling tickets they do not yet have (Rob Davies/The Guardian)

ALEC Astroturf Ticketing Bill Surfaces in North Carolina Legislation

ALEC Ticketing Bill Surfaces in Texas to Rip Off Texas Artists (Chris Castle/MusicTechPolicy)

International AI Legislation

Brazil’s AI Act: A New Era of AI Regulation (Daniela Atanasovska and Lejla Robeli/GDPR Local)

Why robots.txt won’t get it done for AI Opt Outs (Chris Castle/MusicTechPolicy)

Feature Translation: How has the West’s misjudgment of China’s AI ecosystem distorted the global technology competition landscape? (Jeffrey Ding/ChinAI)

Unethical AI Training Harms Creators and Society, Argues AI Pioneer (Ed Nawotka/Publishers Weekly) 

AI Ethics

Céline Dion Calls Out AI-Generated Music Claiming to Feature the Iconic Singer Without Her Permission (Marina Watts/People)

Splice CEO Discusses Ethical Boundaries of AI in Music (Nilay Patel/The Verge)

Spotify’s Bold AI Gamble Could Disrupt The Entire Music Industry (Bernard Marr/Forbes)

Books

Apple in China: The Capture of the World’s Greatest Company by Patrick McGee (Coming May 13)

Spotify Makes Kate Nash’s Argument With the Usual Blame Game

Daniel Ek is indifferent to whether the economics of streaming causes artists to give up or actually starve to actual death. He’s already got the tracks and he’ll keep selling them forever like an evil self-licking ice cream cone.

Kate Nash is the latest artist to slam Spotify’s pathetic royalty payments even after the payola and the streaming manipulation with the Orwellian “Discovery Mode” as discovered by Liz Pelly. According to Digital Music News, Kate Nash says: 

“‘Foundations’ has over 100 million plays on Spotify — and I’m shocked I’m not a millionaire when I hear that! I’m shocked at the state of the music industry and how the industry has allowed this to happen,” said Nash. “We’re paid very, very, very poorly and unethically for our recorded music: it’s like 0.003 of a penny per stream. I think we should not only be paid fairly, but we should be paid very well. People love music and it’s a growing economy and there are plenty of millionaires in the industry because of that, and our music.”

But then she said the quiet part out loud that will get them right in their Portlandia hearts:

She added: “And what they’re saying to artists from non-rich privileged backgrounds, which is you’re not welcome here, you can’t do this, we don’t want to hear from you. Because it’s not possible to even imagine having a career if you don’t have a privileged background or a privileged situation right now.”

This, of course, comes at the same time that Spotify board members have cashed out over $1 billion in stock, including hundreds of millions to Daniel Ek personally, speaking of privilege.

Using forks and knives to eat their bacon

Spotify responds with the same old whine that starts with the usual condescending drivel, deflection and distraction:

“We’re huge fans of Kate Nash. For streams of her track ‘Foundations’ alone — which was released before Spotify existed — Spotify has paid out around half a million pounds in revenue to Kate Nash’s rights holders,” reads Spotify’s statement.

“Her most streamed songs were released via Universal Music Group. Spotify has no visibility over the deals that Kate signed with her rights holders. Therefore, we have no knowledge of the payment terms that were agreed upon between her and her partners.”

This is a very carefully worded statement–notice that they switch from the specific to the general and start talking about “her rights holders.” That no doubt means they are including the songwriters and publishers of the compositions, so that’s bullshit for starters. But notice how they are making Kate’s own argument here by trying to get you to focus on the “big check” that they wrote to Universal.

Well, last time I checked in the world of arithmetic, “around half a million pounds” (which means less than, but OK) divided by 100,000,000 streams is…wait for it…shite. £0.005 per stream–at the Universal level but all-in by the sound of it, i.e., artist share, label share, songwriters and publishers. This is why Spotify is making Kate’s argument at the same time they are trying to deflect attention onto Universal.
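For anyone who wants to check the arithmetic, here is the back-of-the-envelope version using Spotify’s own figures (a sketch only; both numbers are approximations taken from the statement and the reported stream count):

```python
# Rough check of Spotify's statement: "around half a million pounds"
# paid out against roughly 100 million streams of "Foundations".
payout_gbp = 500_000
streams = 100_000_000

per_stream = payout_gbp / streams
print(f"£{per_stream:.3f} per stream")             # £0.005 per stream
print(f"{per_stream * 100:.1f} pence per stream")  # 0.5 pence, all-in across rights holders
```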

Then, always with an eye on the DCMS authorities in the UK and the UK Parliament, Spotify says:

“We do know that British artists generated revenues of over £750 million on Spotify alone in 2023 — a number that is on the rise year on year — so it’s disappointing to hear that Spotify’s payments are not making it through to Kate herself,” the company concluded.

Oh, so “disappointed.” Please spare us. What’s disappointing is that the streaming services participate in this charade where their executives make more in one day of stock trading than the company’s entire payments to UK artists and songwriters.

This race to the bottom is not lost on artists. Al Yankovic, a card-carrying member of the pantheon of music parodists from Tom Lehrer to Spinal Tap to The Rutles, released a hysterical video about his “Spotify Wrapped” account.

Al said he’d had 80 million streams and received enough cash from Spotify to buy a $12 sandwich.  This was from an artist who made a decades-long career from—parody.  Remember that–parody.

Do you think he really meant he actually got $12 for 80 million streams?  Or could that have been part of the gallows humor of calling out Spotify Wrapped as a propaganda tool for…Spotify?  Poking fun at the massive camouflage around the Malthusian algebra of streaming royalties gradually choking the life out of artists and songwriters? Gallows humor, indeed, because a lot of artists and especially songwriters are gradually collapsing as the algebra predicted.

The services took the bait Al dangled: they seized on Al’s video poking fun at how ridiculously low Spotify payments are to argue that Al’s sandwich money couldn’t possibly represent 80 million streams, and that if it did, it’s his label’s fault. Just like Spotify is blaming Universal rather than taking responsibility for once in their lives.

Nothing if not on message, right? As Daniel Ek told MusicAlly, “There is a narrative fallacy here, combined with the fact that, obviously, some artists that used to do well in the past may not do well in this future landscape, where you can’t record music once every three to four years and think that’s going to be enough.” This is kind of like TikTok bragging about how few children hanged themselves in the latest blackout challenge compared to the number of all children using the platform. Pretty Malthusian. It’s not a fallacy; it’s all too true.

I’d suggest that Al and Kate Nash were each making the point–if you think of everyday goods, like bacon for example, in terms of how many streams you would have to sell in order to buy a pound of bacon, a dozen eggs, a gallon of gasoline, Internet access, or a sandwich in a nice restaurant, you start to understand that the joke really is on us. The best way to make a small fortune in the streaming business is to start with a large one. Unless you’re a Spotify executive, of course.
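To carry the bacon thought-experiment through, here is an illustrative calculation; the per-stream rate comes from the arithmetic above, while the artist’s share of it and the grocery prices are hypothetical placeholders, not reported figures:

```python
# Illustrative only: how many streams would an artist need to "sell" to buy
# everyday goods? ALL_IN_PER_STREAM is the £0.005 figure derived above;
# ARTIST_SHARE and the prices below are hypothetical placeholders.
ALL_IN_PER_STREAM = 0.005   # £ per stream across all rights holders
ARTIST_SHARE = 0.20         # hypothetical fraction that reaches the featured artist

goods = {
    "pound of bacon": 4.00,
    "dozen eggs": 3.00,
    "gallon of gasoline": 3.50,
    "sandwich in a nice restaurant": 12.00,
}

for item, price in goods.items():
    streams_needed = price / (ALL_IN_PER_STREAM * ARTIST_SHARE)
    print(f"{item}: ~{streams_needed:,.0f} streams")
```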

PRESS RELEASE: @Human_Artistry Campaign Endorses NO FAKES Act to Protect Personhood from AI

For Immediate Release

HUMAN ARTISTRY CAMPAIGN ENDORSES NO FAKES ACT

Bipartisan Bill Reintroduced by Senators Blackburn, Coons, Tillis, & Klobuchar and Representatives Salazar, Dean, Moran, Balint and Colleagues

Creates New Federal Right for Use of Voice and Visual Likeness
in Digital Replicas

Empowers Artists, Voice Actors, and Individual Victims to Fight Back Against
AI Deepfakes and Voice Clones

WASHINGTON, DC (April 9, 2025) – Amid global debate over guardrails needed for AI, the Human Artistry Campaign today announced its support for the reintroduced “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2025” (“NO FAKES Act”) – landmark legislation giving every person an enforceable new federal intellectual property right in their image and voice. 

Building off the original NO FAKES legislation introduced last Congress, the updated bill was reintroduced today by Senators Marsha Blackburn (R-TN), Chris Coons (D-DE), Thom Tillis (R-NC), Amy Klobuchar (D-MN) alongside Representatives María Elvira Salazar (R-FL-27), Madeleine Dean (D-PA-4), Nathaniel Moran (R-TX-1), and Becca Balint (D-VT-At Large) and bipartisan colleagues.

The legislation sets a strong federal baseline protecting all Americans from invasive AI-generated deepfakes flooding digital platforms today. From young students bullied by non-consensual sexually explicit deepfakes to families scammed by voice clones to recording artists and performers replicated to sing or perform in ways they never did, the NO FAKES Act provides powerful remedies requiring platforms to quickly take down unconsented deepfakes and voice clones and allowing rightsholders to seek damages from creators and distributors of AI models designed specifically to create harmful digital replicas.

The legislation’s thoughtful, measured approach preserves existing state causes of action and rights of publicity, including Tennessee’s groundbreaking ELVIS Act. It also contains carefully calibrated exceptions to protect free speech, open discourse and creative storytelling – without trampling the underlying need for real, enforceable protection against the vast range of invasive and harmful deepfakes and voice clones.

Human Artistry Campaign Senior Advisor Dr. Moiya McTier released the following statement in support of the legislation:

“The Human Artistry Campaign stands for preserving essential qualities of all individuals – beginning with a right to their own voice and image. The NO FAKES Act is an important step towards necessary protections that also support free speech and AI development. The Human Artistry Campaign commends Senators Blackburn, Coons, Tillis, and Klobuchar and Representatives Salazar, Dean, Moran, Balint, and their colleagues for shepherding bipartisan support for this landmark legislation, a necessity for every American to have a right to their own identity as highly realistic voice clones and deepfakes become more pervasive.”

Dr. Moiya McTier, Human Artistry Campaign Senior Advisor

By establishing clear rules for the new federal voice and image right, the NO FAKES Act will power innovation and responsible, pro-human uses of powerful AI technologies while providing strong protections for artists, minors and others. This important bill has cross-sector support from Human Artistry Campaign members and companies such as OpenAI, Google, Amazon, Adobe and IBM. The NO FAKES Act is a strong step forward for American leadership that erects clear guardrails for AI and real accountability for those who reject the path of responsibility and consent.

Learn more & let your representatives know Congress should pass NO FAKES Act here.

# # #

ABOUT THE HUMAN ARTISTRY CAMPAIGN: The Human Artistry Campaign is the global initiative for the advancement of responsible AI – working to ensure it develops in ways that strengthen the creative ecosystem, while also respecting and furthering the indispensable value of human artistry to culture. Across 34 countries, more than 180 organizations have united to protect every form of human expression and creative endeavor they represent – journalists, recording artists, photographers, actors, songwriters, composers, publishers, independent record labels, athletes and more. The growing coalition champions seven core principles for keeping human creativity at the center of technological innovation. For further information, please visit humanartistrycampaign.com

@Artist Rights Institute Newsletter 4/7/25

The Artist Rights Institute’s news digest newsletter

The Artist Rights Watch podcast returns for another season!  First episode is Tim Kappel discussing the Vetter v. Resnik landmark copyright termination case. Follow us wherever you get your podcasts.

New Survey for Songwriters: We are surveying songwriters about whether they want to form a certified union. Please fill out our short Survey Monkey confidential survey here! Thanks!

Streaming Meltdown

White Noise Is Hugely Popular on Streaming Services. Should It Be Devalued? (Kristin Robinson/Billboard) (Subscription)

Polly Pockets Strikes Again: DANIEL EK POCKETS ANOTHER $27.6M FROM SELLING SPOTIFY SHARES – CASHING OUT OVER $750M SINCE 2023 (Mandy Dalugdug/MusicBusinessWorldwide)

THY ART IS MURDER Vocalist Quits Over Finances: “I Can’t Live Like This Anymore” (Robert Pasbani/Metal Injection)

AI Litigation

U.S. District Judge Sidney Stein’s order in New York Times et al v. Microsoft, OpenAI et al

NYT v MSFT-OpenAI MTD (Download)

Judge explains order for New York Times in OpenAI copyright case (Blake Brittain/Reuters)

OpenAI, Google reject UK’s AI copyright plan (Joseph Bambridge/Politico EU)

Mechanical Licensing Collective

Shhh…It’s a Secret! How is the MLC “Hedge Fund” Performing in the Global Market Crash (Chris Castle/MusicTechPolicy)

Ticketing

If it Looks Like a Duck and Quacks Like a Duck, Deny Everything: The ALEC Ticketing Bill Surfaces in Texas to Rip Off Artists (Chris Castle/MusicTechPolicy)

Tickets to Beyonce’s ‘Cowboy Carter’ Shows Bottoming Out at $25 In LA, New Jersey (Ashley King/Digital Music News)

TikTok Divestment

TikTok Extended Again (Chris Castle/MusicTech.Solutions)

And After All That, TikTok Could Still Go Poof (Paul Resnikoff/Digital Music News)

Books

Understanding the China Threat by Lianchao Han and Bradley A. Thayer

Brookings experts’ reading list on US-China strategic relations

Global Soft Power Index 2024 by Konrad Jagodzinski/Brand Finance

TikTok Sale Extended…Again

By Chris Castle

Imagine if the original Napster had received TikTok-level attention from POTUS?  Forget I said that.  The ongoing divestment of TikTok from its parent company ByteDance has reached yet another critical point with yet another bandaid.  Congress originally set a January 19, 2025 deadline for ByteDance to either sell TikTok’s U.S. operations or face a potential ban in the United States as part of the Protecting Americans from Foreign Adversary Controlled Applications Act or “PAFACA” (I guess “covfefe” was taken). The US Supreme Court upheld that law in TikTok v. Garland.

When January 20 came around, President Trump gave ByteDance an extension to April 5, 2025 by executive order. When that deadline came, President Trump granted a further extension by another executive order, providing additional time for ByteDance to finalize a deal to divest. The extended deadline now pushes the timeline for divestment negotiations to July 1, 2025.

This new extension is designed to allow for further negotiation time among ByteDance, potential buyers, and regulatory authorities, while addressing the ongoing trade issues and concerns raised by both the U.S. and Chinese governments. 

It’s getting mushy, but I’ll take a stab at the status of the divestment process. I might miss someone as they’re all getting into the act. 

I would point out that all these bids anticipate a major overhaul in how TikTok operates which—just sayin’—means it likely would no longer be TikTok as its hundreds of millions of users now know it. I went down this path with Napster, and I would just say that it’s a very big deal to change a platform that has inherent legal issues into one that satisfies a standard that does not yet exist. I always used the rule of thumb that changing old Napster to new Napster (neither of which had anything to do with the service that eventually launched under the “Napster” brand, which bore no resemblance to the original Napster or its DNA) would result in an initial loss of 90% of the users. Just sayin’.

Offers and Terms

Multiple parties have expressed interest in acquiring TikTok’s U.S. operations, but the terms of these offers remain fluid due to ongoing negotiations and the complexity of the deal. Key bidders include:

ByteDance Investors: According to Reuters, this bid would allow “the biggest non-Chinese investors in parent company ByteDance to up their stakes and acquire the short video app’s U.S. operations.” This would involve Susquehanna International Group, General Atlantic, and KKR. It looks like ByteDance would retain a minority ownership position of less than 20%, which I would bet probably means 19.99999999% or something like that. Reuters describes this as the front runner bid, and I tend to buy into that characterization. From a cap table point of view, this would be the cleanest with the least hocus pocus. However, the Reuters story is based on anonymous sources and doesn’t say how the deal would address the data privacy issues (other than that Oracle would continue to hold the data), or the algorithm. Remember, Oracle has been holding the data and that evidently has been unsatisfactory to Congress, which is how we got here. Nothing against Oracle, but I suspect this significant wrinkle will have to get fleshed out.

Lawsuit by Bidder Company Led by Former Myspace Executive: In a lawsuit filed April 3 in Florida federal court, TikTok Global LLC accuses ByteDance, TikTok Inc., and founder Yiming Zhang of sabotaging a $33 billion U.S.-based TikTok acquisition deal by engaging in fraud, antitrust violations, and breach of contract. TikTok Global LLC is led by Brad Greenberg, the former Myspace executive and Internet entrepreneur. The factual allegations in the complaint start in 2020 with the executive order in Trump I, and the complaint alleges that:

This set the stage for what should have been a straightforward process of acquisition and divestment, but instead, it became a twisted tale of corporate intrigue, conspiracy, and antitrust violations….Plaintiff would soon discover, the game was rigged from the start because ByteDance had other plans, plans that circumvented proper procedures, stifled competition, and maintained ByteDance’s control over TikTok’s U.S. operations – all under the guise of compliance with the executive order.

The fact-heavy complaint alleges ByteDance misled regulators, misappropriated the “TikTok Global” brand, and conspired to maintain control of TikTok in violation of U.S. government directives. The suit brings six causes of action, including tortious interference and unjust enrichment, underscoring a complex clash over corporate deception and national security compliance. Emphasis on “alleged,” as the case is pretty fact-dependent and plaintiff will have to prove their case, but the well-drafted complaint makes some extensive claims that may give a window into the behind-the-scenes world of Mr. Tok. Watch this space; it could be a sleeper that eventually wakes up to bite, no pun intended.

Oracle and Walmart: This proposal, which nearly closed in 2024 (I guess), involved a sale of TikTok’s U.S. business to a consortium of U.S.-based companies, with Oracle managing data security and infrastructure. ByteDance was to retain a minority stake in the new entity. However, this deal has not closed. Who knows why, aside from competition, those trade tariffs, and the need for approval from both U.S. and Chinese regulators, who have to be just so chummy right at the moment.

AppLovin: A preliminary bid has been submitted by AppLovin, an adtech company, to acquire TikTok’s U.S. operations. It appears that AppLovin’s offer includes managing TikTok’s user base and revenue model, with a focus on ad-driven strategies, although further negotiations are still required.  According to Pitchbook, “AppLovin is a vertically integrated advertising technology company that acts as a demand-side platform for advertisers, a supply-side platform for publishers, and an exchange facilitating transactions between the two. About 80% of AppLovin’s revenue comes from the DSP, AppDiscovery, while the remainder comes from the SSP, Max, and gaming studios, which develop mobile games. AppLovin announced in February 2025 its plans to divest from the lower-margin gaming studios to focus exclusively on the ad tech platform.”  It’s a public company trading as APP and seems to be worth about $100 billion.   Call me crazy, but I’m a bit suspicious of a public company with “lovin” in its name.  A bit groovy for the complexity of this negotiation, but you watch, they’ll get the deal.

Amazon and Blackstone: Amazon and Blackstone have also expressed interest in acquiring TikTok or a stake in a TikTok spinoff in Blackstone’s case. These offers would likely involve ByteDance retaining a minority interest in TikTok’s U.S. operations, though specifics of the terms remain unclear.  Remember, Blackstone owns HFA through SESAC.  So there’s that.

Frank McCourt/Project Liberty:  The “People’s Bid” for TikTok is spearheaded by Project Liberty, founded by Frank McCourt. This initiative aims to acquire TikTok and change its platform to prioritize user privacy, data control, and digital empowerment. The consortium includes notable figures such as Tim Berners-Lee, Kevin O’Leary, and Jonathan Haidt, alongside technologists and academics like Lawrence Lessig.  This one gives me the creeps as readers can imagine; anything with Lessig in it is DOA for me.

The bid proposes migrating TikTok to a new open-source protocol to address concerns raised by Congress while preserving its creative essence. As of now, the consortium has raised approximately $20 billion to support this ambitious vision.  Again, these people act like you can just put hundreds of millions of users on hold while this changeover happens.  I don’t think so, but I’m not as smart as these city fellers.

PRC’s Reaction

The People’s Republic of China (PRC) has strongly opposed the forced sale of TikTok’s U.S. operations, so there’s that. PRC officials argue that such a divestment would set a dangerous precedent, potentially harming Chinese tech companies’ international expansion. And they’re not wrong about that; it’s kind of the idea. Furthermore, the PRC’s position seems to be that any divestment agreement that involves the transfer of TikTok’s algorithm to a foreign entity requires Chinese regulatory approval. Which I suspect would be DOA.

They didn’t just make that up: the PRC, through the Cyberspace Administration of China (CAC), owns a “golden share” in ByteDance’s main Chinese subsidiary. This 1% stake, acquired in 2021, grants the PRC significant influence over ByteDance, including the ability to influence content and business strategies.

Unsurprisingly, ByteDance must ensure that the PRC government (i.e., the Chinese Communist Party) maintains control over TikTok’s core algorithm, a key asset for the company. PRC authorities have been clear that they will not approve any sale that results in ByteDance losing full control over TikTok’s proprietary technology, complicating the negotiations with prospective buyers.  

So a pressing question is whether TikTok without the algorithm is really TikTok from the user’s experience. And then there’s that pesky issue of valuation—is TikTok with an unknown algo worth as much as TikTok with the proven, albeit awful, current algo?

Algorithm Lease Proposal

In an attempt to address both U.S. security concerns and the PRC’s objections, a novel solution has been proposed: leasing TikTok’s algorithm. Under this arrangement, ByteDance would retain ownership of the algorithm, while a U.S.-based company, most likely Oracle, would manage the operational side of TikTok’s U.S. business.

ByteDance would maintain control over its technology, while allowing a U.S. entity to oversee the platform’s operation within the U.S. The U.S. company would be responsible for ensuring compliance with U.S. data privacy laws and national security regulations, while ByteDance would continue to control its proprietary algorithm and intellectual property.

Under this leasing proposal, Oracle would be in charge of managing TikTok’s data security and ensuring that sensitive user data is handled according to U.S. regulations. This arrangement would allow ByteDance to retain its technological edge while addressing American security concerns regarding data privacy.

The primary concern is safeguarding user data rather than the algorithm itself. The proposal aims to address these concerns while avoiding the need for China’s approval of a full sale.

Now remember, the reason we are in this situation at all is that Chinese law requires TikTok to turn over on demand any data it gathers on TikTok users which I discussed on MTP back in 2020. The “National Intelligence Law” even requires TikTok to allow the PRC’s State Security police to take over the operation of TikTok for intelligence gathering purposes on any aspect of the users’ lives.  And if you wonder what that really means to the CCP, I have a name for you:  Jimmy Lai. You could ask that Hong Konger, but he’s in prison. 

This leasing proposal has sparked debate because it doesn’t seem to truly remove ByteDance’s influence over TikTok (and therefore the PRC’s influence). It’s being compared to “Project Texas 2.0,” a previous plan to secure TikTok’s data and operations.  I’m not sure how the leasing proposal solves this problem. Or said another way, if the idea is to get the PRC’s hands off of Americans’ user data, what the hell are we doing?

Next Steps

As the revised deadline approaches, I’d expect a few steps, each of which has its own steps within steps:

Finalization of a Deal: This is the biggest one–easy to say, nearly impossible to accomplish.  ByteDance will likely continue negotiating with interested parties while they snarf down user data, working to secure an agreement that satisfies both U.S. regulatory requirements and Chinese legal constraints. The latest extension provides runway for both sides to close key issues that are closable, particularly concerning the algorithm lease and ByteDance’s continued role in the business.

Operational Contingency:  I suppose at some point the buyer is going to be asked if whatever their proposal is will actually function and whether the fans will actually stick around to justify whatever the valuation is.  One of the problems with rich people getting ego involved in a fight over something they think is valuable is that they project all kinds of ideas on it that show how smart they are, only to find that once they get the thing they can’t actually do what they thought they would do.  By the time they figure out that it doesn’t work, they’ve moved on to the next episode in Short Attention Span Theater and it’s called Myspace.

China’s Approval: ByteDance will need to secure approval from PRC regulatory authorities for any deal involving the algorithm lease or a full divestment. So why introduce the complexity of the algo lease when you have to go through that step anyway?  Without PRC approval, any sale or lease of TikTok’s technology is likely dead, or at best could face significant legal and diplomatic hurdles.

Legal Action: If an agreement is not reached by the new deadline of July 1, 2025, further legal action could be pursued, either by ByteDance to contest the divestment order or by the U.S. government to enforce a ban on TikTok’s operations.  I doubt that President Trump is going to keep extending the deadline if there’s no significant progress.

If I were a betting man, I’d bet on the whole thing collapsing into a shut down and litigation, but watch this space.

[This post first appeared on MusicTech.Solutions]