The AI Safe Harbor is an Unconstitutional Violation of State Protections for Families and Consumers

By Chris Castle

The AI safe harbor slathered onto President Trump’s “big beautiful bill” is layered with intended consequences. Not the least of these is the effect on TikTok.

One of the more debased aspects of TikTok (and that’s a long list) is its promotion, through its AI-driven algorithms, of clearly risky behavior to its pre-teen audience. Don’t forget: TikTok’s algorithm is not just any algorithm. The Chinese government claims it as a state secret. And when the CCP claims a state secret they ain’t playing. So keep that in mind.

One particularly depraved example of the risky behavior promoted by that algorithm was the “Blackout Challenge.” The TikTok “blackout challenge” has been linked to the deaths of at least 20 children over an 18-month period. One of the dead children was Nylah Anderson. Nylah’s mom sued TikTok for her daughter because that’s what moms do. If you’ve ever had someone you love hang themselves, you will no doubt agree that you live with that memory every day of your life. This unspeakable tragedy will haunt Nylah’s mother forever.

Even lowlifes like TikTok should have settled this case and it should never have gotten in front of a judge. But no–TikTok tried to get out of it because Section 230. Yes, that’s right–they killed a child and tried to get out of the responsibility. The District Court ruled that the loathsome Section 230 applied and Nylah’s mom could not pursue her claims. She appealed.

The Third Circuit Court of Appeals reversed and remanded, concluding that “Section 230 immunizes only information ‘provided by another’” and that “here, because the information that forms the basis of Anderson’s lawsuit—i.e., TikTok’s recommendations via its FYP algorithm—is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims.”

So…a new federal proposal threatens to slam the door on these legal efforts: the 10-year artificial intelligence (AI) safe harbor recently introduced in the House Energy and Commerce Committee. If enacted, this safe harbor would preempt state regulation of AI systems—including the very algorithms and recommendation engines that Nylah’s mom and other families are trying to challenge. 

Section 43201(c) of the “Big Beautiful Bill” includes Silicon Valley-style pork entitled the “Artificial Intelligence and Information Technology Modernization Initiative: Moratorium,” which states:

no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.

The “Initiative” also appropriates “$500,000,000, to remain available until September 30, 2035, to modernize and secure Federal information technology systems through the deployment of commercial artificial intelligence, the deployment of automation technologies, and the replacement of antiquated business systems….” So not only did Big Tech write themselves a safe harbor for their crimes, they are also taking $500,000,000 of corporate welfare to underwrite it, courtesy of the very taxpayers they are screwing over. Step aside, Sophocles: when it comes to tragic flaws, Oedipus Rex has got nothing on these characters.

Platforms like TikTok, YouTube, and Instagram use AI-based recommendation engines to personalize and optimize content delivery. These systems decide what users see based on a combination of behavioral data, engagement metrics, and predictive algorithms. While effective for keeping users engaged, these AI systems have been implicated in promoting harmful content—ranging from pro-suicide material to dangerous ‘challenges’ that have directly resulted in injury or death.
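As a rough illustration of the point, and not any platform’s actual code (TikTok’s algorithm is proprietary and, as noted above, claimed as a Chinese state secret), a recommendation engine optimized purely for engagement might look like the hypothetical sketch below. The structural problem is simple: the scoring function rewards predicted engagement and contains no term that asks whether the content is safe for the viewer seeing it.

```python
# Purely hypothetical sketch of an engagement-optimized feed ranker.
# This is NOT TikTok's (or any platform's) actual algorithm; it only
# illustrates that a score built solely from predicted engagement has
# no term accounting for whether the recommended content is safe.

from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_seconds: float  # model-predicted watch time
    predicted_share_rate: float     # model-predicted probability of a share (0..1)

def engagement_score(c: Candidate) -> float:
    # Illustrative weights; real systems blend many more engagement signals.
    return 0.7 * c.predicted_watch_seconds + 30.0 * c.predicted_share_rate

def rank_feed(candidates: list[Candidate], feed_size: int = 10) -> list[Candidate]:
    # Highest predicted engagement wins, with no safety check anywhere.
    return sorted(candidates, key=engagement_score, reverse=True)[:feed_size]

if __name__ == "__main__":
    feed = rank_feed([
        Candidate("harmless-dance-clip", predicted_watch_seconds=12.0, predicted_share_rate=0.02),
        Candidate("dangerous-challenge", predicted_watch_seconds=45.0, predicted_share_rate=0.20),
    ])
    print([c.video_id for c in feed])  # the riskier, more "engaging" clip ranks first
```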

Families across the country have sued these companies, alleging that the AI-driven algorithms knowingly promoted hazardous content to vulnerable users. In many cases, the claims are based on state consumer protection laws, negligence, or wrongful death statutes. Plaintiffs argue that the companies failed in their duty to design safe systems or to warn users about foreseeable dangers. These cases are not attacks on free speech or user-generated content; they focus specifically on the design and operation of proprietary AI systems. 

If you don’t think that these platforms are depraved enough to actually raise safe harbor defenses, just remember what they did to Nylah’s mom–raised the exceptionally depraved Section 230 as a defense to their responsibility in the death of a child.

The AI safe harbor would prohibit states from enacting or enforcing any law that regulates AI systems or automated decision-making technologies for the next 10 years. This sweeping language could easily be interpreted to cover civil liability statutes that hold platforms accountable for the harms their AI systems cause. This is actually even worse than the vile Section 230–the safe harbor would be expressly targeting actual state laws. Maybe after all the appeals, say 20 years from now, we’ll find out that the AI safe harbor is unconstitutional commandeering, but do we really want to wait to find out?

Because these wrongful death lawsuits rely on arguments that an AI algorithm caused harm—either through its design or its predictive content delivery—the companies could argue that the moratorium shields them from liability. They might claim that the state tort claims are an attempt to “regulate” AI in violation of the federal preemption clause. If courts agree, these lawsuits could be dismissed before ever reaching a jury.

This would create a stunning form of corporate immunity even beyond the many current safe harbors for Big Tech: tech companies would be free to deploy powerful, profit-driven AI systems with no accountability in state courts, even when those systems lead directly to preventable deaths. 

The safe harbor would be especially devastating for families who have already suffered tragic losses and are seeking justice. These families rely on state wrongful death laws to hold powerful platforms accountable. Removing that path to accountability would not only deny them closure, but also prevent public scrutiny of the algorithms at the center of these tragedies.

States have long held the authority to define standards of care and impose civil liability for harms caused by negligence or defective products. The moratorium undermines this traditional role by barring states from addressing the specific risks posed by AI systems, even in the context of established tort principles. It would represent one of the broadest federal preemptions of state law in modern history—in the absence of federal regulation of AI platforms. Consider a few of the cases now at risk:

• In Pennsylvania, the parents of a teenager who committed suicide alleged that Instagram’s algorithmic feed trapped their child in a cycle of depressive content.
• Multiple lawsuits filed under consumer protection and negligence statutes in states like New Jersey, Florida, and Texas seek to hold platforms accountable for designing algorithms that systematically prioritize engagement over safety.
• TikTok faced multiple class action claims, consolidated in multidistrict litigation, alleging that it illegally harvested user information from its in-app browser.

All such suits could be in jeopardy if courts interpret the AI moratorium as barring state laws that impose liability on algorithm-driven systems, and you can bet that Big Tech platforms will litigate the bejeezus out of the issue. Even if the moratorium were not intended to block wrongful death and other state law claims, its language may be broad enough to do so in practice—especially when leveraged by well-funded corporate legal teams.

Even supporters of federal AI regulation should be alarmed by the breadth of this safe harbor. It is not a thoughtful national framework based on a full record, but a shoot-from-the-hip blanket prohibition on consumer protection and civil justice. By freezing all state-level responses to AI harms, the AI safe harbor would consolidate power in the hands of federal bureaucrats and corporate lobbyists, leaving ordinary Americans with fewer options for recourse. It is also a clear violation of state police powers and the 10th Amendment.

To add insult to injury, the use of reconciliation to pass this policy—without full hearings, bipartisan debate, or robust public input—only underscores the cynical nature of the strategy. It has nothing to do with the budget, aside from the fact that Big Tech is snarfing down $500 million of taxpayer money for no good reason, just so they can argue the land grab is “germane” and shoehorn it into reconciliation under the Byrd Rule. It’s a maneuver designed to avoid scrutiny and silence dissent, not to foster a responsible or democratic conversation about how AI should be governed.

At its core, the AI safe harbor is not about fostering innovation—it is about shielding tech platforms from accountability just like the DMCA, Section 230 and Title I of the Music Modernization Act. By preempting state regulation, it could block families from using long-standing wrongful death statutes to seek justice for the loss of their children and laws protecting Americans from other harms. It undermines the sovereignty of states, the dignity of grieving families, and the public’s ability to scrutinize the AI systems that increasingly shape our lives. 

Congress must reject this overreach, and the American public must remain vigilant in demanding transparency, accountability, and justice. The Initiative must go.

[A version of this post first appeared on MusicTechPolicy]

Press Release: Opposition grows worldwide about TikTok’s decision to stop negotiations with @MerlinNetwork

[Editor Charlie sez: This post from Worldwide Independent Network is available here.]

TikTok’s decision to disintermediate Merlin and walk away from negotiations to renew its current license has sparked widespread concern across the global music industry. The platform is contacting independent music companies directly to try to reach individual deals. Many fear that with this move TikTok intends to pay less for music.

Merlin acts as the licensing partner for the independent sector, playing a crucial role in providing efficiencies for digital platforms, promoting diversity and consumer choice, as well as delivering market access and value for its members. With more than 500 members representing over 30,000 record labels, distributors, and rights holders around the world, Merlin currently accounts for 15% of the global recorded music market and has deals with over 40 digital services.

“TikTok’s decision to walk away from Merlin puts independent labels in an impossible place with their artists: it’s a choice between their music being available on the platform or ensuring fair license terms,” explains Zena White, WIN Chair. Noemí Planas, WIN CEO, adds that “Merlin was created by independent music companies to compete at the highest level and ensure they can access the best terms. TikTok’s decision poses risks to cultural diversity, market access, and fair payment for independents. But this is not just about TikTok. We urge policymakers around the world to regulate the tech sector to ensure a truly competitive market where creators’ rights are protected from abusive and monopolistic behavior.” TikTok continues to resist calls from the sector to address the existing ‘value gap’, which has a negative impact on the independents’ ability to defend their music and rights.

Asia

Owned by Chinese company ByteDance, TikTok is the world’s largest social media platform after Facebook, YouTube, and Instagram. Asia is home to 6 of the top 10 countries by number of users, and local music companies fear TikTok’s decision threatens the level playing field. Jong-Gill Shin, Secretary General of the Record Label Industry Association of Korea (LIAK), says: “LIAK expresses profound concern over the current circumstances, which pose a significant risk of fostering discrimination against creative works. It is imperative that all music, regardless of whether it originates from major or independent sectors, be accorded equal value and recognition. We unequivocally oppose TikTok’s recent attempts that threaten to undermine our efforts to secure equitable terms. Aligned with our fellow WIN members globally, we stand resolute in our commitment to upholding and safeguarding the intrinsic value of independent music.” China’s neighbors have also raised concerns about TikTok’s compliance with data protection laws, with India banning the app over national security concerns.

North America

In the United States, the second-largest market by number of TikTok users with 120.5 million, concerns are raised about abuse of power from the platform. In April, President Biden signed a law that would ban TikTok unless ByteDance sells its stake within a year. Richard Burgess, CEO of the American Association of Independent Music (A2IM), comments: “TikTok’s unwillingness to negotiate a licensing deal with Merlin is just the latest example of the platform doing whatever it can to avoid compensating artists fairly. Now, more than ever, we need Congress to enact the Protect Working Musicians Act and give musicians, songwriters, independent labels, and publishers the ability to negotiate collectively in the marketplace.”

Similar concerns are raised in Canada, where the music community is actively engaged in the regulatory process around the Online Streaming Act, which extends broadcasters’ requirements to invest in Canada’s music sector to digital platforms and is being met with mounting resistance from the tech sector. “By bypassing local regulations and enforcing unfavourable terms on rights holders, platforms create a significant power imbalance,” says Gord Dimitrieff, Chair of Government Relations at the Canadian Independent Music Association (CIMA). “It stifles competition, reduces cultural diversity, and limits consumer choice.” Andrew Cash, President and CEO of CIMA, adds that TikTok’s decision “should act as a wake-up call to Canadian policy makers and politicians engaged in regulating the tech sector.”

Latin America

TikTok was the fastest-growing social media platform in Latin America in 2023. “From a Brazilian perspective, TikTok’s decision not to renew the agreement with Merlin could weaken the representation of independent music, which plays a crucial role in promoting cultural and regional diversity,” says Felippe Llerena, President of the Associação Brasileira da Música Independente (ABMI). “Without a collective agreement, small labels may have more difficulty negotiating individually, negatively impacting their visibility and participation on a platform as relevant as TikTok.” The Brazilian organization claims that this move not only compromises the diversity of content available on the platform, but also does not make sense from a commercial and strategic point of view. Brazil ranks third in TikTok users by country, with 105.3 million, followed by Mexico, with 77.5 million users, but concerns are also raised in other markets of the region. “It is extremely detrimental for the independent sector in Latin America that TikTok is applying this pressure to bypass Merlin. The very purpose of Merlin is to ensure fairer and more equitable representation for all, especially in regions like ours, and we stand by it. The most affected will be the smaller players, who will have few options, and our biggest fear is that they will end up facing the worst conditions,” adds Francisca Sandoval, President of Asociación Gremial Industria Musical Independiente de Chile (IMICHILE).

Europe

Following value gap concerns raised in April, the Independent Music Companies Association (IMPALA) has opposed TikTok’s attempt to boycott Merlin. The European organization highlights the importance of collective deals for diversity and consumer choice, and notes that it is vital that independents and digital services work together and explore ways to grow the value of the moment economy as a key part of the music ecosystem, as proposed in IMPALA’s ten-step plan to make the most of streaming. “We believe giving labels the option to work under a collective deal is the best way for TikTok to achieve these aims and work with artists and genres from across Europe,” says Dario Draštata, IMPALA Chair and Chair of RUNDA Adria. “We respect freedom of choice in entrepreneurship. The growth of the independent sector across all platforms is fundamental to provide fans and consumers with choice and diversity, exactly what TikTok stands for. The easiest way to achieve that is through Merlin,” says Helen Smith, IMPALA’s Executive Chair. She adds: “We invite TikTok to see the value of a renewed collective deal through Merlin and collaborate on growing this important part of the ecosystem. We hope that efficiency and choice for TikTok users, as well as access for artists and labels whatever their country or genre or level of success, and of course joint and standardised efforts on fraud, will prevail.” France, Belgium, Germany, and other European countries have also come forward in support of Merlin.

Australasia

TikTok is crucial to the music industry, and music is crucial to TikTok. An experiment conducted by TikTok in Australia in 2023 to analyze how music is accessed and used on the platform showed that limiting the licensed music users can experience caused the number of users and the time they spend on the app to decline. “We are highly alarmed at the news of TikTok’s decision to walk away from the negotiating table with Merlin before any licensing renewal discussions could even begin. As if that wasn’t onerous enough, TikTok have stated their intention to seek direct deals, and provided a very, very short runway for labels to sign an NDA. This would be hilarious, if it wasn’t so disrespectful and further demonstrates that TikTok’s behaviour completely undermines their previously stated support of worldwide independent rights holders. IMNZ, as representative and advocate for New Zealand artists and labels, joins with our global compatriots in the hope that TikTok makes the right decision – and finds its way back to the licensing table with Merlin, and smartly,” says Dylan Pellett, General Manager at Independent Music New Zealand (IMNZ).

WIN is committed to ensuring that all businesses in the music sector are best equipped to maximize the value of their rights, regardless of their size and origin, and Merlin is a key partner in this. The global independent music community remains steadfast in its support for collective licensing negotiations and calls on TikTok to return to the table and work on solutions that benefit all parties involved.

Astroturf Spotting: “The People’s Bid for TikTok”

We’ve had a pretty good track record over the years of spotting astroturf operations, from the European Copyright Directive to ad-supported piracy. Here’s what we believe is the latest–“the People’s Bid for TikTok,” pointed out to us by one of our favorite artists.

The first indication that something is fake–we call these “clues”–is in the premise of the campaign. Remember that the key asset of TikTok is the company’s algorithm. That algorithm is apparently responsible for curating the content users see on their feeds. This algorithm is highly sophisticated and is considered a key factor in TikTok’s success. The U.S. government has argued that the algorithm could be manipulated by the government of the People’s Republic of China to influence what messaging is promoted or suppressed.

In April, President Joe Biden signed a law requiring TikTok’s PRC-based parent company ByteDance to sell TikTok or face a ban in the U.S. by mid-January 2025. This law was the culmination of years of Congressional scrutiny and debate over the app’s potential risks.

At the core of President Biden’s concerns about TikTok is the algorithm. Not surprisingly, the People’s Republic of China has made it very clear that the algorithm is not for sale. This position was confirmed when TikTok itself admitted that the Chinese government would not allow the sale of its algorithm. China’s Commerce Minister Wang Wentao indicated that officials would seek to block any transfer of the app’s technology, stating that the country would “firmly oppose” a forced sale. That likely means that even if ByteDance were to sell TikTok–to “the people” or otherwise–the algorithm would remain under Chinese control, which undermines the U.S. government’s objective.

So–who is behind the “People’s Bid,” given that the “People’s Bid” seems to be making a proposal that will only be acceptable to the People’s Republic of China? We say that because of this FAQ on the People’s Bid site disclaiming any interest in acquiring the algorithm that the PRC has essentially claimed as a state secret for some reason:

The People’s Bid has no interest in acquiring TikTok’s algorithm [which is nice since the algo is not for sale]. This is not an attempt to rinse and repeat the formula that has allowed Big Tech companies to reap enormous profits by scraping and exploiting user data. The People’s Bid will ensure that TikTok users control their data and experience by using the app on a rebuilt digital infrastructure that gives more power to users.

Oh no, The People’s Bid has no interest in that tacky algorithm which wasn’t for sale anyway. Good of them. So who is “them”? It appears, although it isn’t quite clear, that the entity doing the acquiring isn’t “The People’s Bid” at all, it’s something called “Project Liberty.”

The FAQ tells us a little bit about Project Liberty:

Project Liberty builds solutions that help people take back control of their digital lives. This means working to ensure that everyone has a voice, choice, and stake in the future of the Internet. Project Liberty has invested over half a billion dollars to develop infrastructure and alliances that will return power to the people.

They kind of just let that “half a billion dollars” drop in the dark of the FAQ. What that tells us is that somebody with a shit-ton of money is interested in stopping the TikTok ban. So who is involved with this “Project Liberty”? The usual suspects, starting with Lawrence Lessig, Jonathan Zittrain and a slew of cronies from Berkman, Stanford, MIT, etc. Color us shocked, just shocked.

But these people never spend their own money and probably aren’t working for free, so who’s got the dough? Someone who doesn’t seem to care about acquiring the TikTok algorithm from the Chinese Communist Party?

Forbes tells us that this transaction is just a little bit different than what “The People’s Bid” or even the “Liberty Project” would have you believe if all you knew about it was from information on their website. The money seems to be coming in part, maybe in very large part, from one Frank McCourt whom you may remember as a former owner of the Los Angeles Dollars…sorry, Dodgers. In fairness, McCourt isn’t exactly making his plans a secret. He had his Project Liberty issue a press release as “The People’s Bid for TikTok”, which is actually Frank McCourt’s bid for TikTok as far as we can tell and as reported by Forbes:

Billionaire investor and entrepreneur Frank McCourt is organizing a bid to buy TikTok through Project Liberty, an organization to which he’s pledged $500 million that aims to fight for a safe, healthier internet where user data is owned by users themselves rather than by tech giants like TikTok parent ByteDance, Meta and Alphabet.

That’s more like it. We knew there was a sugar daddy in there somewhere. That’s much more in the Lessig style. Big favor, little bad mouth.

Of course, users owning their data is not the entire story by a long shot. Authors owned their books and Google still used the vast Google Books project to train AI.

Forbes adds this insight about Mr. McCourt:

Best known as the former owner of the Los Angeles Dodgers, McCourt spent most of the past decade focused on investing the approximately $850 million in proceeds from the team’s 2012 sale via his company McCourt Global. 

He sprinkled money into sports, real estate, technology, media and an investment firm focused on private credit. In January 2023, McCourt stepped down as CEO of McCourt Global to focus on Project Liberty but remains executive chairman and 100% owner. 

McCourt’s assets are worth an estimated $1.4 billion, landing him on Forbes’ billionaires list for the first time this year—though his wealth is a far cry from the estimated $220 billion valuation of ByteDance.

Which brings us to ByteDance. Is there another Silicon Valley money funnel with an interest in ByteDance? One is Sequoia Capital, which was also an original investor in Google, which in turn was an original investor in Professor Lessig and his various enterprises, including Creative Commons. Sequoia’s ByteDance investment came via one Neil Shen, who runs Sequoia’s China operation, recently spun off from the mothership. If you don’t recognize Neil Shen, he’s a former member (until 2023) of the Chinese People’s Political Consultative Conference, an arm of the Chinese Communist Party and its United Front Work operation. (According to a Congressional investigative report, the United Front operation is a strategic effort to influence and control various groups and individuals both within China and internationally. This strategy involves a mix of engagement, influence activities, and intelligence operations aimed at shaping political environments to favor the CCP’s interests. United front work includes the “America Changle Association, which housed a secret PRC police station in New York City that was raided by the FBI in October 2022.”)


In plainer terms, it’s about the money. According to CNN:

McCourt said he is working with the investment firm Guggenheim Securities and the law firm Kirkland & Ellis to help assemble the bid, adding that the push is backed by Sir Tim Berners-Lee, the inventor of the World Wide Web [OMG, it must be legit!].

McCourt joins a host of other would-be suitors angling to pick up a platform used by 170 million Americans. Former Treasury Secretary Steven Mnuchin announced in March he’s assembling a bid, as well as Kevin O’Leary, the Canadian chairman of the private venture capital firm O’Leary Ventures.

TikTok, meanwhile, has indicated that it’s not for sale and the company has instead begun to mount a fight against the new law. The company sued to block the law earlier this month, saying that spinning off from its Chinese parent company is not feasible and that the legislation would lead to a ban of the app in the United States starting in January of next year.

But it’s the people’s bid, right? Don’t be evil, y’all.

Let’s boil it down: TikTok would have been, up until President Biden signed the sell-or-ban bill into law, a HUGE IPO. It’s also a big chunk of ByteDance’s valuation, which means it’s a big chunk of Neil Shen’s carried interest in all likelihood. TikTok is no longer a huge IPO; in fact, it probably won’t be an IPO at all in its current configuration, particularly since the CCP has told the world that TikTok doesn’t own its core asset, the very algorithm that has so many people addicted (and that addiction is what a buyer is really buying).

So the astroturf labels, “Project Liberty” and “the People’s Bid,” are not the real story. Whatever “the People’s Bid” really is, it’s much more likely to be, as the financial press has described it, Frank McCourt’s bid. But only for the most high-minded and pure-souled reasons.

It’s about the money. Stay tuned, we’ll be keeping an eye on this one.

How Do TikTok Executives Sleep at Night?

Read the post on Music Business Worldwide and the Associated Press story “Former ByteDance executive says Chinese Communist Party tracked Hong Kong protesters via data.” (ByteDance is the parent company of TikTok.)

Press Release: Texas Governor Abbott Announces Statewide Plan Banning Use Of TikTok — Artist Rights Watch

“Owned by a Chinese company that employs Chinese Communist Party members, TikTok harvests significant amounts of data from a user’s device, including details about a user’s internet activity.”

Governor Greg Abbott today announced a statewide model security plan for Texas state agencies to address vulnerabilities presented by the use of TikTok and other software on personal and state-issued devices. Following the Governor’s directive, the Texas Department of Public Safety and the Texas Department of Information Resources developed this model plan to guide state agencies on managing personal and state-issued devices used to conduct state business. Each state agency will have until February 15, 2023 to implement its own policy to enforce this statewide plan.

“The security risks associated with the use of TikTok on devices used to conduct the important business of our state must not be underestimated or ignored,” said Governor Abbott. “Owned by a Chinese company that employs Chinese Communist Party members, TikTok harvests significant amounts of data from a user’s device, including details about a user’s internet activity. Other prohibited technologies listed in the statewide model plan also produce a similar threat to the security of Texans. It is critical that state agencies and employees are protected from the vulnerabilities presented by the use of this app and other prohibited technologies as they work on behalf of their fellow Texans. I thank the Texas Department of Public Safety and Texas Department of Information Resources for their hard work helping safeguard the state’s sensitive information and critical infrastructure from potential threats posed by hostile foreign actors.”

To protect Texas’ sensitive information and critical infrastructure from potential threats, the model plan outlines the following objectives for each agency:

  • Ban and prevent the download or use of TikTok and prohibited technologies on any state-issued device identified in the statewide plan. This includes all state-issued cell phones, laptops, tablets, desktop computers, and other devices capable of internet connectivity. Each agency’s IT department must strictly enforce this ban.
  • Prohibit employees or contractors from conducting state business on prohibited technology-enabled personal devices.
  • Identify sensitive locations, meetings, or personnel within an agency that could be exposed to prohibited technology-enabled personal devices. Prohibited technology-enabled personal devices will be denied entry or use in these sensitive areas.
  • Implement network-based restrictions to prevent the use of prohibited technologies on agency networks by any device.
  • Work with information security professionals to continuously update the list of prohibited technologies.

In December 2022, Governor Abbott directed state agency leaders to immediately ban employees from downloading or using TikTok on any government-issued devices. The Governor also informed Lieutenant Governor Dan Patrick and Speaker Dade Phelan that the Executive Branch is ready to assist in codifying and implementing any necessary cybersecurity reforms passed during the current legislative session, including passing legislation to make permanent the Governor’s directive to state agencies.

Governor Abbott has taken significant action to combat threats to Texas’ cybersecurity, including signing the Lone Star Infrastructure Protection Act in 2021 to fortify certain physical infrastructure against threats that include hostile foreign actors.

View the statewide model security plan here.


Must Read by @ebakerwhite: TikTok Parent ByteDance Planned To Use TikTok To Monitor The Physical Location Of Specific American Citizens — Artist Rights Watch

[Well, here it is. Two years ago we warned everyone who would listen that TikTok were apparatchiks for the Chinese Communist Party–by law in China because of the CCP’s civil-military fusion–“If Google is the Joe Camel of data, then TikTok is the Joe Camel of intelligence.” We did panels warning about TikTok including the CEO’s struggle session and the CCP constitution–facts, you know. Tim Ingham warned that on top of everything else, the deals suck. And then there’s Twinkletoes, who is in our view a walking, talking Foreign Agent Registration Act violation.

Emily Baker White warns that the harms from TikTok we identified two years ago are coming home to roost.]

[According to Emily Baker White writing in Forbes:]

A China-based team at TikTok’s parent company, ByteDance, planned to use the TikTok app to monitor the personal location of some specific American citizens, according to materials reviewed by Forbes.

The team behind the monitoring project — ByteDance’s Internal Audit and Risk Control department — is led by Beijing-based executive Song Ye, who reports to ByteDance cofounder and CEO Rubo Liang. 

The team primarily conducts investigations into potential misconduct by current and former ByteDance employees. But in at least two cases, the Internal Audit team also planned to collect TikTok data about the location of a U.S. citizen who had never had an employment relationship with the company, the materials show. It is unclear from the materials whether data about these Americans was actually collected; however, the plan was for a Beijing-based ByteDance team to obtain location data from U.S. users’ devices.

Read the post on Forbes

@digitalmusicnws: Is TikTok Safe for Kids? Platform Faces At Least Eight State Investigations Over Its Impact On Children and Teens — Artist Rights Watch

Remember this meme from the Stop Enabling Sex Trafficking Act hearings?

Is your children’s online privacy worth $92 million?


This post first appeared on Artist Rights Watch

Eight states (Massachusetts, Florida, California, New Jersey, Vermont, Kentucky, Nebraska, and Tennessee) recently announced investigations into TikTok, which settled an Illinois privacy lawsuit for $92 million in 2021. The coordinated scrutiny arrives as TikTok – which has been described as “legitimate spyware” – remains extremely popular, reportedly boasting north of three billion downloads and more traffic than Google.

Furthermore, TikTok’s userbase reportedly skews young, and higher-ups have capitalized upon the platform’s prominence within demographics that are relatively difficult for companies to reach.

Read the post by Dylan Smith on Digital Music News