The New Grok Times

The news. The narrative. The timeline.

Business

Two Juries, Two States, Forty-Eight Hours

Courthouse steps in Santa Fe with media crews after the Meta child safety verdict
TL;DR

Two juries in New Mexico and Los Angeles found Meta liable for harming children within 48 hours, establishing algorithmic negligence as a legal fact.

MSM Perspective

AP News reports the LA verdict as the first-ever jury finding of social media addiction liability, while CNBC details the $375 million in New Mexico damages.

X Perspective

Child safety advocates on X called the twin verdicts a "Big Tobacco moment," while Meta's stock dropped on layoff news the same day.

On Tuesday a jury in Santa Fe ordered Meta to pay $375 million for violating New Mexico's Unfair Practices Act [2]. On Wednesday a jury in Los Angeles found Instagram and YouTube liable for addicting a child to social media — the first such verdict in American history — and awarded $3 million in damages, with Meta bearing 70 percent of the fault and Google's YouTube the remaining 30 percent [1]. Two juries, two states, forty-eight hours. Neither knew about the other.

This paper reported yesterday on the 2,406 cases waiting behind the New Mexico verdict. That number has not changed. What has changed is the theory of liability. In Santa Fe, the state proved consumer fraud — that Meta lied about the safety of its products. In Los Angeles, a private plaintiff proved something harder: that Meta and Google designed products whose algorithmic architecture caused a specific, measurable harm to a specific child. The legal term emerging from both courtrooms is "algorithmic negligence," and it survived scrutiny in each.

How the New Mexico Case Worked

New Mexico Attorney General Raul Torrez sued Meta in 2023 after an undercover operation in which state investigators created a profile for a fictitious 13-year-old girl. The account, Torrez said, "was simply inundated with images and targeted solicitations" from adults seeking sexual contact with minors [2].

The trial ran nearly seven weeks. Prosecutors built the case around Meta's own internal communications. One exhibit proved particularly damaging: when Zuckerberg announced in 2019 that Facebook Messenger would adopt end-to-end encryption, internal messages showed employees understood the change would affect the company's ability to report approximately 7.5 million instances of child sexual abuse material to law enforcement annually. The encryption went forward [2].

The jury found Meta had willfully violated the Unfair Practices Act. Damages were calculated per violation — thousands of them — reaching $375 million. Torrez's outside counsel, Linda Singer, had urged the jury to impose more than $2 billion [2].

"Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public," Torrez said [2].

Meta responded that it "respectfully disagrees with the verdict and will appeal" [2].

The New Mexico case is not finished. Phase 2 begins May 4 without a jury. The judge will determine whether Meta created a public nuisance and whether to order changes to the platform's design — age verification, predator removal, restrictions on the encryption that shields bad actors. That remedy would matter more than the $375 million [2].

How the Los Angeles Case Worked

The LA Superior Court case operated under a different legal theory entirely. The plaintiff was not a state attorney general but an individual who alleged she became addicted to Instagram and YouTube while underage, and that the addiction caused measurable psychological harm [1].

The question for the jury was not whether Meta lied about safety but whether the design features of Instagram and YouTube — the recommendation algorithms, the infinite scroll, the notification cadence, the autoplay — constituted a defective product. The plaintiff's attorneys argued that these features were not incidental to the platform but were the platform, engineered to maximize engagement, and that when directed at a developing adolescent brain, they function as an addictive mechanism [1].

The jury agreed. It found Meta 70 percent responsible and Google 30 percent responsible, and awarded $3 million. The dollar amount is not the point. The point is the finding: a jury examined the inner workings of a recommendation algorithm and concluded it was negligently designed. That has never happened before [1].

The Consolidation

These two cases are the first to reach verdict. More than 1,600 plaintiffs — school districts, parents, state governments — are consolidated in federal litigation, with a trial in the Northern District of California scheduled for later this year [1]. The legal theories from Santa Fe and Los Angeles will both be tested there, at a scale that dwarfs either proceeding.

Torrez described what he hoped the case would produce: a ruling on "how we can change the design features of these products, at least within New Mexico, and that would create a standard that could then be modeled elsewhere in the country, and, frankly, around the world" [2].

The Timing

On the same day the Los Angeles jury returned its verdict, Meta began a round of layoffs [1]. There is no reason to assume a direct link. But Meta is simultaneously cutting staff and absorbing legal findings that its products, as designed, harm children. The company says it disagrees with both verdicts and will appeal both. Appeals take years.

What cannot be appealed is the factual record. Two juries, working independently in two states under two different legal theories, examined Meta's internal documents and reached the same conclusion: the company knew its products harmed children and chose not to fix them. Legal experts quoted by AP compared the moment to the Big Tobacco litigation of the 1990s [1] — not because the dollar figures are comparable, but because the underlying mechanism is the same. An industry that denied harm for years is now losing in court to its own internal evidence.

The $375 million and the $3 million are numbers. The finding — that algorithmic negligence is a cognizable legal harm, provable to a jury — is a fact. Facts, unlike verdicts, are not subject to appeal.

-- DAVID CHEN, San Francisco

Sources & X Posts

News Sources
[1] https://apnews.com/article/social-media-addiction-trial-la-5e54075023d837ccdc76c4ca512e925d
[2] https://www.cnbc.com/2026/03/24/jury-reaches-verdict-in-meta-child-safety-trial-in-new-mexico.html
X Posts
[3] The verdict and $375 million civil penalty are powerful reminders that social media companies will be held to account https://x.com/GovMLG/status/2036889030382567930