The New Grok Times

The news. The narrative. The timeline.

Business

Meta Lost the First Child Safety Verdict and 2,406 More Are Waiting

The exterior of a New Mexico state courthouse with television cameras and reporters gathered on the steps
New Grok Times
TL;DR

A New Mexico jury ordered Meta to pay $375 million for knowingly harming children and concealing child exploitation — the first verdict in 2,407 pending cases.

MSM Perspective

Coverage focuses on the landmark nature of the verdict while noting Meta plans to appeal and that Section 230 still shields platforms from most liability.

X Perspective

The $375M is pocket change for Meta, but the precedent is existential — multiply this verdict by 50 states and thousands of plaintiffs.

A jury in Santa Fe, New Mexico, needed less than one day to find that Meta Platforms, Inc. knowingly harmed children, concealed the extent of child sexual exploitation on its platforms, and violated the state's Unfair Practices Act. On Tuesday, it ordered the company to pay $375 million in civil penalties [1].

The sum is not large by Meta's standards. The company reported $200.97 billion in annual revenue for 2025 [2]. At that rate, Meta earns roughly $22.9 million per hour. The $375 million penalty represents approximately 16 hours of revenue. It is, in financial terms, a rounding error.

It is not, in legal terms, a rounding error. It is a precedent. And there are 2,406 cases waiting behind it.

What the Jury Found

The case, filed in 2023 by New Mexico Attorney General Raúl Torrez, was built on an unusual foundation: an undercover investigation in which state agents created social media accounts posing as children. They documented the sexual solicitations that followed, and Meta's response to them — or, more precisely, the absence of one [1].

Over nearly seven weeks of trial, prosecutors presented Meta's own internal correspondence and reports. They called Meta executives, platform engineers, and whistleblowers who had left the company. They brought in psychiatric experts and public school educators from New Mexico who testified about sextortion schemes targeting their students [1].

The jury was asked to evaluate specific statements made by three people: Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta's global head of safety, Antigone Davis. In each case, the jury found that the statements were false or misleading [1].

The jury also found that Meta engaged in what the statute calls "unconscionable" trade practices — practices that "unfairly took advantage of the vulnerabilities of and inexperience of children" [1]. The checklist of findings is worth reading in full. Jurors concluded that Meta failed to disclose what it knew about problems enforcing its ban on users under 13, the prevalence of content about teen suicide, and the role of its recommendation algorithms in prioritizing harmful material.

Each violation counted separately. Thousands of them. The aggregate: $375 million.

The Architecture of Liability

To understand why this verdict matters beyond its dollar figure, one must understand the legal architecture that has, for thirty years, shielded social media companies from precisely this kind of accountability.

Section 230 of the Communications Decency Act, passed in 1996, provides that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." It is the legal foundation on which every user-generated content platform has been built. Meta has invoked it in virtually every lawsuit filed against it [3].

New Mexico's case did not directly challenge Section 230. Instead, it attacked from a different angle entirely. Prosecutors argued that the harm was not in the content users posted but in the design decisions Meta made — the algorithms that amplify harmful material, the notification systems engineered for compulsive engagement, the age-verification mechanisms that the company's own internal research showed were inadequate. The theory of the case was not that Meta published harmful content. It was that Meta built a machine designed to maximize engagement and then aimed that machine at children, knowing what would happen [1].

This distinction is significant. If the harm is in the content, Section 230 likely applies. If the harm is in the product design — in the architecture of addiction — then the platform is not being sued as a publisher. It is being sued as a manufacturer. And manufacturers have always been liable for defective products.

The Multiplication Problem

Meta's exposure is not limited to New Mexico. More than 40 state attorneys general have filed lawsuits making similar claims [4]. A bipartisan coalition of 33 attorneys general filed a joint federal lawsuit in October 2023, alleging that Meta "deliberately designed Instagram and Facebook features that are addictive" and contributed to a mental health crisis among young people [4].

In a federal court in Oakland, California, a separate jury has been sequestered for more than a week in what is being called a "bellwether" trial — a case designed to set the parameters for thousands of similar lawsuits filed by school districts and individuals [5]. TikTok and Snap, which were originally co-defendants, settled before trial. Meta and YouTube remain.

The total number of pending cases stands at 2,407. They span every state. They include claims from individual families, school districts, and state governments. The theories vary — some allege consumer fraud, others product liability, others public nuisance — but the factual core is the same: Meta knew its platforms harmed children, concealed what it knew, and chose engagement over safety.

If New Mexico's $375 million is the template, and if even a fraction of the remaining cases reach trial and produce comparable results, the cumulative liability becomes significant even by Meta's standards [6]. Fifty state attorney general cases at $375 million each would total $18.75 billion — roughly 9 percent of annual revenue. Add the private lawsuits and the federal cases, and the arithmetic begins to resemble something a board of directors would discuss in terms other than "manageable."

What Meta Said

Meta attorney Kevin Huff told the New Mexico jury that "evidence shows not only that Meta invests in safety because it's the right thing to do but because it is good for business" [1]. He argued that Meta designs its apps to help people connect with friends and family, not to connect predators.

After the verdict, a Meta spokesperson said the company "respectfully disagrees" with the outcome and indicated it would appeal [3]. Meta has consistently maintained that social media addiction does not exist as a clinical diagnosis, though its executives at trial acknowledged "problematic use" [1].

The appeal will take years. A second phase of the New Mexico trial, possibly in May, will determine whether Meta created a public nuisance and may be ordered to change its platform design [1]. That remedy — a court ordering Meta to alter its algorithms — would be far more consequential than any financial penalty.

The Question Underneath

There is a philosophical question beneath the legal scaffolding, and it is worth stating plainly. For thirty years, the operating assumption of internet law has been that platforms are conduits — neutral pipes through which information flows, bearing no more responsibility for what passes through them than the telephone company bears for a threatening phone call.

The New Mexico jury rejected that assumption. It found that Meta is not a conduit. It is an architect. It designs the space through which information flows. It chooses what to amplify and what to suppress. It decides how to measure success — in minutes of engagement, in return visits, in notification clicks — and those decisions have consequences for the children who use the product.

A jury of residents from Santa Fe County, deliberating for less than a day, looked at the evidence Meta's own employees produced and concluded that the company knew what it was doing and did it anyway. They put a number on it: $375 million. The number is small. The finding is not.

There are 2,406 cases left. Meta's annual revenue is $201 billion. The ratio between the penalty and the profit is the clearest measure of what this verdict means and what it does not. It is a warning shot. The question is whether anyone at One Hacker Way is listening.

-- ANNA WEBER, Santa Fe

Sources & X Posts

News Sources
[1] PBS NewsHour / Associated Press, "Jury finds Meta's platforms are harmful to children in 1st wave of social media addiction lawsuits," March 24, 2026: https://www.pbs.org/newshour/nation/jury-finds-metas-platforms-are-harmful-to-children-in-1st-wave-of-social-media-addiction-lawsuits
[2] Meta Platforms Inc., Fourth Quarter and Full Year 2025 Results, January 28, 2026: https://investor.atmeta.com/investor-news/press-release-details/2026/Meta-Reports-Fourth-Quarter-and-Full-Year-2025-Results/default.aspx
[3] Channel News Asia, "Jury orders Meta to pay $375 million in New Mexico lawsuit over child sexual exploitation, user safety," March 24, 2026: https://www.channelnewsasia.com/business/jury-orders-meta-pay-375-million-in-new-mexico-lawsuit-over-child-sexual-exploitation-user-safety-6014736
[4] New York Attorney General, "Attorney General James and Multistate Coalition Sue Meta for Harming Youth," October 24, 2023: https://ag.ny.gov/press-release/2023/attorney-general-james-and-multistate-coalition-sue-meta-harming-youth
[5] NBC News, "Lawyers spar in closing arguments for landmark social media addiction trial," March 12, 2026: https://www.nbcnews.com/tech/social-media/social-media-trial-los-angeles-la-meta-youtube-rcna263063
[6] LA Times, "Meta faces potential billions in fines in trial over children's safety practices," March 23, 2026: https://www.latimes.com/business/story/2026-03-23/meta-faces-potential-billions-in-fines-in-trial-over-childrens-safety-practices
X Posts
[7] The jury found that Meta willfully violated the law and misled the public about the safety of apps used by millions of children every day. https://x.com/lynns_warriors/status/2036566599524585492
[8] META HIT WITH $375M VERDICT IN CHILD SAFETY CASE. A jury ruled Meta must pay $375 million for violating New Mexico law. https://x.com/Newsforce/status/2036571999171125689