A Los Angeles courtroom is about to become a stress test for Silicon Valley’s favorite legal shields, and a 19-year-old known only as KGM is at the center of it.

What You Should Know

Opening statements began February 9th, 2026, in a Los Angeles County Superior Court bellwether trial accusing Meta and YouTube of addicting and harming children through product design. TikTok and Snap, originally named as defendants, settled for undisclosed sums.

The case targets Meta, the owner of Instagram and Facebook, and Google, the owner of YouTube, with plaintiffs arguing the platforms’ features were built to keep minors hooked. The stakes go well beyond one family: This is the first time these child social media addiction claims are being put in front of a jury, and both sides know it.

The label on this one matters. Lawyers call it a bellwether, which is courtroom shorthand for a test case that can signal how juries might treat a larger pile of similar claims. If the plaintiffs land blows here, it can shape settlement pressure, trial strategy, and how the companies talk about risk. If the defense shuts it down, it can chill the whole wave.

A Trial Built for Leverage

The plaintiffs’ attorney, Mark Lanier, opened with a theme designed to fit on a billboard and stick in jurors’ heads. He told the jury the case was “easy as ABC,” defining it as “addicting the brains of children.” He described Meta and Google as “two of the richest corporations in history” that have “engineered addiction in children’s brains,” according to Associated Press reporting published by PBS NewsHour.

That framing does two things at once. First, it drags the fight away from messy internet content debates and into product design, the nuts-and-bolts choices companies make to drive usage. Second, it puts money and intent on the table, because jurors tend to understand profit motives faster than they understand algorithmic nuance.

The defendants, for their part, are aiming for the reverse: This is about individual circumstances, parental choices, and third-party content. And, crucially, it is about whether the law even allows these claims to stick in the first place.

The Shield Everyone Keeps Talking About

For years, major platforms have leaned on a combination of First Amendment arguments and Section 230 of the Communications Decency Act, a federal provision that can limit liability for content posted by users. Plaintiffs in cases like this have been trying to thread a needle: If they can persuade courts and juries that the harm came from a platform’s own product design choices, not just what users posted, the usual defenses can look less ironclad.

In the Los Angeles case, the plaintiffs argue that design features were deliberately tuned to maximize youth engagement, which, in turn, boosted ad revenue. The lawsuit language, as reported by AP, compares the approach to the behavioral techniques used by slot machines and echoes lessons from tobacco industry litigation.

That comparison is not subtle. The modern pitch is that the platforms were not just passive hosts; they were active architects.

Public Promises vs. Private Paper

One of the sharpest pressure points in these cases is the gap between what companies say publicly and what plaintiffs say internal documents show. In court, Lanier argued that, while Meta and YouTube publicly emphasize child safety and safeguards, internal records tell a different story about how younger users were discussed as an audience and how engagement features were treated as business drivers, according to AP’s account.

The fight over what those internal documents show, what they mean, and what context is missing is where the case can turn from abstract to personal. If jurors believe internal discussions reveal one set of priorities, while public statements present another, the plaintiffs’ narrative starts to feel less like speculation and more like a paper trail.

Lanier also pushed a specific mechanism, arguing that “like” buttons and similar features are not neutral. He told jurors, “For a teenager, social validation is survival,” and claimed the platforms “engineered a feature that caters to a minor’s craving for social validation,” per AP.

That is a heavy claim, but it is also a carefully chosen one. It invites jurors to see platform features as behavior-shaping levers, not just cute interface details.

What the Companies Say Back

Meta and Google dispute the idea that their products were deliberately built to harm kids. They point to safety tools and policy changes added over time, and they argue they should not be held responsible for third-party content on their platforms, according to AP’s reporting.

Meta has said it strongly disagrees with the allegations. In a statement cited by AP, a Meta spokesperson said the company is “confident the evidence will show our longstanding commitment to supporting young people.”

Google’s YouTube has been similarly direct. Jose Castaneda, identified by AP as a Google spokesperson, said the allegations are “simply not true,” adding, “Providing young people with a safer, healthier experience has always been core to our work.”

In other words, both companies are telling jurors to judge them by their stated mission and their stated safeguards, not by the plaintiffs’ interpretation of product incentives.

The Judge’s Warning Says a Lot About Modern Life

There is also an unusual detail that hints at how hard it is to run a social media trial in 2026 without the outside world leaking into the jury room. Judge Carolyn B. Kuhl instructed jurors not to change how they use social media during the trial, including not changing settings or creating new accounts, according to AP.

That instruction reads like a simple guardrail. It is also a quiet admission that, for many jurors, these platforms are not occasional websites. They are daily infrastructure.

A Broader Wave Is Building

The Los Angeles case is not the only courtroom where platforms are being forced to explain themselves. AP reported that another trial in New Mexico was set to begin with opening arguments on February 9th, 2026, involving allegations that Meta failed to protect young users from sexual exploitation following an undercover investigation. AP also reported a federal bellwether trial expected to begin in June in Oakland, California, involving school districts suing over harms to children.

Then there is the multistate pressure. AP reported that more than 40 state attorneys general have filed lawsuits against Meta, alleging its products harm young people and contribute to a youth mental health crisis through addictive design. TikTok, AP reported, faces similar lawsuits in more than a dozen states.

That matters because it changes the power balance. A single private lawsuit can be framed as an outlier. A stack of cases filed by states and school districts looks more like a political and legal campaign to force a rewrite of how platforms operate around minors.

Why This One Plaintiff Matters

KGM, the 19-year-old plaintiff identified only by initials, is positioned as a bellwether for thousands of other claims. If jurors buy the argument that a product was intentionally engineered to addict minors, the next question becomes damages, and the question after that becomes how companies change design without conceding liability.

If jurors reject it, the defense gets something close to a proof-of-concept that these cases sound better in press releases than under oath.

Either way, the testimony list itself signals the seriousness. AP reported that executives, including Meta CEO Mark Zuckerberg, are expected to testify in a trial projected to last six to eight weeks.

The Real Contest: Design vs. Responsibility

Strip away the slogans, and the trial is really about attribution. Who is responsible for what happens to minors on these platforms, and which parts of the experience count as the companies’ own conduct?

Plaintiffs want the jury focused on the levers: notifications, feeds, feedback loops, and the business logic of maximizing time on the platform. The companies want the jury focused on the limits: tools offered, choices made by users and families, and the long-standing legal protections that treat platforms differently from publishers.

Outside the courtroom, the argument is bleeding into policy, too. AP reported that other countries have moved toward stricter age rules, including a French measure approved in January to ban social media for children under 15 and an Australian ban for those under 16 that led platforms to remove millions of accounts identified as belonging to children.

The contradiction hanging over all of it is simple: Platforms insist they are building safer experiences for young users, while a growing number of plaintiffs, states, and institutions are asking courts to treat those same experiences as hazardous products.

What to watch next is not just the verdict. It is which evidence the jury seems to reward: internal documents about growth and engagement, or the defendants’ narrative that safeguards and user choice cut off liability. Once a jury draws that line in public, the rest of the country tends to start tracing it.
