A Jury Just Told Meta and YouTube Their Platforms Were Designed to Addict Kids. Now Hundreds of Cases Are Waiting.
A Los Angeles jury awarded $6 million to a young woman who sued Meta and Google over childhood social media addiction, finding both companies acted with "malice, oppression, or fraud." The verdict could reshape how tech companies design products for young users.
A 20-year-old Californian, identified as Kaley, has won what appears to be the first jury verdict holding major social media companies liable for deliberately engineering addictive platforms that damaged a child's mental health. The BBC reported that jurors awarded her $3 million in compensatory damages and an additional $3 million in punitive damages, splitting liability between Meta (70%) and Google (30%). Both companies have said they will appeal.
The ruling doesn't just matter for Kaley. Hundreds of similar lawsuits filed by parents, teens, and school districts are currently moving through US courts. This verdict hands them a template — and a precedent that will be extremely difficult for Silicon Valley to ignore.
What the Jury Found
The core allegation was straightforward: Meta (which owns Instagram and Facebook) and Google (which owns YouTube) intentionally built their platforms to be addictive, and those design choices caused severe psychological harm to a young user. Kaley and her mother originally filed suit in 2023 against Meta, YouTube, Snap, and TikTok, claiming that compulsive use of those platforms from an extremely young age caused serious damage to her mental health.
Snap and TikTok settled out of court. After Meta and Google failed to get the case dismissed in November, the two largest platforms were left to face a jury.
The jury's finding of "malice, oppression, or fraud" is significant. It's what triggered the punitive damages — money meant not to compensate Kaley but to punish the companies. Punitive damages signal that the jury believed Meta and Google didn't just make negligent design choices. They believed the companies knew what they were doing and did it anyway.
Meta pushed back in a statement: "Teen mental health is profoundly complex and cannot be linked to a single app." Google's spokesperson took a different tack, arguing that "this case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site."
That distinction — streaming platform versus social media — may become a key battleground in Google's appeal.
Zuckerberg on the Stand
One of the trial's most closely watched moments came when Mark Zuckerberg testified in person. According to WIRED's courtroom reporting, the Meta CEO arrived at the Superior Court of Los Angeles County escorted by a security detail that included Department of Homeland Security officers. The courtroom was packed with spectators and media.
Zuckerberg's testimony was notable for what it didn't say. WIRED described his approach as "a playbook of repetitive answers and buzzwords," sticking to safe, rehearsed language rather than engaging substantively with questions about whether Meta's products were engineered for maximum engagement at the expense of younger users' wellbeing. It was a strategy designed to avoid creating soundbites that could be used against Meta in this case or the hundreds that follow.
The plaintiff's legal team had to work against this wall of corporate caution, pressing Zuckerberg on internal company decisions about engagement-boosting strategies and their known effects on teens. The fact that a sitting CEO of one of the world's most valuable companies was compelled to testify in person — rather than via deposition or video — underscored how seriously the court treated the claims.
The Wave Behind the Verdict
This trial didn't emerge in isolation. Hundreds of parents, teens, and school districts across the US have filed claims alleging that social media platforms are intentionally addictive and harmful. The cases have been consolidating in various courts, and the Kaley verdict is the first to reach a jury decision against major platforms.
The legal theory at the heart of these cases treats addictive platform design as a product liability issue. Just as a car manufacturer can be held liable for a defective braking system, plaintiffs argue that features like infinite scroll, autoplay, algorithmic recommendation engines, and notification systems constitute defective — or deliberately harmful — product design when aimed at minors.
This framing sidesteps Section 230 of the Communications Decency Act, which shields platforms from liability for user-generated content. The lawsuits aren't about what users posted. They're about how the platforms were built and whether those design choices were made with knowledge that they would hook young users.
The $6 million award itself is modest by Big Tech standards; Meta's annual revenue exceeds $100 billion. But the verdict's real financial threat is multiplicative: if hundreds of similar cases follow the same playbook and reach similar outcomes, the cumulative exposure becomes substantial. And punitive damages in future cases could scale significantly if courts determine the companies failed to change their practices after being put on notice.
What Changes Now
The immediate practical impact depends on the appeals process. Both Meta and Google have signaled they'll fight the verdict, and appellate courts could narrow or overturn the ruling. But even if the specific award is reduced, the trial has established something that didn't exist before: a jury, presented with evidence about how these platforms work internally, concluded that the design was deliberately harmful to children.
That finding will shape settlement negotiations in the remaining cases. Plaintiffs' attorneys now have a proven courtroom narrative and a damages framework. Companies facing similar suits may calculate that settling is cheaper and less reputationally damaging than putting their CEOs on the stand.
Beyond the courtroom, the verdict adds fuel to ongoing legislative efforts around children's online safety. Federal and state lawmakers have been pushing various bills to restrict how platforms can target and engage minors. A jury verdict that explicitly finds "malice" in platform design gives legislators concrete language to point to.
For the platforms themselves, the calculus around features like Shorts, Reels, and algorithmically driven feeds just shifted. YouTube's defense that it is a "streaming platform, not a social media site" hints at how companies may try to redefine their products to escape liability. Expect more of this semantic maneuvering as cases proceed.
The Bigger Picture
The Kaley verdict arrives at a moment when public patience with self-regulation by tech companies is thin. Years of internal documents, whistleblower testimony, and congressional hearings have painted a consistent picture: platforms understood the risks their products posed to young users and prioritized engagement metrics over safety.
What this trial did was take that narrative out of Senate hearing rooms and put it in front of twelve ordinary people. Those twelve people decided the evidence was convincing enough to warrant not just compensation but punishment.
The parents who gathered outside the courthouse during the trial weren't part of Kaley's lawsuit. They were there because they believe their own children were harmed in the same way. Many of them have their own cases pending. This verdict tells them, and their lawyers, that a jury can be persuaded.
Meta and Google will spend years and significant legal resources trying to contain the fallout. But the fundamental question the jury answered isn't going away: when you design a product to be as engaging as possible and aim it at children, are you responsible for what happens next? In Los Angeles, at least, the answer is yes.