Jury Finds Instagram and YouTube Addictive for Minors, Awards $6 Million in Landmark Design-Defect Case

The jury’s verdict landed in a downtown Los Angeles courtroom on March 25, but the questions it raises reach far beyond one young woman’s story.

$6 million verdict targets platform design

A Los Angeles County Superior Court jury found that Instagram and YouTube were negligently designed in ways that made the platforms addictive for minors and substantially contributed to a California woman’s anxiety, depression, body dysmorphia and self-harm when she was a child and teenager. Jurors ordered parent companies Meta Platforms Inc. and Google LLC to pay a total of $6 million in compensatory and punitive damages.

The case, K.G.M. v. Meta Platforms Inc. and others, is widely viewed as the first major test of a legal theory that treats social media not as a neutral host of other people’s speech, but as a product whose core design features can be defective and dangerous—especially for young users. The verdict is expected to influence thousands of similar lawsuits pending around the country and intensify political pressure on the industry.

The plaintiff, identified in court papers by her initials because she was a minor when the alleged harms began, and referred to as Kaley in some coverage, is now 20. Jurors agreed that design choices on Instagram and YouTube—including infinite scroll, autoplay, engagement-optimizing recommendation algorithms, push notifications, and appearance-focused tools such as beauty filters and likes—were a substantial factor in the mental health problems she developed after she began using the platforms as a child.

The panel awarded $3 million in compensatory damages, then added another $3 million in punitive damages after a brief second phase of trial. Responsibility was split roughly 70-30 between the companies, with Meta liable for about $4.2 million and Google/YouTube for about $1.8 million, according to attorneys involved in the case.

Plaintiff described compulsive use starting in childhood

Kaley testified that she started watching YouTube at age 6 and joined Instagram at 9. By middle school, she said, she was spending hours each day on the two platforms, checking them before school, between classes and late into the night.

“I felt like I couldn’t stop,” she told jurors, describing a pattern of compulsive use despite recognizing that the apps were making her feel worse. Her lawyers argued that as Instagram’s algorithm pushed more appearance-focused content and filters into her feed, she became fixated on how she looked and how many likes she received, which they said fed body dysmorphia and depression. On YouTube, autoplay and “Up next” recommendations kept her watching video after video, often at the expense of sleep and school.

Her attorneys framed the case as a product liability and negligence suit, not a dispute about any particular post or video. They argued that Meta and Google chose to deploy engagement-maximizing designs that they knew, or should have known, were especially risky for adolescents.

“This case was never about one bad video,” plaintiff’s lawyer Mark Lanier said outside the courthouse after the verdict. “It was about an architecture—the way these platforms are engineered to grab and hold a child’s attention in ways their developing brains are not equipped to resist.”

Meta and Google plan to appeal

Meta and Google have said they will appeal.

A Meta spokesperson said the company “respectfully disagrees with the verdict” and pointed to tools it has added in recent years to give parents more control and to nudge teens toward breaks and different content. During the trial, Meta emphasized research suggesting that many young people report positive or neutral experiences on social media and argued that Kaley’s struggles were shaped by a range of factors, including offline stress and preexisting vulnerabilities.

Meta Chief Executive Mark Zuckerberg testified live before the Los Angeles jury in February. He acknowledged the company had been “slow to implement controls” to keep children under 13 off Instagram, saying, “I wish we could’ve gotten there sooner.” But he rejected the characterization of the apps as addictive in a clinical sense.

“There’s a difference between someone using a service a lot and addiction,” Zuckerberg said on the stand. “We see a lot of people using our products in ways they find meaningful and positive.”

Instagram chief Adam Mosseri, in earlier testimony, similarly told jurors he did not equate what critics call “social media addiction” with medically recognized addiction to substances or gambling.

Google, which owns YouTube, has sought to distinguish its flagship video service from friend-based social networks.

“This decision mischaracterizes YouTube as a social media site,” company spokesperson Jose Castañeda said after the verdict. “YouTube is a responsibly built streaming platform, and we provide robust parental controls, dedicated kids’ experiences and tools for families to manage how their children watch.”

A strategy aimed at sidestepping Section 230

The Los Angeles case is being closely watched by lawyers and policymakers because it sidesteps Section 230 of the Communications Decency Act, the federal law that has long shielded online platforms from liability for content posted by users. Instead of arguing that Meta or Google should be held responsible for harmful speech, Kaley’s legal team targeted the underlying design—algorithms, interface mechanics and notifications—as the companies’ own conduct.

A federal judge in the Northern District of California, overseeing a consolidated proceeding known as In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, has previously allowed many such design-defect and failure-to-warn claims to go forward against Meta, Google, TikTok, Snap and others. The Los Angeles verdict is the first time a jury has weighed in on those theories against major platforms.

The decision came one day after a separate jury in Santa Fe, New Mexico, ordered Meta to pay $375 million in civil penalties in a lawsuit brought by the state’s attorney general. That jury found that the company violated New Mexico’s Unfair Practices Act by misleading the public about the risks of child sexual exploitation and other harms on its platforms.

Together, the back-to-back losses mark one of the most significant legal setbacks in Meta's two-decade history. They also arrive amid mounting concern from public health officials about youth mental illness.

Youth mental health debate intensifies

In a 2023 advisory, U.S. Surgeon General Vivek Murthy warned that adolescents who spend more than three hours a day on social media face “double the risk” of experiencing symptoms of depression and anxiety compared with lighter users. The advisory cited surveys in which more than one-third of teenage girls said they felt “addicted” to social media, and it urged policymakers to consider measures such as stronger safety standards for platforms, data transparency and limits on certain features for minors.

Nearly all U.S. teens use at least one social platform, according to federal data, and many report spending several hours a day on them. That ubiquity has made it difficult to draw direct causal lines between specific design features and individual mental health outcomes—an issue Meta and Google highlighted throughout the Los Angeles trial. But Kaley’s lawyers and their experts argued that the pattern of her use, beginning in early childhood and intensifying through adolescence, matched what researchers have described as a particularly sensitive period for social comparison and reward-seeking.

What happens next

The verdict is not the final word. Appellate courts will be asked to decide whether claims that infinite scroll and autoplay are “defects” truly fall outside Section 230’s protections and whether the trial judge’s instructions on causation and duty to warn were proper. Technology trade groups are also pursuing separate constitutional challenges to state laws that seek to regulate minors’ use of social media, arguing that many proposals infringe on free speech.

Still, the Los Angeles case has already changed the conversation for an industry long accustomed to fending off lawsuits by pointing to user content and parental responsibility. For plaintiffs’ lawyers representing thousands of other families and school districts in the federal multidistrict litigation, it offers a tested narrative, a set of exhibits and a verdict form that another jury has now endorsed.

For Kaley, the outcome is more personal. “I can’t get back being a kid,” she told the court in the final days of the trial. What happens next—in appellate courts, legislatures and corporate boardrooms—will help determine whether other young users grow up with the same kinds of feeds, or whether the design of the platforms themselves begins to change.

Tags: #socialmedia, #metaplatforms, #google, #youthmentalhealth, #section230