Jury Finds Instagram and YouTube Addictive for Minors, Awards $6 Million in Landmark Design-Defect Case
The jury's verdict landed in a downtown Los Angeles courtroom on March 25, but the questions it raises reach far beyond one young woman's story.
$6 million verdict targets platform design
A Los Angeles County Superior Court jury found that Instagram and YouTube were negligently designed in ways that made the platforms addictive for minors and substantially contributed to a California woman's anxiety, depression, body dysmorphia and self-harm when she was a child and teenager. Jurors ordered parent companies Meta Platforms Inc. and Google LLC to pay a total of $6 million in compensatory and punitive damages.
The case, K.G.M. v. Meta Platforms Inc. and others, is widely viewed as the first major test of a legal theory that treats social media not as a neutral host of other people's speech, but as a product whose core design features can be defective and dangerous, especially for young users. The verdict is expected to influence thousands of similar lawsuits pending around the country and intensify political pressure on the industry.
The plaintiff, identified in court papers by her initials and referred to as Kaley in some coverage because she was a minor when the alleged harms began, is now 20. Jurors agreed that design choices on Instagram and YouTube (including infinite scroll, autoplay, engagement-optimizing recommendation algorithms, push notifications, and appearance-focused tools such as beauty filters and likes) were a substantial factor in the mental health problems she developed after she began using the platforms as a child.
The panel awarded $3 million in compensatory damages, then added another $3 million in punitive damages after a brief second phase of trial. Responsibility was split roughly 70-30 between the companies, with Meta liable for about $4.2 million and Google/YouTube for about $1.8 million, according to attorneys involved in the case.
Plaintiff described compulsive use starting in childhood
Kaley testified that she started watching YouTube at age 6 and joined Instagram at 9. By middle school, she said, she was spending hours each day on the two platforms, checking them before school, between classes and late into the night.
"I felt like I couldn't stop," she told jurors, describing a pattern of compulsive use despite recognizing that the apps were making her feel worse. Her lawyers argued that as Instagram's algorithm pushed more appearance-focused content and filters into her feed, she became fixated on how she looked and how many likes she received, which they said fed body dysmorphia and depression. On YouTube, autoplay and "Up next" recommendations kept her watching video after video, often at the expense of sleep and school.
Her attorneys framed the case as a product liability and negligence suit, not a dispute about any particular post or video. They argued that Meta and Google chose to deploy engagement-maximizing designs that they knew, or should have known, were especially risky for adolescents.
"This case was never about one bad video," plaintiff's lawyer Mark Lanier said outside the courthouse after the verdict. "It was about an architecture: the way these platforms are engineered to grab and hold a child's attention in ways their developing brains are not equipped to resist."
Meta and Google plan to appeal
Meta and Google have said they will appeal.
A Meta spokesperson said the company "respectfully disagrees with the verdict" and pointed to tools it has added in recent years to give parents more control and to nudge teens toward breaks and different content. During the trial, Meta emphasized research suggesting that many young people report positive or neutral experiences on social media and argued that Kaley's struggles were shaped by a range of factors, including offline stress and preexisting vulnerabilities.
Meta Chief Executive Mark Zuckerberg testified live before the Los Angeles jury in February. He acknowledged the company had been "slow to implement controls" to keep children under 13 off Instagram, saying, "I wish we could've gotten there sooner." But he rejected the characterization of the apps as addictive in a clinical sense.
"There's a difference between someone using a service a lot and addiction," Zuckerberg said on the stand. "We see a lot of people using our products in ways they find meaningful and positive."
Instagram chief Adam Mosseri, in earlier testimony, similarly told jurors he did not equate what critics call "social media addiction" with medically recognized addiction to substances or gambling.
Google, which owns YouTube, has sought to distinguish its flagship video service from friend-based social networks.
"This decision mischaracterizes YouTube as a social media site," company spokesperson Jose Castañeda said after the verdict. "YouTube is a responsibly built streaming platform, and we provide robust parental controls, dedicated kids' experiences and tools for families to manage how their children watch."
A strategy aimed at sidestepping Section 230
The Los Angeles case is being closely watched by lawyers and policymakers because it sidesteps Section 230 of the Communications Decency Act, the federal law that has long shielded online platforms from liability for content posted by users. Instead of arguing that Meta or Google should be held responsible for harmful speech, Kaley's legal team targeted the underlying design (algorithms, interface mechanics and notifications) as the companies' own conduct.
A federal judge in the Northern District of California, overseeing a consolidated proceeding known as In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, has previously allowed many such design-defect and failure-to-warn claims to go forward against Meta, Google, TikTok, Snap and others. The Los Angeles verdict is the first time a jury has weighed in on those theories against major platforms.
The decision came one day after a separate jury in Santa Fe, New Mexico, ordered Meta to pay $375 million in civil penalties in a lawsuit brought by the stateâs attorney general. That jury found that the company violated New Mexicoâs Unfair Practices Act by misleading the public about the risks of child sexual exploitation and other harms on its platforms.
Together, the back-to-back losses mark one of the most significant legal setbacks Meta has faced in more than a decade. They also arrive amid mounting concern from public health officials about youth mental illness.
Youth mental health debate intensifies
In a 2023 advisory, U.S. Surgeon General Vivek Murthy warned that adolescents who spend more than three hours a day on social media face "double the risk" of experiencing symptoms of depression and anxiety compared with lighter users. The advisory cited surveys in which more than one-third of teenage girls said they felt "addicted" to social media, and it urged policymakers to consider measures such as stronger safety standards for platforms, data transparency and limits on certain features for minors.
Nearly all U.S. teens use at least one social platform, according to federal data, and many report spending several hours a day on them. That ubiquity has made it difficult to draw direct causal lines between specific design features and individual mental health outcomes, an issue Meta and Google highlighted throughout the Los Angeles trial. But Kaley's lawyers and their experts argued that the pattern of her use, beginning in early childhood and intensifying through adolescence, matched what researchers have described as a particularly sensitive period for social comparison and reward-seeking.
What happens next
The verdict is not the final word. Appellate courts will be asked to decide whether claims that infinite scroll and autoplay are "defects" truly fall outside Section 230's protections and whether the trial judge's instructions on causation and duty to warn were proper. Technology trade groups are also pursuing separate constitutional challenges to state laws that seek to regulate minors' use of social media, arguing that many proposals infringe on free speech.
Still, the Los Angeles case has already changed the conversation for an industry long accustomed to fending off lawsuits by pointing to user content and parental responsibility. For plaintiffs' lawyers representing thousands of other families and school districts in the federal multidistrict litigation, it offers a tested narrative, a set of exhibits and a verdict form that another jury has now endorsed.
For Kaley, the outcome is more personal. "I can't get back being a kid," she told the court in the final days of the trial. What happens next, in appellate courts, legislatures and corporate boardrooms, will help determine whether other young users grow up with the same kinds of feeds, or whether the design of the platforms themselves begins to change.