Courts Find Meta Responsible For Teen Harms; Legal Wave Could Follow
- Andrej Botka
- Apr 1
- 3 min read

Meta suffered two major courtroom setbacks this month, as judges and juries separately concluded the company’s product choices contributed to harm among young people — a legal turn that could trigger a surge of cases and force changes in how social apps are built. In New Mexico, a judge found Meta in violation of the state’s consumer-protection law after a six-week trial, and a Los Angeles jury determined the company knowingly engineered features that hooked children and teenagers, contributing to the mental health injury of a young plaintiff. The twin rulings shift the focus of litigation away from user speech and toward the platforms’ engineering decisions.
The New Mexico judgment ordered Meta to pay $5,000 for each violation named in the complaint, a penalty that, across 75,000 violations, added up to $375 million. In the Los Angeles civil trial, jurors assigned Meta 70 percent of the blame for the plaintiff's distress and the remaining 30 percent to YouTube, resulting in combined damages of $6 million; other defendants, including Snap and TikTok, reached settlements before the trial began. Meta said it will appeal both outcomes and warned that mental-health issues among teens are complex and not driven by a single cause, while also stressing that many young people find connection online.
Evidence unsealed during the suits painted a picture of product teams prioritizing ways to increase teen engagement, at times discussing tactics to avoid notice from parents and teachers. Internal research from 2019 — based on two dozen one-on-one interviews — flagged that roughly one-eighth of users showed problematic usage patterns and described the platform’s effect on well-being as negative. Emails and memos revealed internal debates about optimizing features to increase daily opens and retention, and some employees wrote lighthearted notes about designing around times when teens would be likely to check their phones during class.
Meta pushed back, noting much of the material stems from nearly a decade ago and saying the company has adopted protections aimed at younger users. A spokesperson pointed to Instagram’s 2024 “Teen Accounts,” which default to private, limit who can mention or tag underage users, and prompt time-limit reminders after 60 minutes — controls that, for users under 16, require parental approval to override. The company also asserted it no longer sets targets based on teens’ time spent on its apps.
The legal pressure arrives amid a broader policy fight over children’s online safety. Internal disclosures first surfaced in 2021, when a former employee released documents suggesting Instagram had adverse effects on adolescent girls. Since then, lawmakers have drafted multiple proposals, including the Kids Online Safety Act, aimed at curbing harms but drawing criticism from civil-liberties advocates who say certain versions risk broad censorship or intrusive age verification. Kelly Stonelake, a former Meta marketing director now pursuing litigation against the company alleging discrimination, said the unsealed files mirror her experiences and urged lawmakers to reject any bill that would strip states and families of legal recourse.
Legal analysts say these decisions could encourage plaintiffs to pursue liability based on product design rather than content moderation, a strategy that proved successful in these cases. If courts consistently accept that line of argument, even modest damages could multiply across the thousands of lawsuits already filed and the roughly 40 state attorney general actions pending, producing financial exposure that would capture corporate attention. Appeals are certain, and observers expect an extended period of legal fights, regulatory scrutiny and possible product changes as companies weigh how to respond.