Last week, a Los Angeles jury acted on mounting evidence that public health advocates — and many health journalists — have been observing for years: It held social media companies legally responsible for harm to a young user’s mental health.
Jurors found that platforms owned by Meta Platforms and Google (via YouTube) were negligent in designing products that contributed to depression, anxiety and other harms, and failed to adequately warn users. The jury awarded $6 million in damages. Both companies are evaluating their legal options, including an appeal, the New York Times reported.
The verdict followed a March 24 ruling by a New Mexico jury that found Meta liable for violating state law by failing to safeguard users of its apps from child predators. That jury determined Meta should pay $375 million in damages.
Social media and mental health
For years, health reporters have wrestled with a familiar tension: mixed evidence, nuanced findings and no clear causal link between social media use and youth mental health.
As we previously wrote, the data on how much social media is “OK” for kids are messy, context-dependent and often misinterpreted.
But courtrooms operate differently than academic journals. In this case, jurors concluded that platform design was a “substantial factor” in harm and that companies knew enough about its negative effects to warrant warnings.
While that doesn’t settle the science, it does change the stakes of how evidence is interpreted and used. The ruling, which will likely set a precedent allowing similar cases to proceed, has expanded reporting opportunities to include how that research is being translated into legal standards of harm, liability and risk.
The plaintiffs in the case argued that features like infinite scroll and autoplay were engineered to maximize engagement and foster compulsive use, which echoes arguments once used against cigarette makers.
In an earlier post, we explored whether warning labels on social media could follow a similar path. This verdict suggests that idea is no longer hypothetical — it’s part of an emerging regulatory and legal framework.
We are likely entering a phase where “addictive design” is treated as a public health issue, platform features are framed as risk factors and industry responsibility becomes central to the story.
This was a “bellwether” trial, meaning a test case for thousands more lawsuits already in the pipeline.
More cases to come
These verdicts illustrate a growing shift in the public’s perception of social media companies and their responsibilities in keeping young users safe, the Associated Press reported. They also may shape how thousands of additional lawsuits play out.
The Los Angeles trial is one of several planned for this year, according to Bloomberg, which noted that over 3,000 cases brought by children, adolescents and young adults — sometimes via their family members — claiming psychological distress, physical impairment and death have been filed against Meta, Google, Snap and TikTok nationwide. Two additional personal injury cases are expected to go to trial in Los Angeles state court, the article said.
Dozens of state attorneys general also are suing the companies, with a case against Meta set for federal court in Oakland, Calif., in August, Bloomberg reported. Public school districts also have brought more than 1,200 complaints on students’ behalf. The first of those trials is scheduled for Oakland in June, the news report said. The school districts claim the companies created a “public nuisance” by distracting children and undermining their education, fueling a youth mental health crisis.
There are options to regulate social media, wrote Daniel Katz, a clinical psychologist in Cambridge, Mass., in a March 25 editorial for the Los Angeles Times. Willpower alone “is unlikely to be sufficient in breaking habitual social media use that has been engineered and reinforced,” he wrote, noting the idea that we can quit anytime is a myth.
As in Australia, which restricts those under age 16 from holding accounts on platforms such as TikTok and Snapchat, our government could impose age restrictions, he said. It also could set design limits, and “companies could be required to disclose how they track and manipulate engagement metrics.”
Shifting risk reporting from screen time to product design
One of the biggest limitations in past health stories — including much of the research — is the emphasis on time spent. But this landmark case (and, we suspect, many of the lawsuits that will follow) will focus instead on how platforms are built.
Not all screen time is equal. Not all platforms are equal. And increasingly, not all features are neutral. That aligns with what the data already suggest. Harms are unevenly distributed (by age, gender, preexisting vulnerability) and experiences (comparison, harassment, algorithmic amplification) matter more than raw hours. Design choices can shape those experiences.
Two things can be true at once: Social media is not uniformly harmful for all kids, and some platform designs may meaningfully increase risk for some users.
What comes next will likely include more litigation with competing expert testimony, legislative efforts (e.g., youth safety laws), industry-funded research and counter-narratives, and intensified debates about causality and responsibility.
This means health journalists will need to move beyond “How many hours?” stories and toward reporting that interrogates algorithmic recommendation systems, engagement loops, content amplification pathways and platform-specific risks for their health effects.
This verdict doesn’t answer the question of how social media affects youth mental health. But it signals that society — including the legal system — is no longer willing to treat that question as purely academic.
Resources
- Meta and YouTube Found Negligent in Landmark Social Media Addiction Case – New York Times
- How much social media is ‘OK’ for kids? What the data really say – AHCJ blog post
- Warning labels could help regulate social media. But will it make us healthier? – AHCJ blog post
- How a social media trial verdict threatens big tech – Bloomberg
- Parents see hope in back-to-back rulings that social media providers failed to protect young users – AP
- Social media platforms aren’t the new cigarettes. They’re worse – Los Angeles Times