Meta and YouTube Found Negligent in Landmark Social Media Addiction Trial

Jury delivers verdict in civil trial accusing social media giants of designing dangerously addictive products targeting children.

A jury in Oakland, California, ruled on Tuesday that Meta and YouTube were negligent in designing their platforms in ways that contributed to a teenager's social media addiction and resulting mental health problems, delivering what legal experts called a landmark verdict that could reshape how technology companies approach product design for young users.

The case, brought by the family of a now-17-year-old plaintiff identified only as J.S. in court documents, alleged that features such as infinite scroll, push notifications, autoplay videos, and algorithmically curated content feeds were deliberately engineered to maximize engagement without adequate regard for the psychological well-being of minors.

After three weeks of testimony and two days of deliberation, the jury found that both companies had failed to exercise reasonable care in designing products they knew would be used extensively by children and adolescents. The panel awarded compensatory damages, though the specific amount will be determined in a subsequent phase of the trial.

The verdict marks the first time a jury has held major social media companies liable for harm caused by the addictive qualities of their core product features. Previous lawsuits targeting technology companies over youth mental health issues had been dismissed or settled before reaching trial, making the outcome a significant legal milestone.

Attorneys for the plaintiff presented extensive internal documents from both companies showing that employees had raised concerns about the impact of engagement-maximizing features on younger users. The documents included research conducted by Meta's own teams that found teenagers reported feeling unable to control their usage of Instagram, as well as internal YouTube analyses showing that the platform's recommendation algorithm disproportionately served emotionally provocative content to younger viewers.

Lawyers for Meta argued that the company had taken substantial steps to protect younger users, including implementing time-limit reminders and restricting certain features for accounts belonging to minors. YouTube's defense team maintained that the platform provided robust parental controls and that responsibility for monitoring children's internet use rested primarily with families.

The jury was not persuaded by these arguments. In post-verdict interviews, several jurors said they found the internal documents particularly compelling, noting the apparent gap between what the companies knew about risks to young users and the actions they took in response.

Child safety advocates hailed the verdict as a turning point. Jim Steyer, founder of Common Sense Media, said the ruling sent an unmistakable message that technology companies could no longer treat children's mental health as an acceptable cost of doing business.

The decision is expected to have far-reaching implications for hundreds of similar lawsuits pending against social media companies in federal and state courts across the country. Many of those cases make comparable claims about addictive design features and their impact on young users.

Both Meta and YouTube indicated they would appeal the verdict. A spokesperson for Meta said the company disagreed with the jury's conclusions and believed the evidence did not support a finding of negligence. YouTube's parent company, Alphabet, issued a statement expressing disappointment and reaffirming its commitment to youth safety.

Legal analysts noted that even if the verdict is overturned on appeal, the trial itself has brought unprecedented public scrutiny to the internal decision-making processes of major technology companies regarding youth safety.

Originally reported by NBC Business.