Tech Giants Face Major Legal Setback in Child Safety Lawsuits

Thursday, March 26, 2026 at 6:22 AM

Two landmark jury verdicts have found Meta and Google liable for harming young users, potentially weakening the legal protections that have long shielded social media companies from lawsuits. The cases could reshape how courts interpret Section 230, the federal law that protects online platforms from liability over user content.

Two groundbreaking court decisions against major technology companies could fundamentally change how social media platforms are held accountable for protecting children online.

In California this week, a Los Angeles jury determined that Meta and Google bear responsibility for a young woman’s mental health struggles, including depression and suicidal ideation, after she developed an addiction to Instagram and YouTube during her childhood. The jury awarded $6 million in combined damages against the companies. In New Mexico on Tuesday, another jury ordered Meta to pay $375 million, ruling that the company deceived users about platform safety for minors and allowed sexual exploitation of children to occur.

These decisions open significant cracks in the legal shield that has historically made it difficult to successfully sue technology companies: Section 230 of the Communications Decency Act. This 1996 federal law typically protects online platforms from liability for content created by users. Both legal teams avoided that obstacle by focusing their arguments on how the companies designed their platforms rather than on the content hosted there.

“Courts are increasingly trying to distinguish claims about platform functionality or platform conduct from claims that would really just impose liability for third-party speech,” explained Gregory Dickinson, an assistant professor at the University of Nebraska College of Law who specializes in technology and legal issues.

Both Meta and Google have rejected the allegations and maintain they have implemented measures to safeguard young users.

During pre-trial proceedings, both companies attempted to have the lawsuits dismissed, invoking Section 230 protections. The presiding judges in each case denied these motions, allowing the trials to proceed.

A Meta representative declined to provide additional comment but confirmed the company intends to appeal both verdicts. Google has similarly announced plans to appeal the Los Angeles decision but did not respond to requests for further comment.

These anticipated appeals will likely focus heavily on Section 230 interpretation and could have far-reaching consequences across the technology industry.

Meta, Google, Snap Inc. (Snapchat’s parent company), and ByteDance (TikTok’s parent company) are currently defending against thousands of similar lawsuits in state and federal courts. These cases allege that the companies’ design decisions have contributed to a widespread mental health crisis among teenagers and young adults. More than 2,400 cases have been consolidated before a single federal judge in California, with thousands more grouped together in California state courts.

Legal scholars note that courts have been adopting increasingly restrictive interpretations of Section 230’s liability protections. While several lower courts have ruled that companies’ platform design decisions fall outside the law’s protection, no appellate court has yet issued a definitive ruling on this matter. Appellate court decisions carry more legal weight as they establish precedents that bind other courts.

The implications of an appellate ruling on Section 230 could extend well beyond social media platforms, potentially affecting lawsuits against any online platform that hosts content accessible to children. Currently, more than 130 federal lawsuits are pending against Roblox Corporation, alleging the popular gaming platform failed to protect users from sexual exploitation. Roblox disputes these claims.

“I think the internet is on trial, not social media,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law. “If the theories work, they will be deployed elsewhere.”

Appeals in both cases would initially be heard by state-level appellate courts but could potentially advance to higher courts.

The U.S. Supreme Court has signaled interest in defining Section 230’s scope. In 2023, the court heard arguments in a case involving Google’s YouTube platform but ultimately avoided issuing a definitive ruling on internet companies’ legal protections.

In 2024, the Supreme Court declined to hear a Texas teenager’s attempt to revive his lawsuit against Snap, alleging the company failed to protect underage users from sexual predators. However, two conservative justices, Clarence Thomas and Neil Gorsuch, dissented from this decision and warned about continued delays in addressing the issue. “Social-media platforms have increasingly used (Section) 230 as a get-out-of-jail free card,” they wrote in their dissent.

Meetali Jain, director of the Tech Justice Law Project, which pursues litigation against technology companies, believes the U.S. Supreme Court may now be prepared to examine Section 230’s scope more closely.

“I personally think that the Supreme Court is even ready for a case like this, for the right case,” Jain said.
