Social media companies face legal reckoning over mental health harms to children
San Francisco: For years, social media companies have disputed allegations that they harm children’s mental health through deliberate design choices that addict kids to their platforms and fail to protect them from sexual predators and dangerous content.
Now, these tech giants are getting a chance to make their case in courtrooms around the country, including before a jury for the first time.
Some of the biggest players, from Meta to TikTok, are facing federal and state trials that seek to hold them responsible for harming children’s mental health. The lawsuits have come from school districts, local, state and federal governments, and thousands of families.
Two trials are now underway in Los Angeles and in New Mexico, with more to come. The courtroom showdowns are the culmination of years of scrutiny of the platforms over child safety and over whether deliberate design choices make them addictive and serve up content that leads to depression, eating disorders or suicide.
Experts see the reckoning as reminiscent of cases against tobacco and opioid makers, and the plaintiffs hope the social media platforms will face outcomes similar to those of cigarette makers and of drug companies, pharmacies and distributors.
The outcomes could challenge the companies’ First Amendment shield and Section 230 of the 1996 Communications Decency Act, which protects tech companies from liability for material posted on their platforms. They could also prove costly in legal fees and settlements. And they could force the companies to change how they operate, potentially costing them users and advertising dollars.
Jurors in a landmark social media case seeking to hold tech companies responsible for harms to children got their first glimpse of what will be a lengthy trial characterised by duelling narratives from the plaintiffs and the two remaining defendants, Meta and YouTube.