Meta, TikTok, and Google Cannot Evade Schools’ Lawsuit Alleging Harm to Minors, Court Rules

In this photo illustration, the TikTok and Facebook apps, among others, are displayed on a phone in New York City on March 13, 2024. (Michael M. Santiago/Getty Images)

A federal court in California has allowed school districts to sue social media companies, including Facebook and TikTok, over accusations that the platforms harm children’s mental health, forcing the districts to spend more resources addressing those issues among their students.

The judgment came as part of a multidistrict litigation that consolidated hundreds of complaints filed by school districts, local government entities, and state attorneys general accusing social media platforms of harming children. The defendants in the case are Meta’s Facebook and Instagram, Google’s YouTube, ByteDance’s TikTok, and Snap’s Snapchat.

The defendants filed a motion in the U.S. District Court for the Northern District of California seeking to dismiss the complaints from the school districts and government entities. In a court order issued on Oct. 24, Judge Yvonne Gonzalez Rogers allowed certain claims related to harm to minors to proceed.

Plaintiffs allege that the defendants “deliberately designed their social media platforms to foster compulsive use and addiction in minors, whose mental and physical health deteriorated,” the order noted.

The school districts claim that the platforms’ algorithms forced them to expend “substantial financial resources to mitigate the mental health and consequent behavioral issues their students suffer as a result of social media addiction.”

Furthermore, the school districts said the platforms failed to implement robust age verification, did not provide effective parental controls or parental notifications, failed to create adequate processes for users to report suspected child sexual abuse material, published minors’ geolocation information, recommended minors’ accounts to adult strangers, and used algorithms to promote “addictive engagement.”

The students’ overuse of social media resulted in “significant disruption” to school operations, impeding the districts’ ability to educate children in a safe and secure manner, the districts argued.

Forty-one percent of school districts added staff to focus on student mental health, 46 percent created or expanded mental health programs for students, 27 percent added student classes on social, emotional, and mental health, and 56 percent offered teachers professional development to help them deal with students facing mental health issues, the order noted.

School districts claimed that social media platforms “deliberately targeted school-aged children with knowledge of the impact their conduct could have on schools.”

The social media platforms argued that the allegations put forward by school districts and local government entities were “too remote or attenuated” for the law to remedy.

However, “in most ways, the Court disagrees,” the order stated. According to Rogers, the allegations made by the plaintiffs “sufficiently fall within the ambit—by and large—of the relevant states’ negligence laws.”

The plaintiffs have “adequately” alleged that the defendants breached a “duty of care” and that the school districts had to expend resources because of the platforms’ conduct of “fostering compulsive use of minors,” the order noted.

Rogers thus allowed the school districts’ claims that the social media apps harm children’s mental health to proceed.

In an emailed statement to The Epoch Times, Google spokesperson José Castaneda said: “Providing young people with a safer, healthier experience has always been core to our work. In collaboration with youth, mental health and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls. The allegations in these complaints are simply not true.”

The Epoch Times reached out to Meta and Snap for comment but received no replies by publication time.

Section 230 Applicability

Just days before the ruling, Rogers issued another ruling in the case. The social media platforms had filed a motion seeking to dismiss accusations from state attorneys general that the networks negatively affect children. Rogers denied the motion in part in an Oct. 15 order, allowing claims that the platforms harm children to continue in court.

Similar to the Oct. 15 ruling, the most recent order issued by the court also pointed out that Section 230 protections would blunt many of the plaintiffs’ claims against the companies.

Section 230 of the Communications Decency Act provides online platforms immunity from civil liability for third-party content. “[W]hile allegations related to certain platform features are insulated by Section 230 and the First Amendment, other platform features are not so insulated,” the order stated.

Section 230 is a controversial provision, with some lawmakers attempting to alter it. In February 2023, a group of Democrats introduced the Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act, which seeks to reform Section 230 and hold social media companies responsible for third-party content.

These platforms “allow people to connect all across the world—but they also cause great pain and suffering, being used as a tool for cyberbullying, stalking, spreading hate, and more. The way we communicate as a society has changed drastically over the last 25 years, it’s time for our laws to catch up,” Sen. Mazie Hirono (D-Hawaii) said.

“The SAFE TECH Act targets the worst abuses perpetrated on internet platforms to better protect our children and our communities from the very real harms of social media.”

The Electronic Frontier Foundation warns that eroding Section 230 protections would harm everyone on the internet, from small blogs and big corporations to individual users.

“The free and open internet as we know it couldn’t exist without Section 230,” it said in a post.

“If the law makes us liable for the speech of others, the biggest platforms would likely become locked-down and heavily censored. The next great websites and apps won’t even get started, because they’ll face overwhelming legal risk to host users’ speech.”
