Social Media Judge Says Algorithms, Peer Pressure Hard to Divide

May 13, 2024, 9:56 PM UTC

A California state judge considering whether school districts can sue social media giants for student addiction said Monday that it’s hard to tell whether harms related to social media can be blamed on its algorithms or on other factors, like peer pressure.

“You can’t really tell whether the content was something the student found themselves or whether it was pushed onto them,” Los Angeles County Superior Court Judge Carolyn B. Kuhl said.

Kuhl’s observation that it’s difficult to trace the source of the influence on students would broadly favor the tech companies, including Meta Platforms Inc., Snap Inc., TikTok Inc., and Google LLC, which want Kuhl to scrap complaints by school districts in California, Florida, Rhode Island, and Washington. Still, she didn’t show a strong leaning during more than three hours of oral arguments on a demurrer and a motion to dismiss the districts’ suits.

In a major Judicial Council Coordination Proceeding, which is similar to multidistrict litigation but at the state level, the districts allege that social media has increased the cost of education because it makes students more distracted and disruptive, driving up the need for classroom discipline, employee training, and communication with parents.

School property damage from the “devious lick” TikTok challenge, which encouraged users to destroy and steal items like toilet seats, exit signs, and fire extinguishers, stems not from social media content itself but from the content’s influence on student behavior, so a federal liability shield called Section 230 doesn’t apply, argued Josh Autry, an attorney for the school districts.

Autry, of Morgan & Morgan, drew a distinction between the school districts’ claims and the Wozniak v. YouTube LLC ruling, in which the harm was found to stem directly from YouTube hoax posts.

He argued that school districts in this case are experiencing a different kind of “downstream harm,” prompted by social media algorithms that influence students by making trends such as property damage go viral.

A California appeals court found in March that YouTube isn’t responsible for a hoax that altered footage of Apple Inc. co-founder Steve Wozniak and other tech icons to sell a fake Bitcoin giveaway. In that case, YouTube wasn’t liable for most of the hoax’s harm because the harm stemmed directly from the posts themselves, Autry said.

‘On the Front Lines’

The social media companies argue that school districts can’t get damages for harms allegedly suffered by students because the children are third parties. The alleged negative impacts of social media also aren’t unique to schools, they say.

“A school district has no right to educate students in a vacuum—free from the mental and physical consequences that arise from a global pandemic, hunger, inadequate sleep, abuse, neglect, or any number of other external factors that make classroom learning more difficult and educating students costlier,” the tech companies said in court documents, noting that the district complaints would make line-drawing impossible for company liability.

Attorneys for the districts shot back in court filings that schools are “on the front lines” of a mental health crisis specifically created by social media companies, and they have a special responsibility to care for children’s mental health.

Social media companies could have easily predicted that addicting youth to their products and encouraging their use in schools would disrupt education, the districts argued, citing Snap’s “Snap to School” campaign and Meta’s notifications that intentionally target students when they’re busy.

Felicia Craick, arguing for the districts Monday, distinguished social media giants from other companies, such as fast food chains, whose products could raise public health concerns.

“You have to look at the facts,” said Craick, of Keller Rohrback. To make a similar suit against a company like McDonald’s Corp., “you need McDonald’s not only involving an addictive product, but also following students, telling them to eat the burgers, and breaking into schools, telling the students to eat, and controlling burgers even after McDonald’s doesn’t physically have them anymore.”

Addictive Platform Design

Meta, Snap, TikTok, and Google face allegations that they designed their platforms to be addictive, causing depression and anxiety in children.

Lawyers for the plaintiffs are wielding a novel defective-design argument to bypass a federal shield that protects online platforms from suits over third-party content.

Kuhl ruled in October that social media sites aren’t “products,” but she allowed the plaintiffs’ claims that the companies were careless to proceed, finding the companies couldn’t use the First Amendment or the federal shield to block the suits.

Kuhl in March sustained the platforms’ demurrer to claims for sex and age discrimination, one of 13 causes of action, without leave to amend. But she gave plaintiffs a chance to try again on claims that the companies failed to warn users about the risks of using the sites, allowing them to add more factual allegations.

The case is Social Media Cases JCCP, Cal. Super. Ct., No. JCCP5255, 5/13/24.

To contact the reporter on this story: Maia Spoto in Los Angeles at mspoto@bloombergindustry.com

To contact the editors responsible for this story: Stephanie Gleason at sgleason@bloombergindustry.com; Cheryl Saenz at csaenz@bloombergindustry.com
