California lawmakers are attempting again to hold social media companies liable for addicting child users to their product, a renewed effort that will face fierce resistance from the tech industry.
“This legislation is like throwing more fuel on the flames created by the legislature last session,” said Carl Szabo, vice president of NetChoice, which represents Meta, Google, and other tech companies.
State Sen. Nancy Skinner (D) last week introduced SB 287, which would subject a company to penalties of up to $250,000 per violation, an injunction, and litigation costs and attorney fees. Her bill is similar to widely watched state legislation last year that would have allowed the attorney general and local district attorneys to file civil suits against social media companies for knowingly deploying designs or algorithms that addict kids.
Skinner’s bill goes beyond addiction and adds other harms platforms could be held accountable for, including the use of fentanyl, harm to oneself or others, and eating disorders.
Another difference from last year’s debate is that this bill would let private individuals, such as parents, sue within four years of the alleged harm. That will likely invite criticism from the industry, fearful of a flood of litigation.
“Californians deserve to have the right to hold social media accountable just as our laws allow for medical malpractice, faulty consumer products and other harms,” Skinner said of the private right of action.
Social media companies that conduct quarterly audits of their algorithms and features and resolve issues from an audit within 30 days wouldn’t face liability under the bill. It would only apply to businesses making at least $100 million in gross revenue annually.
California lawmakers have led in efforts to protect children online, and last year, multiple measures including a major online child privacy law passed despite some opposition from Big Tech.
Tech lobbyists were particularly intent on stopping last year’s social media addiction bill and successfully killed it. The intensity of the resistance is likely to be the same this time around.
“This bill would cripple hundreds of websites,” Szabo said. “It does little to address the underlying issues raised by social media—responsible use of technology.”
Opponents argue the bill is an unconstitutional abridgment of free speech and could force platforms to close off social media to children, as anyone could claim any feature to be addictive. The government should not inject itself into a conversation between a child and their parent on social media use, they said.
Advocates for child safety say social media giants know their designs and algorithms are harming youth, and lawmakers must address the mental health problems resulting from that.
“If you’re a business, and you can get away with legally addicting your customers, especially ones that are going to be your customers for most of their lives, you fight like crazy to be able to do that,” said Ed Howard, counsel at the Children’s Advocacy Institute at the University of San Diego School of Law, which supported last year’s bill.
Howard said the provision freeing companies from liability if they conduct audits and fix any resulting issues is a “safe harbor” for them. But Szabo of NetChoice called the safe harbor language a “trap where the websites must create the legal case against themselves.”
Skinner indicated she won’t revise her bill much to soften the tech industry’s opposition and said the urgency around fentanyl sales, ghost guns, and other harms to children would help it succeed this year.
“Social media companies should have no trouble abiding by SB 287, if it were to become law. They just have to regularly monitor their algorithms to ensure they do not target youth with harmful content, including the sale of fentanyl, and do not facilitate the sale of illegal firearms — and then fix their algorithms if they do,” she said.
State lawmakers were able to pass smaller state protections last year to help prevent harm to children and other online users.
Later this year, social media platforms will be required to file semiannual reports to the state attorney general on their terms of service, content moderation practices, and how they define hate speech, disinformation, and other terms.
Companies are now required to post their policies on the use of social media to illegally distribute a controlled substance as well as a link to any reporting mechanism for illegal and harmful content and behavior. Platforms must also state if there is a mechanism to report violent social media activity, and the target of a violent post can seek a court order to take it down.