California’s new law requiring social media platforms to disclose their content moderation practices is expected to draw a First Amendment challenge, attorneys said.
The law requires social media companies to post all of their internal content moderation policies and to submit semiannual reports detailing their moderation activities. The companies must state whether their policies define certain categories of listed content, including “hate speech or racism” and “extremism or radicalization”—and if so, how the terms are defined.
Unlike the Florida and Texas laws, California’s law doesn’t include the types of content-moderation restrictions that federal courts have found likely to be unconstitutional. Still, the new law is vulnerable to a First Amendment challenge that could slow its implementation, attorneys said.
There are a “number of people who are thinking about filing suit,” said Carl Szabo, vice president and general counsel of NetChoice, a tech group whose members include Twitter and Meta and which sued over the Texas and Florida laws.
Proponents of social media transparency laws liken California’s disclosure requirements to regulations requiring things like nutritional labels, but some legal scholars say the new law is different.
“Just as newspapers could not, under the First Amendment, be required to disclose their policies about what they deem fit to print, the same should be true of social media as well,” Floyd Abrams, senior counsel at Cahill Gordon & Reindel LLP, said.
Even if transparency laws don’t directly dictate a company’s content moderation choices, the threat of enforcement will alter how they exercise editorial discretion, Eric Goldman, a Santa Clara University School of Law professor who specializes in internet regulation, said.
“When an editor makes a judgment call that applies an existing rule in a new way, or constitutes a new rule or exception to a rule, that’s all constitutionally protected,” Goldman said.
Social media companies are going to have to figure out what the California attorney general thinks constitutes hate speech, and adjust their content moderation, even when the attorney general’s view is inconsistent with how they might view the same content, Szabo said.
“If you are in-house counsel, this is terrifying,” Szabo said.
The Computer & Communications Industry Association, which joined NetChoice in challenging the Texas and Florida laws, “remains concerned that some provisions in this law may make it more difficult for companies to restrict or remove inappropriate or dangerous content,” Khara Boender, CCIA’s state policy director, said.
CCIA, whose members include Facebook and Twitter, “will be monitoring how these regulations are enforced and consider all options to ensure companies can keep their commitments to internet users,” Boender said.
Supporters of the law, including the Anti-Defamation League, say it will promote transparency and accountability, enabling regulators to better understand how hate and misinformation proliferate.
“It’s a transparency law, plain and simple,” said Kendall Kosai, director of policy for the ADL’s Western Division.
The law isn’t about suppressing speech; it’s about assembling better data for better decisions at the policy level, Kosai said.
The law is necessary to help shield Californians from online “hate and disinformation,” California Gov. Gavin Newsom (D) said in a Sept. 13 statement.
“Californians deserve to know how these platforms are impacting our public discourse, and this action brings much-needed transparency and accountability to the policies that shape the social media content we consume every day,” Newsom said.
Critics question, however, whether the government’s interest is actually served by regulations that industry organizations have called burdensome.
“Knowing how many reports they got and whether or when they took action, tell us exactly nothing,” Harvard Law School professor Rebecca Tushnet said.
“Maybe there are fake reports and a low number of responses is good,” and maybe not, Tushnet said.
The Florida and Texas laws remain largely blocked, and the challenges are ongoing. Although the US Court of Appeals for the Fifth Circuit stayed the district court’s preliminary injunction of the Texas law, the US Supreme Court vacated the stay.
The Eleventh Circuit held that Florida’s content-moderation restrictions likely violated the First Amendment, but that the law’s disclosure requirements likely didn’t.
The only disclosure provision the Eleventh Circuit struck down required the platforms to provide consumers with a “thorough explanation” for their editorial decisions—that is, decisions like whether to remove, promote, or demote content.
The case should “tee up for the Supreme Court overall standards for mandatory editorial transparency,” Goldman said.
Florida is expected to file a Supreme Court petition appealing the Eleventh Circuit’s decision by Sept. 21.
Tushnet said the high court is “extremely likely” to hear that appeal.
The court is generally inclined to hear cases where a federal court has struck down a state law as unconstitutional, and given the high political salience of the issues, “it would be shocking if they decided not to hear this,” Tushnet said.