The US Supreme Court will hear a case challenging whether YouTube’s recommendation algorithm can receive legal protections under Section 230 of the Communications Decency Act, a decades-old law that helped shape the modern internet but has faced growing criticism in recent years.
Family members of a US citizen killed in the 2015 ISIS terrorist attacks in Paris are arguing that YouTube’s advanced algorithms that recommend videos to particular users allowed the terrorist group to amplify its message and radicalize its followers.
Section 230 generally protects social media platforms from the legal consequences of the content that their users create. But petitioners in the case granted Monday are arguing that the legal immunity doesn’t apply when the platform itself is recommending harmful content, like ISIS propaganda and recruitment videos, which would violate the Anti-Terrorism Act.
The case, Gonzalez v. Google, appears to be the first time the country’s highest court will directly weigh in on the safe harbor law, which tech companies have invoked since its enactment in 1996. In past petitions declined by the Supreme Court, Justice Clarence Thomas wrote that the court should step in to clarify the scope of the law “in an appropriate case.”
Proponents of Section 230 have argued that it allowed the internet to flourish and has encouraged platforms to develop their own moderation policies while not being held liable for the almost infinite amount of content that users can post. Without the protections, many platforms would cease to moderate or shut down altogether for fear of legal liability, they say.
But the petitioners in Gonzalez argue that the protection extends only to traditional editorial decisions that a platform makes, like deleting or editing hateful comments from a user. Recommendation algorithms don’t fall under that traditional editorial judgment, they argue.
VIDEO: A look at how Section 230 of the 1996 Communications Decency Act made the web a haven for free speech and free expression and also a breeding ground for trolls, sexual predators, misinformation, and censorship.
In its reply briefs, YouTube’s owner, Google, urged the justices to leave the law’s broad protections intact.
Gonzalez comes from the US Court of Appeals for the Ninth Circuit, where a three-judge panel held last year that YouTube can’t be held liable for recommending ISIS videos. But the nearly 170-page decision contained separate opinions from each judge. The majority opinion concluded that a previous Ninth Circuit decision barred the court from departing from its broad interpretation of the statute.
An adverse decision for Google could be existential, the company said.
“This Court should not lightly adopt a reading of section 230 that would threaten the basic organizational decisions of the modern internet,” Google said in its brief.
Eric Schnapper, a law professor at the University of Washington, along with Excolo Law, Berkman Law, and the Law Office of Keith Altman, represents the petitioners. Wilson Sonsini Goodrich & Rosati PC and Williams & Connolly LLP represent Google.
The case is Gonzalez v. Google LLC, U.S., No. 21-1333, cert. granted 10/3/22.