Bloomberg Law

Mass Shootings and Hate Speech—What Can the Government Do?

May 27, 2022, 8:00 AM

Federal law is clear that social media companies generally cannot be held liable for the speech that is posted on their platforms. Despite this, New York Attorney General Letitia James (D), at the urging of N.Y. Gov. Kathy Hochul (D), has launched an investigation into the role of social media companies in the tragic mass shooting in Buffalo, in which 13 people were shot, 10 of them killed.

The alleged shooter used social media platforms, such as Twitch, 4chan, and Discord, to plot and livestream the mass shooting, according to federal law enforcement sources. The investigation is intended to look into the role of these companies and other online resources and platforms that the alleged shooter used to express White supremacist views and his desire for violence.

It is not clear what James or Hochul hope will come from the investigation. The clear implication is that the social media platforms share the blame for the tragic shooting. There are calls to do more to prevent social media from being used to disseminate hate speech.

Government Can’t Force Platforms to Curb Vile But Lawful Speech

There is nothing objectionable about asking social media companies to do a better job of excluding speech that is awful but lawful, such as expressions of hate and White supremacy. The internet is full of vile and hateful material. It would be a better world if no one had these views or expressed them.

But the government cannot force platforms to moderate lawful speech. Social media companies are private entities and—unlike the government—they do not have to comply with the First Amendment. They get to decide what content to include on their sites and what to exclude.

Although there are some limited areas—such as incitement of illegal activity or true threats—where speech is not constitutionally protected and the government can forbid dissemination, the U.S. Supreme Court has consistently held that hate speech is protected by the First Amendment.

The alleged shooter’s statements of intent aren’t incitement under current law. The Supreme Court has held that incitement requires that the speaker intend to incite imminent unlawful action in others and be likely to do so (Brandenburg v. Ohio (1969)). Some of his posted statements might satisfy the New York statute prohibiting terroristic threats, N.Y. Penal Law §490.20. But even if they did, it’s not clear they meet the constitutional test for a true threat or fall within any other category of unprotected speech.

The government cannot punish hateful messages or platforms that are used to disseminate them. It would violate the First Amendment to hold the social media companies liable because the Buffalo shooter used them to express a message.

Section 230

Moreover, a federal statute, 47 U.S.C. § 230, is explicit that social media companies cannot be held liable, with narrow exceptions, for what is posted on their platforms. The law expressly preempts state laws that impose liability for such posts.

In fact, in part because this federal statute immunizes platforms for content moderation decisions they make, social media companies already do an enormous amount of content moderation. For example, from October to December 2021, Facebook, now known as Meta Platforms, says it took action against terrorism content 7.7 million times, bullying and harassment 8.2 million times, and child sexual exploitation material 19.8 million times.

And, when the alleged shooter attempted to livestream an act of mass murder, Twitch was able to remove the video and suspend his account in less than two minutes. Unfortunately, as with the shootings in Christchurch, New Zealand, recordings remain online.

Moderation of this speech is necessary for the internet to be usable for most people, and platforms know it. The enormous amount of content moderation performed by platforms such as Facebook demonstrates that social pressure and market factors can encourage conscientious content moderation without unconstitutional government pressure.

But it also must be remembered that content moderation occurs at so enormous a scale that it is impossible for platforms to get it right 100% of the time. It is completely acceptable to ask social media platforms to do better.

Government Cannot Force Platforms to Censor

Attorney General James, of course, is free to conduct an inquiry into the role social media played in the Buffalo shooting. But, if the investigation concludes that too much or too little content moderation occurred, New York cannot force platforms to change their moderation practices because those practices are protected by the First Amendment and immunized by Section 230.

The best we can hope for is that social media companies will improve their content moderation practices by more accurately and rapidly excluding or limiting access to objectionable speech that users do not want to see and platforms do not want to host.

The internet and social media are tremendously powerful tools for freedom of speech. It is not hyperbole to say that they are the most important development for expression since the invention of the printing press.

They have enormously enhanced the ability for people to reach a mass audience and to access information. But such tools, and speech itself, can be used for good or for ill.

James can investigate, but the law is clear: The social media companies cannot be punished for being the sites where racist speech was expressed by a deeply disturbed and violent individual.

This article does not necessarily reflect the opinion of The Bureau of National Affairs, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.


Author Information

Erwin Chemerinsky is the dean of U.C. Berkeley School of Law and the Jesse H. Choper Distinguished Professor of Law. Prior to that he was the founding dean and distinguished professor of law, and Raymond Pryke Professor of First Amendment Law, at the University of California, Irvine School of Law.

Alex Chemerinsky is a federal law clerk in the District of Arizona.