Government Needs Social Media Coordination to Fight Election Lies

March 18, 2024, 8:30 AM UTC

Fake images of former President Donald Trump with his arms around Black voters. A meme portraying a Black person in front of a Hillary Clinton sign telling people that they can vote by text or hashtag. Robocalls telling voters it’s unsafe to vote by mail or, in an exact imitation of President Joe Biden’s voice, telling voters to “save” their vote for November instead of voting in the primary.

In Murthy v. Missouri, the US Supreme Court should provide a clear rule allowing the federal government to communicate with social media companies and civic organizations to stop the spread of this kind of disinformation on social media platforms.

This would not be much of an issue if social media companies were good at policing their own platforms for disinformation. But that isn’t the case. Though most major online platforms prohibit misrepresentations about when, where, or how to vote, enforcement of their policies varies significantly and is often lacking.

Government and civil society organizations help fill this void. Some government programs counter election disinformation with accurate information. Others communicate with social media companies about disinformation trends that harm vulnerable communities. State and local election officials—democracy’s first line of defense—also spot problems on the ground and raise the alarm.

The Lawyers’ Committee for Civil Rights Under Law is a co-convenor of Election Protection, a nationwide, nonpartisan coalition consisting of over 300 national, state, and local partners working to ensure that voters can exercise their right to vote. Sometimes, Election Protection escalates threats of election disinformation to election officials.

In Murthy, several individuals, Missouri, and Louisiana sued the federal government for violation of their First Amendment rights. They claimed that the federal government was responsible for the social media companies' decisions to take down or downgrade their content. For the plaintiffs to prevail, they must show there was state action: that the social media companies weren't acting of their own accord but effectively as agents of the federal government.

The district court agreed with plaintiffs and issued a sweeping preliminary injunction that prohibited government agencies from communicating with social media platforms for the purpose of “urging, encouraging, pressuring, or inducing in any manner for removal, deletion, suppression, or reduction of content containing protected free speech.” It also prohibited the government from collaborating with third parties—potentially including voter protection organizations like Election Protection—for the same purpose.

Unsurprisingly, the injunction was so vague and overbroad that the government immediately ceased talking with social media companies and voter protection organizations.

The Fifth Circuit largely agreed with the district court on the merits of the plaintiffs' First Amendment claims, but it narrowed and modified the injunction so that it would prohibit the government from "coerc[ing]" or "significantly encourag[ing]" social media companies to remove, delete, suppress, or reduce content, and it removed the prohibition on the government collaborating with third parties.

The Fifth Circuit correctly stated the law, but its application of the law to the facts is problematic and, if adopted by the Supreme Court, could prevent the government from notifying social media companies of disinformation that is detrimental to the general public.

For example, the Fifth Circuit found that the FBI acted coercively toward the social media companies because the FBI asked the social media companies to take down content. The Fifth Circuit made this finding even though it acknowledged there was no evidence that the FBI’s communications were “threatening” and that the FBI “did not plainly reference adverse consequences” to the social media companies if they didn’t act.

The Fifth Circuit found that because the communications came from the FBI and the FBI “wielded some authority over the platforms,” communications that were acceptable on their face became coercive.

The implications of this broad reading of what constitutes state action are vast. Government agencies may be prohibited from meeting with platforms, sharing strategic information, and alerting them to disinformation spreading on their platforms—merely because they are government entities.

The chilling effects of the injunction would reach far beyond the enjoined federal officials. State and local election officials—the individuals with the most knowledge of state and local election laws—may see the precedent of this case and proactively self-censor for fear of drawing a lawsuit, because they can't know, based on the Fifth Circuit's injunction, what is permissible.

The end result is that, if left standing, the injunction would drive a stake into the government’s ability to work with civil society and social media platforms to protect elections against disinformation, months before a momentous election.

And recent technological advances, like generative AI, mean that deepfake images, audio, and video about the election could pervade social media platforms.

This is no time for vagueness. The Supreme Court should make it clear: The government is allowed to communicate with social media companies and civil society organizations to stem the tide of election disinformation. Our democracy is at stake.

The case is Murthy v. Missouri, US, No. 23-411, to be argued 3/18/24.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Marc Epstein is senior counsel for the Lawyers’ Committee for Civil Rights Under Law’s digital justice initiative and litigates cases on technology and racial justice, including those involving online hate and disinformation.

The Lawyers’ Committee joined an amici curiae brief filed in support of the US Surgeon General Vivek Murthy and other petitioners.

To contact the editors responsible for this story: Jessie Kokrda Kamens at jkamens@bloomberglaw.com; Jada Chin at jchin@bloombergindustry.com
