The Democratic and Republican parties, though seemingly unable to agree on much else, are now aligned on one topic: Section 230 of the Communications Decency Act of 1996 needs to be reformed.
Section 230 protects internet speech by shielding interactive service providers (ISPs), including social media websites, from lawsuits over third-party content, particularly user posts. The logic in the mid-1990s was that because these providers host, but do not produce or modify, content, they should not be liable for material posted by third parties on their sites.
Other nations grapple with similar issues, but the U.S. can learn from France’s experience with a similar reform law that was recently partially struck down over censorship concerns.
Early this year, Democratic presidential candidate Joe Biden proposed fully revoking Section 230. President Trump has also called for its repeal and, in May, issued an executive order aimed at limiting Section 230 protections. In June, the Department of Justice weighed in as well, releasing its own recommendations for reforming the statute.
Opponents of Section 230 fault more than its broad content immunity, though their complaints pull in opposite directions. Some decry "selective censorship" by large tech companies; others take issue with how little tech companies do to take down harmful content, such as terrorist propaganda and hate speech.
Section 230 contains two primary provisions immunizing ISPs from liability for publishing or restricting access to material posted by another information content provider. Section 230(c)(1) specifies that interactive service providers and users may not "be treated as the publisher or speaker of any information provided by another information content provider." Section 230(c)(2) states that interactive service providers and users, as well as services like ad-blockers that provide the "technical means" to filter content online, may not be held liable for voluntarily acting in good faith to restrict access to objectionable material.
These two immunity provisions apply broadly in most federal and state civil lawsuits, as well as most state criminal prosecutions. Case law has further strengthened the protections afforded to ISPs.
Lessons From France, EU Interest
The rise of social media and its influence in politics and social issues has forced Section 230 and content moderation into the spotlight globally. While the U.S. has the broadest legislative immunity under Section 230, other countries grapple with similar issues.
Recently, France's National Assembly passed a law that would have curbed the protections afforded to social media platforms by requiring platforms to remove "manifestly illicit" content within 24 hours, and terrorist propaganda and sex abuse content within one hour. Companies that failed to comply faced fines of up to $1.36 million.
The law was to take effect July 1; however, the French Conseil Constitutionnel (Constitutional Council) struck down key portions of it. The Constitutional Council found that the law risked over-censorship and harm to free speech, writing that "freedom of expression and communication is all the more precious since its exercise is a condition of democracy and one of the guarantees of respect for other rights and freedoms."
The Constitutional Council went on to say that, for certain aspects of the law, “the legislator has attacked freedom of expression and communication which is not adapted, necessary and proportionate to the aim pursued.”
The decision was lauded by free speech advocates who had criticized the law's restrictions on speech and who argued that the one-hour removal window was unworkable in practice.
The decision has ripple effects across the EU and the U.S., and offers insight into the balancing required in any reform of Section 230. A European Commission spokesperson noted that the commission had "taken note" of the French decision. That matters because the European Commission is expected to unveil a new Digital Services Act in 2021 (it is currently at the consultation stage), which will modernize Europe's laws on digital services and reform Europe's equivalent of Section 230.
In the U.S., which traditionally takes a more hands-off approach to government oversight and places a high premium on free speech, the Constitutional Council's decision offers guidance for potential Section 230 reforms.
Pointedly, despite politicians' strong rhetoric about abolishing Section 230, any reform process will need to be measured and gradual, and will need to carefully balance the competing concerns of oversight and free speech.
Further, just as the EU's General Data Protection Regulation (GDPR) sparked a wider discussion and a sea change in U.S. consumer data privacy law, the way European jurisdictions balance free speech, content moderation, and ISP immunity may foreshadow how the U.S. undertakes Section 230 reform.
Global companies must constantly reconcile differing policies and principles, and the stakes keep rising in the sandbox that is the internet and content moderation. The U.S. should watch the content-immunity provisions of the European Commission's Digital Services Act as a guide for its own reforms and a possible source of workable precedent for modifying Section 230, much as U.S. states have drawn on the GDPR in passing state-level data privacy legislation.
This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.
Cynthia Cole is special counsel at Baker Botts in Palo Alto. She is a former CEO and general counsel who focuses her practice on corporate, strategic, and technology transactions and data privacy. She advises clients across a wide variety of industries including technology, retail, telecommunications, social media, and life sciences.
Neil Coulson is an intellectual property partner at Baker Botts focused on dispute resolution and the exploitation of IP rights and all matters relating to data and data privacy.
Brooke Chatterton is an associate in the corporate practice in the Palo Alto office of Baker Botts.