Justice Department Issues Recommendations for Section 230 Reform
Reforms Strike a Balance Between Protecting Citizens and Preserving Online Innovation and Free Speech
DEPARTMENT OF JUSTICE’S REVIEW OF SECTION 230 OF THE COMMUNICATIONS DECENCY ACT OF 1996
As part of its broader review of market-leading online platforms, the U.S. Department of Justice analyzed Section 230 of the Communications Decency Act of 1996, which provides immunity to online platforms from civil liability based on third-party content and for the removal of content in certain circumstances. Congress originally enacted the statute to nurture a nascent industry while also incentivizing online platforms to remove content harmful to children. The combination of significant technological changes since 1996 and the expansive interpretation that courts have given Section 230, however, has left online platforms both immune from liability for a wide array of illicit activity on their services and free to moderate content with little transparency or accountability.
The Department of Justice has concluded that the time is ripe to realign the scope of Section 230 with the realities of the modern internet. Reform is important now more than ever. Every year, more citizens—including young children—are relying on the internet for everyday activities, while online criminal activity continues to grow. We must ensure that the internet is both an open and safe space for our society. Based on engagement with experts, industry, thought-leaders, lawmakers, and the public, the Department has identified a set of concrete reform proposals to provide stronger incentives for online platforms to address illicit material on their services, while continuing to foster innovation and free speech. Read the Department’s Key Takeaways.
The Department’s review of Section 230 arose in the context of our broader review of market-leading online platforms and their practices, announced in July 2019. While competition has been a core part of the Department’s review, we also recognize that not all concerns raised about online platforms (including internet-based businesses and social media platforms) fall squarely within the U.S. antitrust laws. Our review has therefore looked broadly at other legal and policy frameworks applicable to online platforms. One key part of that legal landscape is Section 230, which provides immunity to online platforms from civil liability based on third-party content as well as immunity for removal of content in certain circumstances.
Drafted in the early years of internet commerce, Section 230 was enacted in response to a problem that incipient online platforms were facing. In the years leading up to Section 230, courts had held that an online platform that passively hosted third-party content was not liable as a publisher if any of that content was defamatory, but that a platform would be liable as a publisher for all its third-party content if it exercised discretion to remove any third-party material. Platforms therefore faced a dilemma: They could try to moderate third-party content but risk being held liable for any and all content posted by third parties, or choose not to moderate content to avoid liability but risk having their services overrun with obscene or unlawful content. Congress enacted Section 230 in part to resolve this quandary by providing immunity to online platforms both for third-party content on their services and for the removal of certain categories of content. The statute was meant to nurture emerging internet businesses while also incentivizing them to regulate harmful online content.
The internet has changed dramatically in the 25 years since Section 230’s enactment in ways that no one, including the drafters of Section 230, could have predicted. Several online platforms have transformed into some of the nation’s largest and most valuable companies, and today’s online services bear little resemblance to the rudimentary offerings in 1996. Platforms no longer function as simple forums for posting third-party content, but instead use sophisticated algorithms to promote content and connect users. Platforms also now offer an ever-expanding array of services, playing an increasingly essential role in how Americans communicate, access media, engage in commerce, and generally carry on their everyday lives.
These developments have brought enormous benefits to society. But they have also had downsides. Criminals and other wrongdoers are increasingly turning to online platforms to engage in a host of unlawful activities, including child sexual exploitation, selling illicit drugs, cyberstalking, human trafficking, and terrorism. At the same time, courts have interpreted the scope of Section 230 immunity very broadly, diverging from its original purpose. This expansive statutory interpretation, combined with technological developments, has reduced the incentives of online platforms to address illicit activity on their services and, at the same time, left them free to moderate lawful content without transparency or accountability. The time has therefore come to realign the scope of Section 230 with the realities of the modern internet so that it continues to foster innovation and free speech but also provides stronger incentives for online platforms to address illicit material on their services.
Much of the modern debate over Section 230 has played out at opposite ends of the spectrum. Many have called for an outright repeal of the statute in light of the changed technological landscape and growing online harms. Others, meanwhile, have insisted that Section 230 be left alone, claiming that any reform would cripple the tech industry. Based on our analysis and external engagement, the Department believes there is productive middle ground and has identified a set of measured yet concrete proposals that address many of the concerns raised about Section 230.
A reassessment of America’s laws governing the internet could not be timelier. Citizens are relying on the internet more than ever for commerce, entertainment, education, employment, and public discourse. School closings in light of the COVID-19 pandemic mean that children are spending more time online, at times unsupervised, while more and more criminal activity is moving online. All of these factors make it imperative that we maintain the internet as an open and safe space.
Areas Ripe For Section 230 Reform
The Department identified four areas ripe for reform:
1. Incentivizing Online Platforms to Address Illicit Content
The first category of potential reforms is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation.
a. Bad Samaritan Carve-Out. First, the Department proposes denying Section 230 immunity to truly bad actors. The title of Section 230’s immunity provision—“Protection for ‘Good Samaritan’ Blocking and Screening of Offensive Material”—makes clear that Section 230 immunity is meant to incentivize and protect responsible online platforms. It therefore makes little sense to immunize from civil liability an online platform that purposefully facilitates or solicits third-party content or activity that would violate federal criminal law.
b. Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking. Second, the Department proposes exempting from immunity specific categories of claims that address particularly egregious content, including (1) child exploitation and sexual abuse, (2) terrorism, and (3) cyber-stalking. These targeted carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress in causes of action far afield from the original purpose of the statute.
c. Case-Specific Carve-outs for Actual Knowledge or Court Judgments. Third, the Department supports reforms to make clear that Section 230 immunity does not apply in a specific case where a platform had actual knowledge or notice that the third-party content at issue violated federal criminal law, or where the platform was provided with a court judgment that the content is unlawful in any respect.
2. Clarifying Federal Government Enforcement Capabilities to Address Unlawful Content
A second category of reforms would increase the ability of the government to protect citizens from harmful and illicit conduct. These reforms would make clear that the immunity provided by Section 230 does not apply to civil enforcement actions brought by the federal government. Civil enforcement by the federal government is an important complement to criminal prosecution.
3. Promoting Competition
A third reform proposal is to clarify that federal antitrust claims are not covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.
4. Promoting Open Discourse and Greater Transparency
A fourth category of potential reforms is intended to clarify the text and original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users.
a. Replace Vague Terminology in (c)(2). First, the Department supports replacing the vague catch-all “otherwise objectionable” language in Section 230(c)(2) with “unlawful” and “promotes terrorism.” This reform would focus the broad blanket immunity for content moderation decisions on Section 230’s core objective of reducing online content harmful to children, while limiting a platform’s ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it “objectionable.”
b. Provide Definition of Good Faith. Second, the Department proposes adding a statutory definition of “good faith,” which would limit immunity for content moderation decisions to those done in accordance with plain and particular terms of service and accompanied by a reasonable explanation, unless such notice would impede law enforcement or risk imminent harm to others. Clarifying the meaning of “good faith” should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.
c. Explicitly Overrule Stratton Oakmont to Avoid Moderator’s Dilemma. Third, the Department proposes clarifying that a platform’s removal of content pursuant to Section 230(c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker for all other content on its service.
Overview of Department of Justice Actions on Section 230
The Department of Justice’s review of Section 230 of the Communications Decency Act has included several components:
1. Key Takeaways. The Department distilled lessons learned from its engagement and research in its Key Takeaways and Recommendations, which outline a set of key principles, specific areas for reform, and ideas for further consideration.
2. Public Workshop. On February 19, 2020, the Department held a public workshop titled “Section 230: Nurturing Innovation or Fostering Unaccountability?” that brought together thought-leaders with diverse viewpoints. See Section 230 Workshop Livestream; Section 230 Workshop Agenda; Section 230 Workshop Summary
3. Expert Roundtable. On the afternoon of February 19, the Department also hosted a roundtable under the Chatham House Rule with additional experts and thought-leaders to further discuss Section 230 and potential reforms. See Section 230 Workshop Summary; Biographies of Experts
4. Written Submissions. Participants in the morning Workshop and afternoon Roundtable were also invited to submit short written statements with their views on Section 230, which the Department reviewed. See Participant Written Submissions
5. Industry Listening Sessions. Following the Workshop, the Department met individually with a diverse group of businesses that had attended the public event or otherwise expressed interest in Section 230. The meetings were private and confidential to foster frank discussion of the businesses’ use of Section 230 and their thoughts on potential reform.