Access Alert: Trump Justice Department Proposes Reforms to “Section 230” Online Liability Protections

On Wednesday, 23 September, the U.S. Department of Justice (DoJ), in response to the President’s May 2020 Executive Order on Preventing Online Censorship, issued a legislative reform request to Congress regarding Section 230 of the Communications Decency Act. Since early 2020, the DoJ has conducted hearings and consultations with stakeholders on potential reforms to the measure, which shields online companies from liability for content provided by users. While the Department had voiced concern that Section 230 gives tech companies an overly broad shield from civil liability and hampers law enforcement, its precise concerns and desired reforms were largely unclear.

Yesterday’s proposal, conveyed to Congress via a letter from Attorney General Bill Barr, outlines a number of specific reforms to Section 230, including a fully revised draft statute. Here are the key elements:

  • Restricting moderation – Revisions to Section 230(c)(1) and (2) – Content removal is only permitted if a provider has an “objectively reasonable” belief that specific material violates the terms of service, or that it is unlawful, or that it is obscene, lewd, lascivious, filthy, excessively violent, promoting terrorism or violent extremism, harassing, or promoting self-harm. This would remove language that currently enables providers to take down a much broader array of content that may be “otherwise objectionable.”
  • Reestablishing “bad Samaritan” liability – New Section 230(d)(1) – Establishes liability under both state criminal law and federal and state civil action when a provider “acted purposefully with the conscious object to promote, solicit, or facilitate” material or activity illegal under federal criminal law.
  • Requiring court-ordered takedowns – New Section 230(d)(3) – Establishes criminal and civil liability if a provider fails to take down content after “receiving notice” of a final judgment that it is defamatory under state law or otherwise unlawful.
  • Requiring a public notice system – New Section 230(d)(4) – All providers must offer a free mechanism for the public to report defamatory or unlawful material or activity, and are liable if they fail to remove material they have been notified is illegal under federal criminal law.
  • Full exemption for federal civil law enforcement – Revision to current Section 230(e)(1) – Government enforcement of any federal civil statute or regulation is fully exempted from Section 230.
  • Additional horizontal exemptions – New Section 230(f)(6)-(9) – Enforcement of additional areas of civil law is fully exempted from Section 230, namely anti-terrorism, child sexual abuse (including state law), cyberstalking, and antitrust.
  • Expanding the definition of content provider – Revision to Section 230(g)(3) – The definition of “information content provider” is amended to explicitly include any person or entity who “solicits, comments upon, funds, or affirmatively and substantively contributes to, modifies, or alters” another’s content.
  • Defining “good faith” – New definition in Section 230(g)(5) – To be deemed to act in “good faith” under Section 230(c)(2), a provider must have public terms of service describing its moderation practices, moderate or restrict access to content consistently with those terms, not moderate or restrict content on “deceptive or pretextual grounds,” and give the content provider timely notice explaining the “reasonable factual basis” for restricting content and an opportunity to respond.

This proposal comes at a time when the Trump administration and many members of Congress have been increasingly hostile towards Section 230. In addition to a number of proposals from lawmakers of both parties to revise the provision, the Trump administration has also requested new rulemaking by the Federal Communications Commission to clarify certain terms under Section 230.

Section 230 protections apply to a wide range of online businesses of every size beyond social media, from web hosting to communications to home-sharing. These revisions would substantially narrow the circumstances under which online providers could voluntarily moderate and remove content, making it more difficult for platforms to remove harmful speech or activity. They would also widen the circumstances under which providers could be subjected to litigation, imposing costly burdens even where there has been no wrongdoing or negligence. These costs would substantially threaten many online business models.

Access Partnership recommends that any online business established in the US whose operations involve user-provided content, whether directly or through its customers, carefully assess the draft legislation and its potential impact on its business model. As Congress weighs various reforms, maintaining visibility into the different versions and engaging with stakeholders will be key to mitigating risk.
