New Attempts to Amend Section 230 Would Impede Content Moderation When It Is Needed Most

September 24, 2020

Sens. Roger Wicker (R-MS), Lindsey Graham (R-SC), and Marsha Blackburn (R-TN) introduced the Online Freedom and Viewpoint Diversity Act on September 8, legislation that would change the language of Section 230 of the Communications Decency Act to make online services liable if they remove content that is not obscene, violent, harassing, or illegal. Sen. Graham also introduced the Online Content Policy Modernization Act on September 21, which contains very similar provisions. The senators’ goal is to prevent platforms from censoring conservative opinions, but the changes their bills would make to Section 230 would impede platforms’ ability to moderate content in a way that protects both their users’ safety and their freedom of expression.

Section 230 contains two main provisions, (c)(1) and (c)(2). Section 230(c)(1) states that online services are not liable for illegal third-party content on their platforms, and Section 230(c)(2) states that online services are not liable if they moderate “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” third-party content in “good faith.”

Some critics of Section 230 argue that the “otherwise objectionable” standard is too open-ended because it allows online services to remove content that is not obscene, violent, or harassing without losing their Section 230 liability protection. To be clear, the First Amendment generally gives online services the freedom to remove any content they do not want on their platforms. But some critics want to see online services stripped of Section 230 liability protection if they restrict political content they disagree with. If Congress were to make these changes, it would not change what content online services can legally remove, but it would increase the cost of defending against lawsuits alleging harm from content moderation, an expense that could force them to rethink their moderation policies.

The Online Freedom and Viewpoint Diversity Act and the Online Content Policy Modernization Act would narrow Section 230(c)(2) by replacing the phrase “otherwise objectionable” with the more specific “promoting self-harm, promoting terrorism, or unlawful.” This would mean online services could no longer seek dismissal under Section 230(c)(2) of lawsuits over removing content that falls outside those narrower categories.

Both bills also raise the standard for (c)(2)’s liability shield. Instead of protecting online services from liability for removing content they “consider” to meet the above criteria, the bills would only protect them if they “have an objectively reasonable belief” that the content meets the criteria. This would change services’ incentives to moderate content on their platforms. Currently, services have broad freedom to remove content they consider potentially harmful to their users; under this higher standard, services may raise the bar for removal and leave some potentially harmful content up.

Like President Trump’s “Executive Order on Preventing Online Censorship,” signed earlier this year, which also sought to clarify Section 230, these bills are a reaction to allegations that major social media platforms discriminate against and censor conservative viewpoints. Replacing the “otherwise objectionable” standard and adding the “objectively reasonable belief” standard would make it easier for a poster whose political content was removed to sue the online service.

In an interview, Sen. Blackburn correctly stressed the importance of “revisiting” Section 230 instead of repealing it altogether, as some—including both President Trump and presidential candidate Joe Biden—have suggested. Section 230 is a crucial pillar of the Internet economy and the freedom of expression online, but it is also important to update it to better reflect how the Internet has changed since its passage in 1996.

Sens. Wicker, Graham, and Blackburn are correct that there is a need to clarify Section 230 and increase accountability for online services. But the Online Freedom and Viewpoint Diversity Act and Online Content Policy Modernization Act would go beyond that, contravening one of Section 230’s primary intentions: to encourage good faith content moderation. Section 230 gives online services the freedom to moderate content on their platforms in a way that best suits their users. This freedom has allowed for the development of many different types of online platforms, each experimenting with moderation policies that work best for their communities.

Facing threats of foreign disinformation campaigns ahead of this year’s presidential election, fake news about the Oregon wildfires, and false health claims during an ongoing pandemic, effective content moderation is more important than ever. Tightening the standard under which online services can remove potentially harmful or illegal content without risking legal action would incentivize less content moderation. Services would be less willing to remove content such as bullying and misinformation, which falls into the gray area of being harmful but not illegal, because Section 230 would no longer explicitly protect them from lawsuits for removing it.

Unlike other ongoing attempts to amend Section 230, including the EARN IT Act and the PACT Act, the Online Freedom and Viewpoint Diversity Act lacks bipartisan co-sponsors and is therefore less likely to move forward in Congress. Similarly, thus far, the Online Content Policy Modernization Act has no co-sponsors, and its introduction appears to be designed to allow the Senate Judiciary Committee to claim at least partial jurisdiction over this issue from the Senate Commerce Committee. But both bills are an important reminder to lawmakers that attempts to change such a foundational law of the Internet come with consequences that impact many aspects of the modern world. Amending Section 230 in a way that will improve upon the existing law without hindering innovation, free speech, or content moderation will require careful consideration and bipartisan agreement. ITIF will be releasing proposals on this in the near future.
