Debunking Two Big Myths From the Recent Hearing on Sunsetting Section 230

May 24, 2024

With the resurgence of debate surrounding Section 230 of the Communications Decency Act, the law that shields online services from liability for the content their users post, a number of misconceptions about the law have also reemerged. Two of these misconceptions featured prominently in a May 22, 2024, House Energy and Commerce Committee hearing on a legislative proposal to sunset Section 230. To effectively address problems ranging from disinformation to harmful advertising practices, Congress needs to look beyond Section 230, which remains an important legal shield for online services of all sizes and their users.

Much of the discussion focused on various online problems that lawmakers argued were a result of Section 230's legal shield. This is the first common myth about Section 230: that its legal shield is the root cause of these problems. For years, lawmakers and anti-Section 230 advocates have pushed to limit or outright repeal Section 230's legal shield in the name of solving a litany of online problems, including nonconsensual pornography, terrorist content, disinformation, child sexual abuse material (CSAM), harmful advertising practices, and a lack of transparency in content moderation. However, the government could effectively address all of these problems without touching Section 230.

For example, to address nonconsensual pornography, Congress should make the practice illegal at the federal level, including outlawing nonconsensual deepfake pornography. To address terrorist content and state-sponsored disinformation campaigns, the federal government should provide guidance and support to help online services find and remove this content, including increased information sharing between government and industry and research grants that let academics studying these issues partner with social media platforms to improve detection methods. To address CSAM, Congress should increase funding for law enforcement to investigate CSAM reports and prosecute the perpetrators responsible for creating, spreading, and enabling this content, including more funding for police technology and Internet Crimes Against Children task forces. To address harmful advertising practices, Congress should pass a law instructing the Federal Trade Commission (FTC) to create and enforce new regulations targeting problems with online advertising, such as adult products and services being advertised to children.

Congress should also pass transparency requirements for social media content moderation, requiring platforms to clearly describe what content and behavior are and are not allowed, how they enforce these rules, and how users can appeal moderation decisions. The law should also require platforms to enforce their rules consistently, create an appeals process for moderation decisions if one does not already exist, and release publicly accessible annual reports with data on how much rule-breaking content and behavior the platform removed and of what types, how many of those decisions were appealed, and how many of those appeals were successful.

Finally, the Department of Justice (DOJ) should go after bad actors—both online services and individual users—who break the law, something Section 230 does not impede. As an example, the DOJ seized and took down the online classified advertising website Backpage in 2018 and indicted the website’s owners on counts of money laundering, conspiracy, and facilitating prostitution. A federal jury convicted three of the former owners in 2023.

During the hearing, witnesses argued that online services do not need Section 230 because the First Amendment protects online services' editorial decisions, including their decisions to remove content that violates their terms of service and community guidelines as well as their decisions not to remove legal but objectionable content. This is the second common myth. Yes, the First Amendment protects online services' editorial decisions, but Section 230 provides an important legal shortcut that allows online services to get cases against them dismissed early.

While this may sound like, at best, a minor convenience for online services and, at worst, a "gift" to "Big Tech" companies, as some have argued, it is actually a huge cost-saving measure that helps online services of all sizes stay afloat and avoid massive, unnecessary legal fees. This benefits small businesses, startups, and nonprofit organizations most of all, as many cannot afford these fees. According to Engine, a nonprofit that advocates for startups, litigating a case through the motion-to-dismiss stage typically costs between $15,000 and $40,000 and can run as high as $80,000. That is already a high cost, but it pales in comparison to the cost of taking a case through discovery and trial: between $100,000 and $500,000. Without the ability to use Section 230 to dismiss cases against them, online services would face these six-figure fees every time a case is brought against them, even if they ultimately win.

Congress recognized the importance of creating this legal shortcut when it passed Section 230, writing in the law itself that "[i]t is the policy of the United States to promote the continued development of the Internet" and "to preserve the vibrant and competitive free market that presently exists for the Internet." Stripping away Section 230 protections would sabotage this competitive free market, to the detriment of users, as online services would likely pass their higher operating costs on to consumers by charging more for services or charging for services that were previously free. Alternatively, online services could implement overly strict moderation rules that censor controversial but legal speech.

Repealing or sunsetting Section 230 would make the Internet a worse place, but there is much Congress can do to make the Internet better. Unfortunately, until policymakers understand the reality of Section 230 and how it protects not only large tech companies but any online service that hosts user-generated content, as well as users themselves, Congress will not make meaningful progress on content-related issues.
