
PACT Act Would Increase Platform Transparency, But Undercut Intermediary Liability

August 7, 2020

A growing number of policymakers want to reform Section 230 of the Communications Decency Act, which states that online services are not legally responsible for content their users post, but the latest proposal under the spotlight still misses the mark.

The Senate Commerce Subcommittee on Communications, Technology, Innovation, and the Internet held a hearing July 28 on Sens. John Thune’s (R-SD) and Brian Schatz’s (D-HI) proposal to reform Section 230, the Platform Accountability and Consumer Transparency (PACT) Act. Sens. Thune and Schatz stressed their desire to strike a balance between ensuring accountability and transparency in content moderation and avoiding unnecessary burdens on online services. This is an important goal, and one that should shape any potential reform to this law, but the PACT Act does not meet this standard.

Former Congressman Chris Cox, one of Section 230’s coauthors, spoke at the hearing and further emphasized this point, arguing that failing to strike the right balance would have devastating consequences not just for large social media platforms but also for services with limited financial resources, such as Wikipedia and thousands of smaller online platforms.

The PACT Act would require online platforms to publish acceptable use policies detailing the types of content they allow, how they enforce those policies, and how users can report violations. In addition, online platforms would be required to publish a quarterly transparency report on their content moderation decisions. These transparency measures are not controversial, and indeed they are already standard practice for most major platforms.

However, the PACT Act goes further and requires that platforms also provide a complaint system for users to report policy-violating or illegal content and to appeal platform decisions to remove user-submitted content. The PACT Act outlines how online platforms must process complaints regarding content or activity that violates their policies or is illegal. Certain violations, including failure to remove policy-violating content, would be treated as “unfair or deceptive acts and practices,” giving the Federal Trade Commission (FTC) power to take action against platforms. The act also would give state attorneys general the power to bring action against platforms that violate federal civil law.

This type of regulatory framework exposes platforms to considerable risk and uncertainty because they do not know whether regulators will second-guess their enforcement decisions. As a result, platforms would have an incentive to minimize their exposure by adopting extremely permissive acceptable use policies, so that they are responsible for taking down only illegal content. Currently, most mainstream online platforms have robust terms of service that aim to restrict online abuse, harassment, bullying, and more, which helps make the Internet a safer and more civil place. Giving platforms discretion over content moderation allows a diversity of opinions to flourish online. The PACT Act risks changing the online world so that more platforms come to resemble 4chan, an anonymous forum known for coordinated online abuse and harassment, rather than Facebook. Complaint systems can also be abused as a tactic to silence controversial or marginalized voices.

In addition, under the PACT Act, Section 230’s legal shield would not apply if an online platform is notified of illegal content or activity and fails to remove the content or stop the activity within 24 hours. But this will lead to overenforcement. The high risk of liability for noncompliance and the short time frame to act create an incentive for platforms to take down reported content, including content that is not actually illegal. Moreover, companies already must comply with court orders to remove unlawful content, and there is no reason to condition Section 230’s liability protection on whether they act within 24 hours. Section 230 also already has no effect on intellectual property law; websites still must remove copyright-infringing content.

Other provisions in the PACT Act seem counterintuitive when examined closely. One of Sens. Thune’s and Schatz’s goals with their reform of Section 230 is to update the law to reflect the many ways the online world has changed since the Communications Decency Act became law in 1996. But given that their goal is to modernize, it makes little sense for the Act to include an outdated requirement for online platforms to have a live company representative available via a toll-free number during regular business hours so users can notify them of policy-violating or illegal content. Requirements for an email address or other online mechanism for users to contact the platform—which the Act also includes—are more appropriate.

The PACT Act also contains exemptions for small businesses, defined as online services that, over the past 24 months, have had fewer than 1 million monthly active users or visitors and less than $25 million in revenue. Does illegal activity on a smaller platform matter less? The size of a platform says nothing about the amount of illegal activity it could host. A revenge porn website or an online chat app whose active user base hosts child sexual abuse material may have a much smaller total online footprint than Facebook or Twitter but still do plenty of harm, and in many cases, more harm than these larger platforms. Any bill aiming to reform Section 230 should set standards that can apply equally to small and large platforms without overburdening them.

Finally, it makes little sense to involve state attorneys general in the enforcement of federal civil law. Are there specific federal laws that the Department of Justice is failing to enforce? If so, what are they, and why? If Congress does not believe the Department of Justice is properly upholding federal laws, this represents a larger problem that Congress needs to address separately.

Sens. Thune and Schatz have the right goal in mind: to update Section 230 for the modern Internet without eliminating the benefits the law provides for online services and users. In the 24 years since Section 230 became law, the online world has changed dramatically, and it makes sense to reform the law to reflect those changes. The PACT Act is not Congress’ first attempt at Section 230 reform, nor is it the only proposal Congress is currently considering. But like those other attempts, it fails to strike the right balance: it would update the law while placing an unnecessary burden on online platforms, with consequences for their users and the entire online world. Future reform efforts should learn from these mistakes.
