The Supreme Court May Take Out One of the Pillars of the Modern Internet

October 5, 2022

The Supreme Court announced on October 3, 2022 that it would hear the case Gonzalez v. Google LLC, the court’s first case dealing with an Internet law that has been caught in Congress’ crosshairs over the past few years—Section 230 of the Communications Decency Act. The Supreme Court’s decision in this case could change the way the Internet functions, potentially leaving both social media companies and users worse off.

Section 230 states that online services cannot be held liable for third-party content. In the Gonzalez case, the plaintiffs sued Google for allegedly hosting terrorist recruitment videos on YouTube and recommending these videos to users. The plaintiffs’ argument is that the algorithmic recommendation is not third-party content and therefore is not covered under Section 230. However, since the early days of the law, courts have interpreted Section 230 to also apply to online services’ editorial decisions related to third-party content, such as removal, failure to remove, or algorithmic recommendations.

The court could rule that Section 230 does not cover algorithmic recommendations, opening the doors to a flood of frivolous lawsuits related to social media and other online services’ algorithms. Currently, Section 230 is a significant cost-cutting mechanism that allows companies to get lawsuits against them dismissed when a case involves third-party content. If the Supreme Court created a large exception to this legal immunity for algorithmic recommendations, companies would now have to prove that lawsuits against them do not fall under this exception. Companies could respond to these lawsuits in two ways.

First, companies could bear the burden of higher legal costs, to the tune of up to hundreds of thousands of dollars per case. Even companies that have these resources at their disposal would need to cover these costs in some way—potentially taking resources away from other activities, including activities that benefit users, such as creating new services or expanding user safety features. Companies could, alternatively, start charging for services that were previously free or charging more for features for which users already must pay. Companies with fewer resources—and nonprofit organizations that offer online services—that cannot afford the cost of litigation may even end up going out of business.

Second, companies that would rather not pay higher legal costs could instead change the way their services operate. On the extreme end, online services may stop using algorithms to recommend content. While some would celebrate the return of a chronological feed on social media platforms, many users benefit from social media algorithms recommending content that is relevant to their interests or sorting content according to popularity or other metrics.

Changes to the way social media platforms recommend content would harm more than just users who enjoy exploring content that is tailored to them. In the Gonzalez case, the plaintiffs allege that YouTube's algorithm recommended ISIS recruitment videos to users. While ISIS recruitment videos are inarguably terrorist content, platforms seeking to avoid any risk of recommending terrorist content may change their algorithms to avoid recommending any content that could potentially lead users down an extremist path. This sounds like a good thing, but if platforms cast their nets too wide, they could end up suppressing a wide range of relatively harmless content, such as political discourse or hobbies and interests that are associated with certain political ideologies.

This is the opposite of what many conservatives hope to accomplish by limiting the scope of Section 230. Some conservative lawmakers have accused popular social media companies of displaying a liberal bias and censoring conservative speech, despite evidence to the contrary, with research showing that some algorithms may even favor conservative views. Republican lawmakers want to limit companies’ ability to remove or de-amplify certain political viewpoints, but dismantling Section 230 is far more likely to achieve the opposite, compelling companies to make even more judgment calls about what speech to allow on their platforms, out of a concern that they may be held liable for hosting or amplifying that speech.

The Supreme Court’s decision on Gonzalez v. Google could have far-reaching consequences, not just for the law that “created the Internet” but for the Internet itself. Limiting the scope of Section 230 and carving out an exception for algorithmic recommendations would harm users and create an obstacle to innovation.
