
The Exceptions to Section 230: How Have the Courts Interpreted Section 230?

February 22, 2021

Courts have generally interpreted Section 230 of the Communications Decency Act broadly, but they have identified some limits for when the law does not provide immunity from liability.

KEY TAKEAWAYS

A U.S. appellate court set the precedent for how Section 230 is interpreted, ruling in Zeran v. America Online (1997) that the law protects online services from liability even if they knowingly distribute illegal third-party content.
In other influential cases, courts have found that Section 230 protects users who repost content and social media platforms that facilitate communications between users.
There are limits to Section 230’s liability shield. Starting with the Roommates.com (2008) decision, courts have identified some cases in which Section 230 does not allow online services to avoid liability.
Examples of such cases have involved online services that were found to have induced or encouraged development of illegal content; failed to warn users about illegal activity; breached a contract; or failed to act in good faith.


Introduction

Section 230 of the Communications Decency Act of 1996 is one of the foundational laws behind the modern Internet. It states that online services are not liable for third-party content on their platforms. The law was designed to incentivize online services to moderate the content on their platforms in a way that creates a safe environment for their users without infringing on their freedom of expression.

The language of Section 230 has left room for courts to decide how broadly or narrowly to interpret its liability shield for online services. Courts have answered important questions about the section’s language: What or who counts as a “provider or user of an interactive computer service”? What or who counts as an “information content provider”? What does it mean to treat a service provider as “the publisher or speaker” of third-party content? The answers to these questions have set the boundaries of intermediary liability in the United States.

Though courts have generally interpreted Section 230 broadly, the law does not provide Internet companies with absolute immunity from liability. A common complaint about Section 230 is that it provides a limitless shield behind which bad or negligent online actors can hide to avoid facing consequences for facilitating and profiting from illegal content, but this is not true. In addition to the exceptions enumerated in the law itself—Section 230 does not apply to federal criminal or intellectual property law, and after a recent update it also no longer applies to sex trafficking law—courts have identified additional cases in which Section 230 does not apply.

What Is Section 230?

Section 230 has two main provisions: (c)(1) and (c)(2). First, (c)(1) states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[1] In practice, this means online services are not liable for defamatory or otherwise unlawful content their users post. Meanwhile, 230(c)(2) states that providers are not liable for “any action voluntarily taken in good faith to restrict access to or availability of” objectionable content.[2] This second provision enables providers to engage in content moderation and enforce their community standards.

When determining whether Section 230 applies to a specific case, courts use a three-pronged liability test. First, the defendant must be a “provider or user of an interactive computer service.” An interactive computer service, as defined by the law, “provides or enables computer access by multiple users to a computer server.” Second, the defendant must not be an “information content provider,” or a “person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet.” Finally, the plaintiff’s claim must treat the defendant as the “publisher or speaker” of the content in question.[3] If a case meets all three requirements, Section 230 applies, and the defendant is not liable.

Section 230 does not shield online services from liability for intellectual property violations. A separate law, the Digital Millennium Copyright Act (1998), addresses the issue of copyright infringement on the Internet. Nor does Section 230 shield online services from criminal liability under federal law.

When Does Section 230 Apply?

For the first decade after Section 230’s passage, most courts took a broad interpretation of the liability shield. They looked at Congress’s intentions in passing Section 230, enumerated in Section 230(a) and Section 230(b), which included “to promote the continued development of the Internet and other interactive computer services” and “to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation.”[4]

Courts decided the best way to ensure continued innovation and to remove disincentives to content moderation was to shield online services from civil liability for content moderation decisions. This lowered the barrier to entry for new companies with limited resources to offer online services without risking the cost of litigation for content posted by users. It enabled new models of online platforms that rely on user-generated content, such as social media platforms. And it encouraged companies to develop new tools that give users control over the materials they access online. In other words, these court decisions have allowed the Internet to grow into what it is today.

A Broad Interpretation of the Liability Shield

One of the first Section 230 cases, Zeran v. America Online (1997), came just a year after the section was signed into law. In the wake of the Oklahoma City bombing in April 1995, an anonymous user posted to an America Online (AOL) bulletin board, posing as Oklahoma businessman Kenneth Zeran. The post advertised T-shirts and other merchandise featuring offensive slogans that made fun of the bombing. It also included Zeran’s first name and home phone number. Over the next five days, the user continued to post similar advertisements for new offensive merchandise, encouraging people to call Zeran’s number, and even to “please call back if busy.” As a result, Zeran received a high volume of angry phone calls—as many as one every two minutes—including death threats.

Because the user who posted the fake advertisements was anonymous and there was no way for Zeran to identify and sue them, Zeran instead sued AOL for defamation and negligence because AOL delayed in removing the posts, refused to post retractions, and failed to screen for similar posts after the first one. He argued that, once AOL became aware of the posts, it could be held liable if it did not remove them in a timely manner and take measures to prevent the same thing from happening again. Although Section 230 prohibits treating an Internet service provider as the publisher of third-party content, Zeran argued that AOL would still be liable as the distributor of third-party content, because distributors are liable for knowingly distributing illegal material.

The case went to the Fourth Circuit Court of Appeals. The court ruled in favor of AOL, stating that distributor liability is “merely a species or type” of publisher liability, and that Section 230 therefore prohibits holding providers liable as publishers or as distributors. In doing so, the court offered a broad interpretation of the law, setting a precedent for future cases. Section 230, the court found, broadly protects Internet service providers such as AOL from liability for third-party content and for their publishing, editing, and removal decisions.[5]

In the decades since, most courts have followed in the Fourth Circuit’s footsteps, interpreting Section 230 as a broad liability shield for Internet service providers. Because of its lasting impact, Eric Goldman, a law professor and Section 230 expert, called Zeran “the most important Section 230 ruling to date—and probably the most important court ruling in Internet Law.”[6]

Reposting Content

Another early Section 230 case, Batzel v. Smith (2003), provides additional clarification on the section’s liability protections. Handyman Robert Smith was working for Ellen Batzel when, according to Smith, she claimed she was descended from “one of Adolf Hitler’s right-hand men,” Heinrich Himmler. Based on this information, Smith concluded that the European-style paintings hanging in Batzel’s house were stolen during World War II. He emailed the Museum Security Network, a website about stolen artwork. The website’s operator, Tom Cremers, lightly edited Smith’s original email and forwarded it to the Network’s listserv. When Batzel discovered the message, she sued Smith and Cremers for defamation, claiming that Smith’s statements were false, though she later voluntarily dismissed her case against Smith.

The case went to the Ninth Circuit Court of Appeals—which has heard a number of influential Section 230 cases because Silicon Valley falls within its jurisdiction. The court ruled that Section 230 shielded Cremers from liability for forwarding Smith’s email to the Museum Security Network listserv. The Ninth Circuit based its judgment on two factors. First, because Section 230 applies to “provider[s] or user[s] of an interactive computer service” (emphasis added), it applied to Cremers, even though his Museum Security Network was not an Internet service provider. Second, although Cremers edited Smith’s original email, the court determined that he was not responsible for developing the content of the email because he “did no more than select and make minor alterations” to it.[7]

The California Supreme Court ruled similarly in Barrett v. Rosenthal (2006). Ilena Rosenthal, director of a foundation that spreads information about the purported negative side effects of breast implants, participated in alternative-medicine discussion groups on Usenet, an online bulletin board system. She forwarded to those groups an email, written by Timothy Bolen, a publicist for alternative-medicine practitioners, that included allegedly libelous statements about two doctors who opposed alternative medicine, Stephen Barrett and Terry Polevoy. Barrett contacted Rosenthal and threatened to sue if she did not remove her messages. When she refused and continued to post about the doctors, Barrett and Polevoy sued her for defamation. As in Batzel, the court ruled that Section 230 shielded Rosenthal from liability for reposting Bolen’s statements and that Section 230’s protections apply to forwarded emails.[8] Barrett and Polevoy’s case against Bolen also failed because the doctors were public figures, and they could not prove Bolen acted with actual malice.[9]

Social Media Platforms

Critically, Section 230 protects social media platforms from liability for their users’ activity. Social media did not exist when Section 230 was first signed into law, but court cases in the years since have set a precedent for applying the section to platforms such as Facebook and Twitter. In one of these cases, Doe v. MySpace (2008), the anonymous plaintiff (“Julie Doe”), a 13-year-old girl at the time, lied about her age to create a public profile on MySpace.com. She used this profile to communicate with Pete Solis, an adult male user, gave Solis her phone number, and arranged to meet him in person; at that meeting, he sexually assaulted her.

Julie’s mother (“Jane Doe”) sued MySpace for negligence. She argued that MySpace lacked necessary security measures that would have prevented the 13-year-old Julie from creating a profile, which was against its terms of service. She also argued that MySpace marketed its platform to minors and made it easy for them to lie about their age so they could create a public profile and communicate with other users. Finally, she argued that Section 230 “[did] not immunize MySpace’s failure to take reasonable steps to ensure minors’ safety.”

The case went to the Fifth Circuit Court of Appeals, which rejected the Does’ arguments. The heart of their argument, the Fifth Circuit ruled, was that “if MySpace had not published communications between Julie Doe and Solis, … they would never have met and the sexual assault never would have occurred.” By suing MySpace, Jane Doe was trying to hold the platform accountable for Solis’s actions toward Julie Doe, when MySpace’s only role was to passively facilitate their communications. Because Section 230 protects Internet service providers from liability for the publication of third-party content and communication—as in Zeran—MySpace was not liable for Solis’s actions toward Julie Doe.

When Does Section 230 Not Apply?

While courts have broadly shielded online service providers from liability for third-party content, they have also identified instances in which Section 230 does not apply. These cases generally fall into one of three categories. First, if the defendant may have induced or contributed to the development of the illegal content in question, then Section 230 does not apply. Second, if the plaintiff’s claim does not arise from the defendant’s publishing or content moderation decisions, then Section 230 does not apply, because the law protects providers only from liability arising from their role as publishers, not from all liability. Third, if the case relates to a content-removal decision and the defendant fails to meet Section 230(c)(2)’s “good faith” requirement, then Section 230 does not apply because the defendant does not qualify for its protection.

Inducing Illegal Content

The first major case to identify limits to the Section 230 liability shield, Fair Housing Council of San Fernando Valley v. Roommates.com (2008), came over a decade after Zeran. Roommates.com operates a website that matches people looking for housing with available rooms. Users are required to disclose their sex, sexual orientation, and familial status and to list their roommate preferences along those same characteristics. Roommates.com then automatically matches them with potential roommates and allows them to search for potential roommates based on those characteristics.

The Fair Housing Councils of San Fernando Valley and San Diego sued Roommates.com for violating the Fair Housing Act (FHA), a federal law, as well as the California Fair Employment and Housing Act (FEHA). The FHA prohibits housing discrimination based on sex and familial status, among other traits.[10] And the FEHA prohibits housing discrimination based on sex, sexual orientation, and familial status, among other traits.[11]

The case went to the Ninth Circuit Court of Appeals, which ruled that Section 230 did not apply because Roommates.com requires users to provide information about their sex, sexual orientation, and familial status as a condition of using the website. Because Roommates.com induced the illegal content in question, it could be held liable for breaking federal and state housing law.[12] Four years later, in Fair Housing Council of San Fernando Valley v. Roommate.com (2012), the Ninth Circuit reached a different outcome, concluding that the FHA and FEHA did not apply to roommate selection.[13] Regardless, the original Roommates.com decision was a turning point: the first of a handful of key decisions in which the defendant was unable to use Section 230 to avoid liability.

Other cases relating to housing discrimination have turned out differently when the defendant was not found to have induced illegal content. In Chicago Lawyers’ Committee for Civil Rights Under Law v. Craigslist (2008), the plaintiff, a consortium of Chicago civil rights law firms, sued Craigslist for violating the FHA after users posted rental-property advertisements containing discriminatory language on its online classifieds service. Craigslist had a policy requiring these ads to adhere to the FHA, and it removed discriminatory posts when users reported them, but it did not prescreen ads. Unlike Roommates.com, Craigslist did not require users to answer potentially discriminatory questions and thus did not induce illegal content, so the court ruled that Section 230 did apply and dismissed the case.[14]

More recently, in National Fair Housing Alliance v. Facebook (2018), the National Fair Housing Alliance and other fair housing organizations sued Facebook for violating the FHA. Third-party advertisers can select an audience using Facebook’s ad platform to target certain demographics, meaning landlords and real estate brokers could prevent people of a certain race, sex, or familial status from seeing their ads.[15] In its filing, Facebook claimed that Section 230 shielded it from liability, but in 2019, Facebook ended up settling and agreeing to change the way it sells advertisements for housing, employment, and credit services.[16]

Developing Illegal Content

A year after the original Roommates.com decision, a separate court ruled similarly, identifying another limit to Section 230’s liability shield and reaffirming the Ninth Circuit’s reasoning. Accusearch ran Abika.com, a website that facilitated the sale of individuals’ personal information, including customer proprietary network information, which is metadata about telephone calls. The acquisition of this information violated the Telecommunications Act of 1996,[17] and its sale violated the Telephone Records and Privacy Protection Act of 2006, although the latter was not enacted until after Accusearch stopped selling the information.[18] The Federal Trade Commission (FTC) accused Accusearch of engaging in unfair trade practices and took the company to court.

The Tenth Circuit Court of Appeals heard the FTC v. Accusearch (2009) case. It denied Accusearch Section 230 immunity, borrowing its reasoning from Roommates.com and stating that “a service provider is ‘responsible’ for the development of offensive content … if it in some way specifically encourages development of what is offensive about the content.” Because Abika.com played an active role in selling personal information—advertising and delivering it and processing payments—rather than acting as a passive intermediary, the court considered it “responsible for the development of offensive content” even though the personal information came from third-party vendors.[19]

Breach of Contract

Another case that identified limits to Section 230, Barnes v. Yahoo! (2009), bears some resemblance to Zeran, but with one small but significant difference that led the court to rule differently. After plaintiff Cecilia Barnes broke up with her boyfriend, her ex-boyfriend created multiple Yahoo! profiles posing as Barnes and posted nude photographs of her, along with her work address, email, and telephone number. Barnes then began receiving harassing phone calls, emails, and visits to her workplace. She mailed and then faxed a copy of her ID and a signed statement to Yahoo!, denying that she had set up the profiles and asking the company to remove them. She received an assurance from Yahoo!’s director of communications that she would “personally walk the statements over to the division responsible for stopping unauthorized profiles and they would take care of it.” However, the profiles remained, and after three months passed, Barnes filed a negligence claim and a promissory estoppel claim, which would hold Yahoo! accountable for breaking its promise to remove the fake profiles.

The case went to the Ninth Circuit Court of Appeals, which ruled that Section 230 did not apply to the latter of Barnes’s two claims.[20] Section 230 precludes any civil suit that would treat an Internet service provider—in this case, Yahoo!—“as the publisher or speaker of any information provided by another information content provider.”[21] According to the court, Section 230 did apply to Barnes’s negligence claim, which would treat Yahoo! as a publisher by punishing it for failing to remove the fake profiles; Zeran had already established that service providers cannot be held liable for failing to remove third-party content. However, Section 230 did not apply to Barnes’s promissory estoppel claim, which did not treat Yahoo! as a publisher. Instead, it treated Yahoo! as one of the parties to a verbal contract and sought to hold Yahoo! accountable for breaching that contract.

Failure to Warn

Eight years after Doe v. MySpace, a similar case—Doe v. Internet Brands (2016)—went to the Ninth Circuit Court of Appeals and met with a different result. The anonymous plaintiff, aspiring model “Jane Doe,” posted her information on Model Mayhem, a networking website for the modeling industry. In 2011, two men, Lavont Flanders and Emerson Callum, posed as talent scouts, contacted Doe, and lured her to a fake audition, where they raped her. Flanders and Callum had been running similar schemes using Model Mayhem for several years, beginning as early as 2006. Internet Brands purchased Model Mayhem in 2008 and learned of Flanders and Callum’s activity shortly thereafter.

Doe sued Internet Brands for negligent failure to warn, arguing that the company was liable because it knew about Flanders and Callum’s activity and did not warn its users. This “actual knowledge by Internet Brands from an outside source of information about criminal activity” is where Internet Brands differs from MySpace. Unlike in MySpace, the question at hand was not related to Internet Brands’ publishing decisions: According to the court, the claim did not arise from allegations about Internet Brands “mishandling the removal of third party content” or “[failing] to adequately regulate access to user content.” Therefore, the Ninth Circuit decided Section 230 did not apply, and Internet Brands could be held liable for failing to warn its users about the illegal activity taking place on its platform.[22]

Selectively Reposting Content

Diamond Ranch Academy v. Filer (2016) also bears a resemblance to a previous Section 230 case, Batzel v. Smith. Chelsea Filer created a website called DRASurvivors.com for former students of a residential treatment facility, Diamond Ranch Academy (DRA), to share their experiences. Filer posted statements from former students on the website accusing DRA of abusing and mistreating its students. DRA sued Filer for defamation. Filer sought immunity under Section 230, arguing that, because she merely reposted third-party statements, she could not be held liable. Because DRA was located in Hurricane, Utah, the case went to the District Court for the District of Utah.

Filer quoted the Ninth Circuit’s ruling in Batzel to argue that “the exclusion of ‘publisher’ liability [by Section 230] necessarily precludes liability for exercising the usual prerogative of publishers to choose among proffered material and to edit the material published while retaining its basic form and message.”[23] The district court did not accept this argument. It distinguished Filer’s posts from defendant Cremers’s post in Batzel, ruling that Filer’s posts “do not lead a person to believe that she is quoting a third party” and that Filer “did more than simply post whatever information the third parties provided.” She combined her own statements with the statements of others and selectively chose those statements to paint DRA in a negative light. In other words, Filer did not “retain [the material’s] basic form and message”; she altered it. In doing so, the court decided, she helped develop the content rather than merely republish it, and she therefore did not qualify for Section 230’s liability shield.[24]

Failing to Act in Good Faith

The bulk of Section 230 cases deal with the first of the law’s two main provisions, 230(c)(1), which “blocks civil liability when web hosts and other Internet service providers (ISPs) refrain from filtering or censoring the information on their sites,” rather than 230(c)(2), which ensures a provider “that does filter out offensive material is not liable to the censored customer.”[25] The other key difference between the two is 230(c)(1)’s broader language: (c)(2) protection applies only to actions “taken in good faith,” while (c)(1) contains no such requirement.

E-Ventures Worldwide v. Google (2016) is one of the minority of cases that dealt with (c)(2). E-Ventures Worldwide provided search engine optimization (SEO) services to its clients, helping them rank higher in search results on search engines such as Google. Google, meanwhile, offered a service called “AdWords,” through which companies could pay Google to rank their websites higher in its paid search listing results. Google allegedly found that many of E-Ventures’ websites violated Google’s Webmaster Guidelines, which protect users against spam. As a result, it de-indexed all of E-Ventures’ websites so they would no longer show up in Google search results.

E-Ventures sued Google for engaging in anticompetitive behavior, arguing that Google de-indexed the company’s websites because its SEO services competed with Google’s AdWords service. The case went to the District Court for the Middle District of Florida, where E-Ventures was based. Google claimed Section 230 shielded it from liability for its actions and tried to argue its case under 230(c)(1). However, because Google had filtered content, rather than refrained from filtering content, it could only seek protection under the narrower 230(c)(2). The district court denied Google’s motion to dismiss under Section 230 because E-Ventures provided enough evidence that Google may have acted anticompetitively; if so, Google would not have been acting in good faith when it de-listed E-Ventures’ websites, and Section 230(c)(2) would not apply.[26] Although Google failed to get the case dismissed under Section 230, it ultimately won when the court ruled that its actions were protected First Amendment speech.[27]

In a similar case, Enigma Software Group v. Malwarebytes (2019), Enigma, a company that offers malware removal tools, sued competitor Malwarebytes, alleging that Malwarebytes configured its anti-malware products to block users from downloading and using Enigma’s products. Malwarebytes argued that Section 230(c)(2) shielded it from liability for blocking access to certain content. However, the Ninth Circuit Court of Appeals ruled that Section 230(c)(2) did not apply. Section 230(c)(2) protects online services from liability for removing or restricting access to content they consider “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”[28] According to the Ninth Circuit, this does not include blocking access to content for anticompetitive reasons, as Enigma alleged.[29]

About the Authors

Ashley Johnson (@ashleyjnsn) is a policy analyst at ITIF. She researches and writes about Internet policy issues such as privacy, security, and platform regulation. She was previously at Software.org: the BSA Foundation and holds a master’s degree in security policy from The George Washington University and a bachelor’s degree in sociology from Brigham Young University.

Daniel Castro (@CastroTech) is vice president at ITIF and director of its Center for Data Innovation. He writes and speaks on a variety of issues related to information technology and Internet policy, including privacy, security, intellectual property, Internet governance, e-government, and accessibility for people with disabilities.

About ITIF

The Information Technology and Innovation Foundation (ITIF) is an independent, nonprofit, nonpartisan research and educational institute focusing on the intersection of technological innovation and public policy. Recognized by its peers in the think tank community as the global center of excellence for science and technology policy, ITIF’s mission is to formulate and promote policy solutions that accelerate innovation and boost productivity to spur growth, opportunity, and progress.

For more information, visit us at www.itif.org.

Endnotes

[1] 47 U.S.C. § 230(c)(1) (1996).

[2] 47 U.S.C. § 230(c)(2) (1996).

[3] Kathleen Ann Ruane, “How Broad a Shield? A Brief Overview of Section 230 of the Communications Decency Act” (Congressional Research Service, February 2018), 1–2, https://fas.org/sgp/crs/misc/LSB10082.pdf.

[4] 47 U.S.C. § 230(b) (1996).

[5] Zeran v. Am. Online, Inc., 129 F.3d 327 (4th Cir. 1997).

[6] Eric Goldman, “The Ten Most Important Section 230 Rulings,” Tulane Journal of Technology and Intellectual Property 20 (Fall 2017), 3, http://journals.tulane.edu/index.php/TIP/article/download/2676/2498.

[7] Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).

[8] Barrett v. Rosenthal, 146 P.3d 510 (Cal. 2006).

[9] Barrett v. Clark, 2001 WL 881259, 29 Media L. Rep. 2473 (Cal. Super. Ct. 2001).

[10] 42 U.S.C. § 3604(c) (1968).

[11] Cal. Gov’t Code § 12955.

[12] Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc).

[13] Fair Hous. Council of San Fernando Valley v. Roommate.com, LLC, 666 F.3d 1216 (9th Cir. 2012).

[14] Chicago Lawyers’ Comm. for Civil Rights Under Law v. Craigslist, Inc., 519 F.3d 666 (7th Cir. 2008).

[15] National Fair Housing Alliance v. Facebook, Inc., No. 1:18-CV-02689-JGK (S.D.N.Y. 2018).

[16] Alexis C. Madrigal, “Facebook Does Have to Respect Civil-Rights Legislation, After All,” The Atlantic, March 20, 2019, https://www.theatlantic.com/technology/archive/2019/03/facebook-inc-does-have-to-respect-civil-rights-legislation-after-all/585286/.

[17] 47 U.S.C. § 222 (1996).

[18] 18 U.S.C. § 1039 (2006).

[19] FTC v. Accusearch Inc., 570 F.3d 1187 (10th Cir. 2009).

[20] Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1107 (9th Cir. 2009).

[21] 47 U.S.C. § 230(c)(1) (1996).

[22] Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016).

[23] Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).

[24] Diamond Ranch Acad., Inc. v. Filer, No. 2:14-CV-751-TC, 2016 U.S. Dist. LEXIS 19210 (D. Utah Feb. 17, 2016).

[25] Doe v. GTE Corp., 347 F.3d 655 (7th Cir. 2003).

[26] E-Ventures Worldwide, LLC v. Google, Inc., No. 2:14-CV-646-FTM-29CM, 2016 U.S. Dist. LEXIS 62855 (M.D. Fla. May 12, 2016).

[27] E-Ventures Worldwide, LLC v. Google, Inc., No. 2:14-CV-646-PAM-CM (M.D. Fla. Feb. 8, 2017).

[28] 47 U.S.C. § 230(c)(2) (1996).

[29] Enigma Software Group USA, LLC v. Malwarebytes, Inc., 2019 WL 4315152 (9th Cir. Sept. 12, 2019).
