
Germany’s Content Moderation Regulation
Knowledge Base Article in: Big Tech Policy Tracker
Last Updated: June 5, 2025

The Framework

The Network Enforcement Act (NetzDG), effective since January 2018, applies to social media platforms with over 2 million registered users in Germany, requiring them to remove “clearly illegal” content within 24 hours and all illegal content within 7 days of receiving user complaints.[1] The regulation covers 22 categories of criminal offenses including defamation, hate speech, incitement to violence, and public insults, with platforms facing fines up to €50 million for systematic non-compliance.[2] Platforms must establish transparent complaint procedures, inform users immediately of content moderation decisions, and store deleted content for at least 10 weeks for evidence purposes.[3] Companies receiving more than 100 complaints annually must publish biannual transparency reports detailing their content removal statistics and complaint handling processes.[4] The 2021 amendments expanded requirements to include appeals procedures for content decisions, mandatory reporting of certain criminal content to federal police, and enhanced oversight powers for the Federal Office of Justice.[5] The law primarily impacts major U.S. platforms including Facebook, Instagram, Twitter, and YouTube, while exempting smaller competitors and journalistic services from these obligations.[6]

Implications for U.S. Technology Leadership

Germany’s 2-million-user threshold systematically captures major U.S. technology platforms while exempting smaller competitors, creating a two-tiered regulatory environment that burdens American market leaders with substantial compliance costs and operational constraints. The 24-hour removal requirement for “clearly illegal” content incentivizes over-removal, as platforms err on the side of censorship to avoid penalties of up to €50 million; analysis of deleted comments found that between 87.5 percent and 99.7 percent of removed content was legally permissible speech.[7] These obligations force U.S. companies to restructure their content moderation systems around German legal specifications, diverting resources from innovation and product development toward compliance infrastructure and legal review processes.

The regulation’s influence extends well beyond Germany: it has served as a template for similar laws in at least thirteen countries, including Russia, India, Malaysia, and Venezuela, creating a fragmented global operating environment in which U.S. platforms face mounting compliance obligations.[8] Enforcement actions against major platforms, such as the Federal Office of Justice’s €2 million fine against Facebook in 2019 for incomplete transparency reporting, demonstrate the serious financial risks U.S. companies face in European markets. As NetzDG’s regulatory model proliferates internationally, American technology companies confront an increasingly constrained global environment in which European content standards become the de facto international norm. This dynamic undermines U.S. technological leadership and competitive positioning while potentially creating market openings for less regulated competitors from other jurisdictions.

Endnotes

[1] “Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech,” Library of Congress, July 6, 2021, https://www.loc.gov/item/global-legal-monitor/2021-07-06/germany-network-enforcement-act-amended-to-better-fight-online-hate-speech/.

[2] Ibid.

[3] “German Content Moderation and Platform Liability Policies,” University of Washington, July 17, 2024, https://jsis.washington.edu/news/german-content-moderation-and-platform-liability-policies/.

[4] “Reading between the lines and the numbers: an analysis of the first NetzDG reports,” Internet Policy Review, June 12, 2019, https://policyreview.info/articles/analysis/reading-between-lines-and-numbers-analysis-first-netzdg-reports.

[5] “Germany: Flawed Social Media Law,” Human Rights Watch, February 14, 2018, https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law.

[6] “Most Comments Deleted from Social Media Platforms in Germany, France, and Sweden Were Legal Speech,” Tech Policy Press, June 24, 2024, https://www.techpolicy.press/most-comments-deleted-from-social-media-platforms-in-germany-france-and-sweden-were-legal-speech-why-that-should-raise-concerns-for-free-expression-online/.

[7] Ibid.

[8] “UN Human Rights Committee Criticizes Germany’s NetzDG for Letting Social Media Platforms Police Online Speech,” Electronic Frontier Foundation, January 25, 2022, https://www.eff.org/deeplinks/2021/11/un-human-rights-committee-criticizes-germanys-netzdg-letting-social-media.
