The UK’s Online Safety Act
The Framework
The UK Online Safety Act 2023 imposes obligations on user-to-user services and search engines accessible in the UK, with requirements varying by platform size. The Act mandates that platforms detect and remove illegal content, including terrorism material, child sexual abuse material, and fraud, while services likely to be accessed by children must block content related to suicide, self-harm, and eating disorders through age verification systems.[1] Platforms with more than 34 million UK monthly users that use content recommendation systems face Category 1 obligations, including additional transparency requirements and senior management liability.[2] The Act requires platforms to complete risk assessments by March 2025 and implement “highly effective age assurance” technologies that require collecting user data, with Ofcom issuing codes of practice detailing specific compliance measures.[3] Enforcement mechanisms include fines of up to £18 million or 10 percent of qualifying worldwide revenue, whichever is greater, plus criminal liability for executives and service blocking orders.[4] The Act’s monitoring obligations would require platforms to scan user communications, which technical experts have concluded cannot be done without breaking end-to-end encryption.[5]
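As a rough illustration of how these headline figures interact, the sketch below encodes the Category 1 user threshold and the penalty cap (the greater of £18 million or 10 percent of qualifying worldwide revenue). It is an illustrative simplification of the figures cited above, not a legal test, and the function and variable names are the author’s own shorthand rather than terms drawn from the Act.

```python
# Illustrative sketch only: encodes the headline figures cited above, not a legal test.

CATEGORY_1_USER_THRESHOLD = 34_000_000   # UK monthly users, for services with recommender systems
FIXED_FINE_CAP_GBP = 18_000_000          # fixed cap of £18 million
REVENUE_FINE_RATE = 0.10                 # 10 percent of qualifying worldwide revenue


def meets_category_1_threshold(uk_monthly_users: int, uses_recommender_system: bool) -> bool:
    """Rough check of the Category 1 condition described in the text."""
    return uses_recommender_system and uk_monthly_users > CATEGORY_1_USER_THRESHOLD


def maximum_fine_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Maximum penalty: the greater of £18 million or 10 percent of qualifying worldwide revenue."""
    return max(FIXED_FINE_CAP_GBP, REVENUE_FINE_RATE * qualifying_worldwide_revenue_gbp)


# Example: a service with 40 million UK users and £1 billion in qualifying worldwide revenue.
print(meets_category_1_threshold(40_000_000, True))   # True
print(f"£{maximum_fine_gbp(1_000_000_000):,.0f}")     # £100,000,000
```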
Implications for U.S. Technology Companies
The Online Safety Act’s threshold-based structure effectively targets successful U.S. technology companies while allowing smaller competitors to operate with minimal restrictions. The 34 million user threshold captures leading U.S. platforms such as Meta, Google, Microsoft, and Apple, while exempting regional services and newer entrants. This discriminatory threshold forces U.S. companies to implement expensive compliance infrastructure, hire local content moderation teams, and modify global products for UK-specific requirements, while competitors below the threshold avoid these burdens entirely. The global revenue penalty structure weaponizes U.S. companies’ worldwide success against them: U.S. technology companies generate revenues that expose them to potential multi-billion-dollar fines, while regional competitors with limited international presence face negligible financial risk even for identical violations.[6] A 10 percent global revenue fine would cost Meta over $13 billion and Alphabet (Google) over $30 billion based on 2023 revenues, while a UK-focused platform would face penalties scaled only to its modest local revenues. The result is vastly disproportionate compliance incentives that push U.S. companies to over-censor and over-comply.[7]
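A back-of-the-envelope calculation using the 2023 revenues reported in Meta’s and Alphabet’s Form 10-K filings (roughly $134.9 billion and $307.4 billion, respectively) shows where these figures come from; the UK-focused competitor’s revenue below is an assumed figure included purely for comparison and does not refer to any real company.

```python
# Back-of-the-envelope fine exposure at 10 percent of worldwide revenue, using
# 2023 revenues from Form 10-K filings; the UK-focused platform figure is assumed
# purely for comparison and does not refer to any real company.

REVENUE_FINE_RATE = 0.10

revenues_2023_usd = {
    "Meta": 134.9e9,                           # ~$134.9 billion (Form 10-K)
    "Alphabet (Google)": 307.4e9,              # ~$307.4 billion (Form 10-K)
    "Hypothetical UK-focused platform": 0.5e9, # assumed $500 million
}

for company, revenue in revenues_2023_usd.items():
    exposure = REVENUE_FINE_RATE * revenue
    print(f"{company}: maximum revenue-based fine ~ ${exposure / 1e9:.2f} billion")
```

On these assumptions the maximum revenue-based fine comes to roughly $13.5 billion for Meta and $30.7 billion for Alphabet, against about $50 million for the hypothetical UK-focused platform.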
The Act’s compliance requirements and fine structure systematically divert resources from innovation, undermining U.S. technological leadership. U.S. companies must redirect billions from R&D budgets to fund UK-specific compliance teams, content moderation systems, age verification infrastructure, and legal departments to navigate the Act’s complex requirements.[8] The threat of multi-billion-dollar fines based on global revenue forces conservative decision-making, as companies cannot risk penalties that could equal years of R&D investment. This systematic diversion of capital and talent from innovation to compliance erodes the technological edge that enabled U.S. companies to achieve global leadership. It amounts to a regulatory tax on success that benefits competitors who face neither the compliance burden nor the fine exposure that comes with worldwide operations.
Endnotes
[1] “Online Safety Act 2023,” UK Legislation, October 26, 2023, sections 9-11, https://www.legislation.gov.uk/ukpga/2023/50/contents/enacted.
[2] “Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025,” UK Parliament, December 16, 2024, https://www.legislation.gov.uk/ukdsi/2025/9780348267174.
[3] Ofcom, “Protecting People from Illegal Harms Online,” December 16, 2024, https://www.ofcom.org.uk/siteassets/resources/documents/online-safety/information-for-industry/illegal-harms/volume-1-governance-and-risks-management.pdf.
[4] “Online Safety Act 2023,” section 140.
[5] Kir Nuthi, “The Effect of International Proposals for Monitoring Obligations on End-to-End Encryption” (Center for Data Innovation, November 2022), https://www2.datainnovation.org/2022-E2EE-monitoring-obligations.pdf.
[6] “Online Safety Act 2023,” sections 139-140; “Impact Assessment: Online Safety Bill,” UK Department for Digital, Culture, Media & Sport, January 2022, https://assets.publishing.service.gov.uk/media/6231dc9be90e070ed8233a60/Online_Safety_Bill_impact_assessment.pdf.
[7] Meta Platforms, Inc., Annual Report (Form 10-K), February 2, 2024; Alphabet Inc., Annual Report (Form 10-K), January 30, 2024.
[8] “Impact Assessment: Online Safety Bill,” UK Department for Digital, Culture, Media & Sport, January 2022.