Singapore’s Content Moderation Regulation
The Framework
On July 18, 2023, Singapore’s Info-communications Media Development Authority (IMDA) implemented the Code of Practice for Online Safety for Social Media Services under the Broadcasting Act 1994, targeting social media services with “significant reach or impact in Singapore.”[1] The regulation requires designated platforms to implement comprehensive content moderation systems, establish community guidelines and user safety tools, apply enhanced restrictions for users under 18 years of age, deploy proactive detection technology for specified content categories, maintain user reporting mechanisms with mandated response timelines, and submit annual safety reports for government publication.[2] Noncompliance triggers financial penalties of up to S$1 million (approximately US$750,000), with potential service blocking for continued violations.[3]
Implications for U.S. Technology Leadership
Singapore’s content moderation framework systematically disadvantages U.S. technology companies by imposing substantial compliance burdens on American-owned platforms. Four of the six designated platforms—Facebook, Instagram, X, and YouTube—are owned by major U.S. technology corporations, while TikTok is Chinese-owned and only HardwareZone is a Singapore-based service.[4] This concentration forces U.S. companies to divert resources from innovation to compliance infrastructure, annual risk assessments, and dedicated regulatory teams for Singapore-specific requirements. The mandated detection systems demand significant technological investment and ongoing operational expenses that compete directly with product development for resources.
The regulatory structure creates competitive advantages for platforms operating below Singapore’s designation thresholds, allowing them to build market share without facing equivalent compliance obligations. Chinese technology companies and other international rivals benefit from this asymmetric treatment, potentially expanding their Singapore presence while American platforms devote substantial resources to regulatory compliance. Moreover, because content moderation regulations are fragmented across jurisdictions, U.S. companies must develop multiple compliance frameworks simultaneously, dividing engineering and policy resources that could otherwise strengthen their competitive position globally.
Endnotes
[1] Infocomm Media Development Authority, “IMDA’s Online Safety Code comes into effect,” July 17, 2023, https://www.imda.gov.sg/resources/press-releases-factsheets-and-speeches/press-releases/2023/imdas-online-safety-code-comes-into-effect.
[2] Infocomm Media Development Authority, “Code of Practice for Online Safety – Social Media Services,” July 2023, https://www.imda.gov.sg/-/media/imda/files/regulations-and-licensing/regulations/codes-of-practice/codes-of-practice-media/code-of-practice-for-online-safety-social-media-services.pdf.
[3] Bird & Bird, “Singapore’s Code of Practice for Online Safety comes into effect,” July 18, 2023, https://www.twobirds.com/en/insights/2023/singapore/singapores-code-of-practice-for-online-safety-comes-into-effect.
[4] Campaign Asia, “Singapore introduces code to protect teens from harmful ads on social media,” July 19, 2023, https://www.campaignasia.com/article/singapore-introduces-code-to-protect-teens-from-harmful-ads-on-social-media/485399.