Publications: Alex Ambrose
February 6, 2025
The Kids Off Social Media Act Misses the Mark on Children’s Online Safety
Senators reintroduced the Kids Off Social Media Act (KOSMA), a flawed bill that would complicate compliance for platforms that already prohibit children under age 13 and limit users’ ability to fully customize their online experience.
February 5, 2025
Congress, Not States or the Supreme Court, Should Lead the Way in Balancing Children’s Online Safety and Access to Adult Content
Congress should step in to prevent a patchwork of state age-verification laws and establish a child-flag system that would better balance adults’ access to legal content with keeping children away from material intended for mature audiences.
January 24, 2025
Comments to Standards Australia Regarding Children’s Safety in the Metaverse
Standards Australia should establish age-based guidelines for the metaverse using existing content ratings; require device operating systems to create an opt-in “trustworthy child flag” system; work with global, government-led groups to build voluntary industry standards with a focus on interoperability; and promote digital literacy campaigns for children to operate in the metaverse safely.
January 16, 2025
Meta Community Notes and Content Moderation in a Free Market
Meta announced on January 7, 2025, that it was ending its third-party fact-checking program on its social media platforms—Facebook, Instagram, and Threads—and moving to a “Community Notes” model in the United States, similar to X.
December 12, 2024
The FTC’s Social Media Data Practices Report Is a House of Cards Built on False Assumptions and Unsubstantiated Claims
The FTC’s September 2024 staff report on the data practices of nine major social media and video streaming companies makes four flawed claims: that platforms surveil users, secretly share data with advertisers, collect data to block competitors, and limit consumer choice due to insufficient competition.
November 19, 2024
Social Media Ban for Children Is a Step Backward for Australia
Blocking an entire age group from social media uses a regulatory sledgehammer instead of a scalpel to address complex and evolving online safety issues. It ignores the benefits of social media for young people and the pitfalls of age-verification rules.
November 18, 2024
Policymakers Should Further Study the Benefits and Risks of AI Companions
Given the uncertainties surrounding the emotional and social impact of AI companions—both positive and negative—policymakers should prioritize funding research on how users interact with chatbots. This approach would ensure that any interventions or improvements are grounded in scientific evidence, rather than rushed regulation.
October 7, 2024
Boom in State Digital Replica Laws Fuels Need for Federal Publicity Right
Congress should pass an amended version of the NO FAKES Act that preempts all existing and future state digital likeness laws. This would ensure consistent IP protections for all Americans—including performers—support innovation in entertainment, and prevent a patchwork of state digital replica laws.
October 1, 2024
Congress Should Not Mandate Warning Labels for Social Media
Senators Katie Britt (R-AL) and John Fetterman (D-PA) introduced the Stop the Scroll Act, which would require warning labels on social media platforms about potential mental health risks. However, the proposal is flawed due to the lack of scientific consensus linking social media to mental health harms and the ineffectiveness of pop-up warnings, which are often ignored by users, especially children.
September 3, 2024
User Safety in AR/VR: Protecting Kids
Children play a crucial role in driving adoption of immersive technologies—and parents, corporations, and regulators all have roles to play in balancing privacy and safety concerns to ensure children can enjoy safe, engaging, and innovative experiences.