The Kids Off Social Media Act Misses the Mark on Children’s Online Safety

February 6, 2025

As part of the ongoing debate surrounding children’s online safety, Senators Ted Cruz (R-TX), Brian Schatz (D-HI), Chris Murphy (D-CT), and Katie Britt (R-AL) reintroduced the Kids Off Social Media Act (KOSMA), which the Senate Commerce Committee advanced on February 5, 2025. The bill would prohibit users under the age of 13 from using social media entirely, prohibit recommendation algorithms for users under the age of 17, and require schools to restrict social media access on federally funded networks. Like other recent children’s online safety bills, KOSMA is flawed: it complicates compliance for platforms that already bar children under the age of 13, and it limits users’ ability to fully customize their online experience.

In a statement on the bill, Sen. Cruz outlined his intentions for KOSMA: to “combat the harms social media poses to children,” including predatory adults, content that promotes risky behavior, and content that negatively affects children’s self-esteem. Everyone, including social media platforms themselves, can agree that certain online content or activities can negatively impact children. However, social media also has many benefits for children and teens, such as facilitating communication, education, entertainment, and community-building. Like other children’s online safety bills, KOSMA fails to balance reducing the risk of harm with preserving the benefits of social media.

KOSMA rightly observes that all the major social media platforms already prohibit children under the age of 13. Online services restrict these users because the Children’s Online Privacy Protection Act (COPPA) imposes additional requirements on platforms with users under the age of 13. At the same time, KOSMA does not require platforms to use age verification to ensure they have no users below age 13. As a result, the bill accomplishes nothing that platforms do not already do to keep young children off their services.

However, KOSMA creates a regulatory challenge for social media platforms. With regard to underage users, COPPA holds platforms to an “actual knowledge” standard: online services are obligated to act only when they know with certainty that a user under the age of 13 is on the service. KOSMA, by contrast, uses a “reasonable knowledge” standard: online services must act whenever there is a high likelihood that a user is under the age of 13. COPPA’s actual knowledge standard allows online services to protect children without significantly increasing compliance costs, whereas KOSMA’s reasonable knowledge standard is so broad and ill-defined that it would raise compliance costs and expose platforms to greater liability risk, even when they attempt to comply in good faith.

KOSMA also prohibits personalized recommendation systems for users under the age of 17. Social media users should be able to choose the experiences they prefer, whether that be chronological feeds or algorithmic ones that prioritize content based on users’ profiles. Indeed, many platforms already give users this choice, and most users prefer personalized recommendation systems. Banning these systems would cut children and teenagers off from the benefits of algorithmic feeds, which recommend content users are more likely to find interesting and make it much easier for users to discover new content.

KOSMA does include exceptions allowing social media platforms to use certain very limited types of data to personalize children’s feeds, but these exceptions do not cover important data such as children’s interests or the content they have interacted with in the past. Ultimately, KOSMA assumes personalized recommendations are always harmful, meaning these rules would also prohibit social media sites from creating customized feeds that could boost positive content encouraging acts of kindness, self-care, educational curiosity, and healthy living habits.

Though it attempts to differentiate itself from other children’s online safety bills, KOSMA is ultimately more of the same, repeating many of the problems found in earlier proposals: it complicates platforms’ compliance and limits users’ ability to customize their experience. Additionally, because it does not preempt state law, KOSMA would add to the conflicting patchwork of state laws in the United States. KOSMA, like the other children’s online safety bills up for debate, needs further improvement before Congress passes legislation in this area.

Instead of resorting to blanket bans or proposals that target core features of social media such as algorithms, Congress should pass legislation to establish a standardized child-flag system, giving parents and guardians greater control over their children’s online safety. Under this system, all users would be presumed adults unless marked as children, with platforms checking for this flag when serving age-restricted content. This approach would be less burdensome for platforms, parents, children, and adult social media users, and would strike a better balance between enabling children and teens to access the benefits of online spaces and shielding them from inappropriate content.
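To make the proposal concrete, the short Python sketch below illustrates one way a platform could consult a standardized child flag before serving age-restricted content. It is only an illustration of the logic described above; the names used here (child_flag, AudienceRating, can_serve) are hypothetical and are not specified in KOSMA or in any existing law.

from dataclasses import dataclass
from enum import Enum


class AudienceRating(Enum):
    """Coarse content labels a platform might assign (illustrative only)."""
    GENERAL = "general"
    AGE_RESTRICTED = "age_restricted"


@dataclass
class UserSession:
    """A signed-in session; child_flag would be set by a parent or guardian."""
    user_id: str
    child_flag: bool = False  # users are presumed adults unless flagged


def can_serve(session: UserSession, rating: AudienceRating) -> bool:
    """Block age-restricted content for flagged users; serve everything else."""
    if rating is AudienceRating.AGE_RESTRICTED and session.child_flag:
        return False
    return True


# An unflagged user (presumed adult) can view restricted content; a flagged child cannot.
assert can_serve(UserSession("adult-1"), AudienceRating.AGE_RESTRICTED)
assert not can_serve(UserSession("child-1", child_flag=True), AudienceRating.AGE_RESTRICTED)

Because the default is “adult,” the experience of unflagged users would not change at all, which is part of why this approach would be less burdensome than universal age verification.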
