Updated Children’s Safety Bills Still Contain Serious Flaws

March 6, 2024

As multiple states consider and pass ill-advised legislation designed to protect children by restricting their access to social media, Congress is finally making progress on federal children’s online safety legislation. Unfortunately, the current bills before the Senate, the Kids Online Safety Act (KOSA) and Children and Teens’ Online Privacy Protection Act (COPPA 2.0), contain serious flaws that would threaten online free speech, privacy, and the digital economy.

Sens. Ed Markey (D-MA) and Bill Cassidy (R-LA) originally introduced COPPA 2.0 in 2021 and Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) originally introduced KOSA in 2022, with both bills reintroduced in 2023 after they failed to gain traction. With the recent focus on children’s online safety at the state, federal, and international levels, lawmakers may have more success advancing legislation like KOSA and COPPA 2.0 this year.

Though recent updates to the two bills attempted to address critics’ concerns, problems remain in the bills’ text that could incentivize online services to suppress controversial speech, collect more personal information from users, restrict children’s access to online spaces, and potentially stop providing services aimed at children.

Kids Online Safety Act

KOSA contains some provisions that would meaningfully protect children online. For example, it requires online services reasonably likely to be used by minors to provide certain parental controls and optional safeguards for minors, including the ability to restrict who can message them and view their profile, monitor screen time and establish limits, control or opt out of personalized recommendation systems, restrict purchases, manage privacy settings, and report harms that occur. This approach allows children and their parents to tailor their online experience in a way that is best suited to each child’s individual needs.

However, KOSA requires that online services default minors’ accounts to the strictest privacy and safety settings. While it does make sense to provide children with greater protections than adults, defaulting minors’ accounts to the strictest possible settings will likely result in many minors missing out on potentially beneficial design features—such as algorithmic recommendation systems, which provide immense value to users of all ages—if they, like many users, stick with the default options rather than personalizing their account settings.

KOSA also contains transparency requirements, including requiring online services to provide information for minors and their parents about safeguards, personalized recommendation systems, and advertising that exists on the platform. These provisions would allow children and their parents to make more informed decisions about their safety and personal information. Online services with at least 10 million monthly active users in the United States must also publish public reports on potential risks to minors, according to an independent third-party audit, and steps the platform takes to mitigate these risks.

Most of the concern over KOSA’s potential negative impact focuses on the “duty of care” the bill would establish for any online service that is reasonably likely to be used by a minor. Specifically, these online services would have a duty to ensure their design features prevent and mitigate harm to minors. The language of this provision is vague and undefined by existing case law, and as online services attempt to comply with their new duty of care—and avoid liability—they may overcorrect, making it more difficult for minors, or potentially all users, to access helpful content related to mental health, suicide, addiction, eating disorders, and sexuality, to name a few. In fact, KOSA’s duty of care provision may violate the First Amendment. The government cannot dictate an online service’s editorial decisions, which may include the very design features KOSA targets.

Another issue with KOSA is the size-based approach it takes when requiring online services to respond to reports of potential harm to minors. Online services with at least 10 million monthly active users in the United States must respond within 10 days, whereas smaller platforms have 21 days to respond. Seemingly the logic behind this approach is that larger platforms have more resources to respond to reports quickly. However, larger platforms are also likely to receive more reports, because they have more users making those reports and more content for users to report. Moreover, harmful content is not less harmful simply because it takes place on a platform with nine million monthly users rather than 11 million. A uniform set of rules for all online services would more effectively protect children.

Finally, KOSA instructs various federal agencies to conduct research on the risk of harms to minors on social media and other online platforms and on the most technologically feasible methods for online age verification. More research into these topics would, of course, be beneficial, but it is telling that Congress is calling for this research only after, rather than before, legislating on these issues. This is yet another example of lawmakers putting the cart before the horse when it comes to children online.

Children and Teens’ Online Privacy Protection Act

COPPA 2.0 would amend existing federal children’s privacy legislation, the Children’s Online Privacy Protection Act (COPPA), originally passed in 1998. The FTC then issued the COPPA rule, which implements the law, in 2000 and updated it in 2013. The FTC more recently began considering further updates to the rule. Meanwhile, Congress may move forward with its own updates to the law itself.

Currently, COPPA applies to users under 13, prohibiting online services from collecting their personal information without parental consent. COPPA rules apply when an online service has “actual knowledge” that a minor under 13 is using their service. COPPA 2.0 would expand protections to users between 13 and 16 and change the standard for online services, requiring them to comply if they have actual knowledge or “knowledge fairly implied on the basis of objective circumstances” that a minor under 17 is using their service.

The issue of where to draw the line when extending additional protections to users under a certain age is a difficult one, considering there is a lack of research that identifies a single age at which children mature past needing such protections. If Congress does extend COPPA protections to users between 13 and 16, many online services that target this demographic will likely significantly change the way they operate or stop providing services for that demographic altogether, as many social media platforms already do by banning users under 13.

Likewise, switching from an actual knowledge standard to an implied knowledge standard will create a minefield of potential liability for online services. Navigating this minefield will be costly, taking resources away from innovating new products, services, and safety features and funneling them into compliance efforts. It may also require online services to collect more personal information about their users in order to determine who is an adult and who is a child. Without a comprehensive federal data privacy law that protects all users’ data from misuse, these data collection efforts could lead to more harm than they would prevent.

COPPA 2.0 would also ban targeted advertising to children and teens. Much of the Internet relies on targeted advertising to pay for services that are free or cost less than they otherwise would. Banning targeted advertising to users under 17 would likely result in fewer online services designed for children, and particularly fewer free online services designed for children, which would prove most detrimental to lower-income households. Overall, COPPA 2.0’s provisions would result in less innovation in online services designed for children, including educational content and wholesome entertainment that many families rely on to keep kids engaged and learning.

Finally, COPPA 2.0 would require online services to allow children and their parents to delete information collected about a minor under 17, a provision the bill’s sponsor and cosponsors call an “Eraser Button.” Proposed comprehensive federal privacy legislation would give this right of deletion to all users, which only highlights the need for Congress to focus on passing broader privacy legislation before fine-tuning the details of children’s privacy legislation.

Recommendations

Judging by the recent changes to the text of KOSA and COPPA 2.0, the bills’ sponsors and cosponsors have listened to the criticism the bills received. However, these changes are not enough to effectively address some critics’ most pressing concerns related to free speech, privacy, and the digital economy. There are always trade-offs involved in increasing safety, especially for children, but with further changes to KOSA and COPPA 2.0, lawmakers could strike a better balance on these issues that would protect children online with fewer negative consequences for users of all ages.

First, Congress should remove KOSA’s requirement that online services default minors’ accounts to the strictest privacy and safety settings in all cases. Each child has individual needs that vary based on age and other circumstances, and defaulting to the strictest settings for all children will lead to many children missing out on potentially beneficial applications of online services. Congress should also expand upon KOSA’s duty of care by outlining specific steps online services must take to fulfill their obligation to ensure their design features prevent and mitigate harm to children. These steps should avoid content-based restrictions or otherwise dictating online services’ editorial decisions in ways that may violate the First Amendment.

Congress should replace KOSA’s tiered, size-based requirements regarding the length of time online services have to respond to reports of potential harm to minors with a universal time limit that applies to all online services regardless of size. If KOSA does not advance in the near future, Congress should pass a bill in the meantime that instructs federal agencies to conduct research on online harms to minors and age verification methods.

Before making changes to COPPA, Congress should first pass comprehensive data privacy legislation that preempts state laws. If there remains a need for additional protections for minors such as those in COPPA 2.0, Congress should maintain COPPA’s actual knowledge standard, as the FTC has done in its proposed changes to the COPPA rule. Before expanding COPPA’s protections to all users under 17, Congress should perform an impact assessment that weighs the impact on innovation and the digital economy against the potential privacy benefits for teens. Finally, Congress should not ban targeted advertising.

These changes would limit the potential negative impacts of KOSA and COPPA 2.0 to free speech, privacy, and the digital economy while maintaining the bills’ most important protections for children. Children deserve to be safe online and have their privacy protected, but they also deserve a flourishing online space filled with opportunities for education, entertainment, and connection.
