Age-Appropriate Design Codes Are Just Age Verification in Disguise
Maryland enacted an age-appropriate design code on May 9, 2024, becoming the second state to limit online platforms from displaying certain types of content policymakers have deemed inappropriate for users under a certain age. Although policymakers intend for the law to protect children online, it will instead lead online services to implement age verification systems that infringe on users’ privacy and create barriers to accessing online services. Moreover, if other states follow, they will create a state-by-state patchwork that harms consumers with inconsistency and overburdens platforms with duplicative compliance costs. Few companies will design their web presence just for Maryland; they will apply these rules nationally. In short, Maryland’s law will have negative effects not just for the kids it aims to protect but for all users.
The Maryland Kids Code applies to online services that do business in the state of Maryland, collect consumers’ personal data, and meet at least one of three thresholds: annual gross revenues over $25 million, handling the personal data of at least 50,000 users or devices, or deriving at least 50 percent of annual revenue from the sale of consumers’ personal data. These size-based thresholds ignore the harmful content and activity present on smaller online services. The law requires covered online services to implement certain privacy and safety features by default to protect children, including preparing data protection impact assessments that evaluate the potential harm to children of any product reasonably likely to be used by them. This is not an area states should regulate piecemeal; it is up to Congress to finally pass a national privacy law that would preempt such state laws.
Age-appropriate design codes establish standards online services must follow to make their services safe for children. It is highly unlikely online services can comply with many of these provisions without age verification to determine which users are children. That, in turn, would require all users to submit personal data to prove they are 18 or older before accessing websites, apps, and online services “reasonably likely to be accessed by children.” This type of age verification would severely undermine user privacy and anonymity online. Online services may have to collect and retain information from users’ government-issued IDs, which includes not only an individual’s date of birth—something users already must provide to create an account on many social media platforms—but also additional sensitive personal data such as an individual’s full name and address.
This data collection would pose privacy risks that may deter some users from participating in certain online activity, particularly users who value their anonymity, which often includes members of vulnerable populations. Age verification could also bar those without a form of government-issued ID from using certain online services. There are better ways to protect children that would be less invasive and less burdensome for businesses and consumers, but existing age-appropriate design codes fail to take these factors into account. As currently crafted, age-appropriate design codes are too vague and bring with them a flood of potential privacy violations. Yet multiple states have considered similar legislation despite the potential costs. Even if other states decide not to go down this path, if online services change the way they operate to comply with the Maryland Kids Code, Maryland lawmakers will effectively impose their will on the rest of the nation.
Maryland’s law also bans covered online services from automatically analyzing users’ personal data to infer characteristics about them, unless the platform deems the profiling necessary and can demonstrate it has safeguards in place; collecting, selling, sharing, or retaining certain types of children’s personal data; processing a child’s precise geolocation data, unless doing so is strictly necessary to provide the requested online product and then only for a limited time; using “dark patterns” on children; or allowing anyone other than a child’s parent or guardian to monitor the child’s online activity without notifying the child. These data minimization provisions require organizations to collect no more data than necessary to meet specific needs. That hurts organizations that cannot know in advance which data will prove most valuable when deciding what to collect, and it limits their ability to analyze data when developing new products or services. Data minimization requirements are designed to restrict data collection on the assumption that collection itself is harmful, ignoring the public benefits that data can provide.
Violations of the Maryland Kids Code are subject to fines of $2,500 per affected child for negligent violations and $7,500 per affected child for intentional violations. The law also includes a right to cure for online services that are “in substantial compliance” with the law, allowing businesses that largely comply to avoid liability as long as they fix the violation and take measures to prevent future ones.
The Maryland Kids Code is clearly modeled on the California Age-Appropriate Design Code Act (CAADCA), which requires online services that children are “likely to access” to consider the best interests of children and prioritize children’s privacy, safety, and well-being over commercial interests. Like the CAADCA, the Maryland Kids Code is likely to face legal challenges on First Amendment grounds, though supporters say legislators crafted the new law to withstand court challenges like those California has faced.
California legislators modeled the CAADCA on children’s online safety regulations in the UK. The UK’s Children’s Code applies to platforms “likely to be accessed by children”—in other words, virtually all platforms. California—and now Maryland—continue a trend of following in Europe’s problematic footsteps of creating heavy-handed Internet regulations that lead to less innovation in the digital economy and a worse online experience for users.
Age-appropriate design codes come from a noble place—the desire to protect children—but have serious flaws and threaten to do more harm than good. Instead of ID-based age verification, Congress should require device makers and platforms hosting age-restricted content to establish a “child flag” system—so everyone is assumed to be an adult unless they are marked as a child. Consumers should also be given more autonomy to opt in or out of certain safety and privacy features. Finally, a federal data privacy law could replace the patchwork approach with a single national standard, but only if Congress maintains a light-touch approach that does not hinder productivity and innovation or impose significant costs on businesses and the economy.