Trust Us, We Know Best: The Steady Rise of Privacy Paternalism

June 24, 2021

The foundation of data privacy is the principle of notice-and-choice—the concept that organizations should provide consumers information so that they can make decisions about how they share their personal information. As the FTC wrote in its seminal report on online privacy to Congress in 1998, “without notice, a consumer cannot make an informed decision as to whether and to what extent to disclose personal information.” By requiring organizations to adhere to certain standards of transparency, laws and regulations can empower consumers to choose when and where they share their personal information. But some policymakers are increasingly turning their backs on consumer choice, instead arguing that the government should be in charge of these decisions, effectively creating a nanny state for data privacy.

The rationale for notice-and-choice is simple: Not all consumers have the same privacy preferences, so consumer choice provides a way to satisfy different types of individuals, thereby maximizing consumer welfare. As the noted privacy researcher Alan Westin found in his studies, there are generally three groups of people: “privacy fundamentalists” who choose privacy above all else; “the unconcerned” who readily trade their data for benefits; and “privacy pragmatists” who make up the majority of consumers and consider the costs and benefits of each opportunity. Notice-and-choice allows each group to make tradeoffs according to their preferences.

There are many legitimate ways policymakers can strengthen notice-and-choice. For example, on the notice side, standardizing privacy policies could help consumers more easily compare services, and mandatory data breach notifications could create more transparency in the market. Likewise, on the choice side, requiring organizations to allow consumers to access and delete their data creates more opportunities for consumers to manage their personal information.

Some privacy advocates pay lip service to these proposals, claiming, for example, that they too want simple privacy notices, but they are being disingenuous. These are usually the same groups that also advocate for laws that would increase penalties and create a private right of action, exposing companies to even more regulatory fines and class action lawsuits for even a single misstatement or omission in their privacy notices. Others complain that consumers suffer from notice fatigue, yet it is the very laws pushed by privacy advocates, such as the General Data Protection Regulation (GDPR), that have given rise to excessive privacy notices, such as alerts about a website using cookies.

Instead of improving notice-and-choice, most privacy advocates have spent years making it harder for consumers to allow others to collect and use their personal information, such as by demanding that organizations obtain affirmative consent before collecting or sharing consumer data. While opt-in requirements have certainly increased data collection costs, they have not curtailed data sharing as significantly as many privacy advocates would have liked, because most consumers find value in sharing their information, such as obtaining free online services or discounts in stores. In response, many privacy advocates now want to prohibit companies from providing better or cheaper services to consumers who opt into sharing their data.

Most of these privacy advocates fundamentally oppose the idea of corporations having personal information, not just about themselves (since most are privacy fundamentalists), but about anyone. Their paternalistic view of Internet users is at the heart of arguments in favor of government regulation to protect consumers from themselves. These privacy advocates do not want to give consumers a choice; they want to make the choice for consumers. They believe consumers who do not make the same choices as privacy fundamentalists are unenlightened and need to be saved from the grips of surveillance capitalism.

There are two main rationales that advocates increasingly use for why the government should make privacy choices for consumers.

First, some argue that large tech firms are abusing their market dominance to extract higher “prices” from consumers in the form of greater data collection. They assert that these companies are forcing consumers to turn over their personal data against their will, and consumers are helpless to stop it because these companies are too big and there are no viable alternatives that do not invade consumer privacy. But this overly simplistic analysis ignores the many consumer privacy controls available on large tech platforms like Apple, Facebook, and Google, and alternatives like DuckDuckGo, which explicitly cater to individuals with higher privacy preferences.

Second, some argue that tech firms use deceptive practices to trick consumers into giving up their personal data (practices which privacy advocates derisively refer to as “dark patterns”). According to this theory, consumers cannot exercise effective control over their information because companies are manipulating them into giving up their personal data. Once again, this analysis assumes that Internet users are witless simpletons, unable to make decisions for themselves, and that the only option is for the government to protect them.

Most privacy advocates do not want to be seen as explicitly opposing consumer choice or admit that their endgame is privacy paternalism, so they tend to mask their arguments. But the facade is crumbling. For example, FTC Commissioner (and outgoing acting chair) Rebecca Kelly Slaughter noted at a recent public event, “I want to sound a note of caution about approaches that are centered around user control...I think it is really problematic to put the burden on consumers to work through the markets and the use of data, figure out who has their data, how it’s being used, make decisions...so I really worry about a framework that is built at all around the idea of control as the central tenet or the way we solve the problem.” Similarly, incoming FTC Chair Lina Khan has argued that rather than relying on an “individualistic, consumer-centric framework” to protect consumer privacy, policymakers should pursue “clear prohibitions and economic disincentives” to prevent companies from “conditioning access to essential services on the ever-greater surrendering of personal data.”

Privacy paternalism is a misguided objective that would hurt consumer welfare. Creating one-size-fits-all rules for sharing data would be inefficient, increasing costs that would be passed on to consumers in the form of higher prices and fewer benefits. Moreover, most consumers trust the government significantly less than major Internet companies like Google and Amazon, raising doubt that consumers would prefer the government to make these decisions for them.

While privacy advocates may disagree with the choices many consumers make, that is not justification for overruling their decisions. As policymakers consider legislation to strengthen consumer privacy, they should go back to the basics of notice-and-choice.
