Weakening Encryption Would Put Vulnerable Populations at Risk

December 4, 2019

Encryption is making it more difficult to solve child exploitation cases, allege government officials from the United States, the United Kingdom, and Australia in an open letter to Facebook CEO Mark Zuckerberg. Therefore, they argue, technology firms should address the problem by designing their products to give law enforcement authorities access to the content of encrypted consumer data. But this is not the first time they have cried wolf, and these latest claims are part of a misleading campaign by law enforcement to subvert encryption, a move that could expose vulnerable populations to new risks.

One of the authorities’ main complaints is that Facebook plans to implement end-to-end encryption across its messaging services, which would prevent any third parties, including government, from accessing users’ communications. But while law enforcement needs tools to protect the public, requiring companies to provide access to encrypted consumer data would have the unintended consequence of putting vulnerable populations at risk without solving law enforcement’s most significant challenges in using digital evidence.

The U.S. government has a long history of opposing encryption. For example, in the 1990s, the FBI tried and failed to get phone makers to adopt the “Clipper Chip,” a hardware module developed by the NSA that would have allowed law enforcement to access any encrypted phone in the United States. In 2016, the FBI tried and failed to force Apple to modify its software to let the government circumvent the encryption on iPhones. In both cases, law enforcement argued it needed access to encrypted data to protect public safety. Having lost these arguments, it is now making the case that it needs extraordinary access to encrypted information to prevent child exploitation.

The government argues it is reasonable to ask technology firms to design their systems so they, or a designated third party, can provide law enforcement access to encrypted data under certain conditions. But this proposal has several flaws.

First, key-escrow systems introduce a new attack vector that others can exploit to circumvent encryption. While encryption can offer users mathematical guarantees of security, key-escrow systems are only as trustworthy as the humans running them. For example, they are vulnerable to insider attacks, in which an employee might steal or misuse an escrowed encryption key. And since firms would likely receive a high volume of data requests from domestic and foreign law enforcement, they would need a commensurate number of employees with access to escrowed keys, which would increase the risk of human error or intentional malfeasance. Indeed, government agencies do not have a spotless track record when it comes to protecting sensitive personal data with internal controls. For example, the State Department and the Internal Revenue Service have histories of employees inappropriately accessing passport records and tax files.
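To see why escrow changes the security model, consider a minimal sketch in Python (using the third-party cryptography package). The escrow store, the provision_user helper, and the scenario are all hypothetical, invented purely for illustration; the point is simply that every escrowed key is a second, independent path to the plaintext, so the confidentiality of every message degrades to whatever the escrow database’s access controls can guarantee.

```python
# Toy model of key escrow: a minimal sketch, not any real system's design.
# Symmetric Fernet keys stand in for a messenger's more complex key
# exchange; the escrow problem is the same either way.
from cryptography.fernet import Fernet

escrow_db = {}  # hypothetical escrow store mapping user -> copy of key


def provision_user(user: str) -> Fernet:
    """Generate a key for a user and deposit the mandated second copy."""
    key = Fernet.generate_key()
    escrow_db[user] = key
    return Fernet(key)


alice = provision_user("alice")
token = alice.encrypt(b"relocation plan for a survivor of abuse")

# Intended path: only Alice's device holds the key.
assert alice.decrypt(token) == b"relocation plan for a survivor of abuse"

# Insider path: anyone who can read escrow_db (a careless or corrupt
# employee, a phished account, a hostile government) decrypts the same
# message without ever touching Alice's device.
insider = Fernet(escrow_db["alice"])
print(insider.decrypt(token))
```

In this toy model, breaking one user’s encryption no longer requires breaking the mathematics or compromising her device; it requires only read access to a single database, which is exactly the kind of centralized target that insider attacks and foreign adversaries go after.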

Creating new vulnerabilities in secure communication systems would be particularly undesirable because strong encryption has helped protect vulnerable individuals, including victims of abuse and LGBTQ minors, who use encryption to communicate privately and to seek help without fear of reprisal. For example, victim advocates use encryption to privately discuss relocation plans with survivors of domestic abuse. Encryption has helped numerous other vulnerable individuals as well, including journalists who use encrypted messaging services to communicate with confidential sources. Limiting end-to-end encryption, or building backdoors into it, would expose these groups to new threats. Meanwhile, bad actors would likely seek out foreign tools and services that lack such vulnerabilities, leaving vulnerable populations with weaker security protections than criminals.

These threats are not hypothetical. Indeed, the encrypted messaging service WhatsApp recently filed a lawsuit against Israeli cybersecurity firm NSO Group for allegedly using malware in an attempt to surveil the mobile communications of 1,400 individuals, including journalists and human rights activists.

There are better ways for law enforcement and Internet firms to combat child exploitation than undermining encryption, including tracking metadata about user behavior, infiltrating networks of bad actors, and screening images before allowing them to be uploaded. Facebook and others are pursuing these options, and U.S. law enforcement officials should, too.
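Screening at upload time typically means comparing a fingerprint of the file against a database of known abusive material; Microsoft’s PhotoDNA is the best-known production example. The sketch below is a simplified, hypothetical illustration that uses an exact cryptographic hash and an invented blocklist; real systems use perceptual hashes so that resized or re-encoded copies of a known image still match, and blocklist entries come from clearinghouses such as NCMEC rather than being generated locally.

```python
# Simplified upload-screening sketch using exact-match hashing against a
# hypothetical blocklist. Production systems (e.g., PhotoDNA) instead use
# perceptual hashes that survive resizing and re-encoding.
import hashlib

KNOWN_BAD_HASHES: set[str] = set()  # invented blocklist for illustration


def fingerprint(data: bytes) -> str:
    """Exact fingerprint of a file's bytes; a stand-in for perceptual hashing."""
    return hashlib.sha256(data).hexdigest()


def screen_upload(data: bytes) -> bool:
    """Return True if the upload may proceed, False if it matches the list."""
    return fingerprint(data) not in KNOWN_BAD_HASHES


# Register one known image, then screen two uploads against the list.
known_bad = b"bytes of a previously identified abusive image"
KNOWN_BAD_HASHES.add(fingerprint(known_bad))

print(screen_upload(b"an ordinary photo"))  # True: allowed through
print(screen_upload(known_bad))             # False: blocked and reportable
```

Notably, this kind of screening happens before content is encrypted in transit, which is why it can coexist with end-to-end encryption rather than requiring a backdoor into it.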

Furthermore, when it comes to digital evidence, law enforcement’s biggest challenge is not its inability to access encrypted data, but rather that agencies often struggle to identify which firms hold data relevant to their investigations. The U.S. government needs to devote more resources to this problem. The U.S. National Domestic Communications Assistance Center, which facilitates cooperation between industry and law enforcement, had a budget of only $11.4 million as of 2018. In addition, law enforcement receives limited training on how to handle, recover, and store digital evidence.

Instead of undermining efforts to ensure all Americans, including children, can communicate privately and securely with end-to-end encryption, the U.S. government should focus on funding these more tractable law enforcement challenges. The United States also should work with its allies to establish standard practices for how to access, analyze, and share relevant data. For example, it is a positive development that the United States and the United Kingdom recently agreed to expedite data requests by allowing their law enforcement agencies to go directly to technology firms based in the other nation, instead of going through that nation’s government first.

It may be difficult to balance law enforcement’s need to access data with the need to secure people’s private communications, but undermining encryption would tip the scales too far away from protecting privacy and security, and would risk hurting more people in the process.
