surveillance

It is easy for those in power and the broader public to see new technologies—the telephone, email, cloud storage, or end-to-end encryption—as somehow exempt from the privacy laws that govern other communications. But as technologies become more widely used for business and personal communication, we all come to understand the need to protect our interactions from surveillance.

Encryption is not a force for evil. It is used by businesses across the country to prevent online attacks and theft. It is common practice for hackers to target retailers in the U.S. to steal customer data and access their bank accounts, but attacks like these are largely impeded by encryption. When hackers stole the banking information of millions of customers in Target’s infamous data breach, the victims’ bank accounts were protected because their PINs were encrypted.

Encryption also promotes innovation. Businesses and individuals have little incentive to create and invent if their intellectual property will simply be stolen by criminals or foreign actors. Innovation requires protection of proprietary data.

Building in a systemic weakness, while it may make law enforcement’s job easier, does not make anyone safer. Creating backdoors for law enforcement to get around encryption would weaken the technology as a whole; backdoors can be exploited by criminals to access users’ personal data.

The tension between the Fourth Amendment and law enforcement goals is not new and is important to preserve. Officials claim that encrypted data is “warrant-proof,” but this misrepresents the purpose of a warrant. Search warrants allow law enforcement to search for evidence, but do not guarantee that they will find evidence supporting or leading to any conclusions.

Encryption doesn’t just protect us from criminals; it protects us from the prying eyes of the government. The FBI has waged a years-long battle against encryption, which it views as an impediment to investigations. The federal government has been intercepting electronic communications for as long as it has been possible; federal agents began wiretapping phone calls in the 1920s. It tends to take the government years to adjust to changes in communications technology; it took decades for the Supreme Court to place limits on warrantless electronic surveillance.

Following years of unsuccessfully pressuring tech companies to give the government a backdoor to encrypted data, Congress and the Department of Justice changed their strategy. A high-profile letter sent by U.S. intelligence agencies and numerous international allies linked end-to-end encryption with child sexual exploitation, claiming that Facebook would be harming children by offering encryption on its messaging services.

Reframing the encryption debate around the most heinous crimes imaginable is dishonest; it posits that, because criminals use their privacy to exploit the most vulnerable people in society, no one has a right to privacy. This line of thinking also presupposes that Big Brother is unfailingly benevolent; the FBI will have a backdoor to all online communications, but not to worry: they will only spy on you if you deserve it.

In a misguided attempt to address the very real problem of online child exploitation, Senators Lindsey Graham (R-S.C.) and Richard Blumenthal (D-Conn.) introduced the EARN IT Act. The legislation has honorable goals, but in practice would limit the ability of tech firms to offer encryption to users. This would restrict the ability of all Americans—including survivors of abuse—to protect their privacy and personal safety while online.

After a firestorm of criticism from across the political spectrum, the Senate Judiciary Committee approved a manager’s amendment to the EARN IT Act. The amendment resolves the aspects of the original bill that most seriously threatened encryption: businesses cannot be held liable simply for failing to take actions that would undermine encryption services, and offering encryption will not be automatic grounds for liability for child sexual abuse material (CSAM). Critics point out, however, that the amendment provides only a defense against liability, not immunity; the threat of litigation will still be sufficient to discourage tech companies from providing secure end-to-end encryption.

The amendment removes the legal authority initially granted to the commission created by the bill; the commission’s standards will be recommendations, not legal requirements. It also extends the amount of time that providers can preserve the contents of a report, which will help with the development of algorithms designed to detect CSAM.

However, the manager’s amendment brings a new set of problems to the table. It allows states to penalize companies with both civil and criminal liability. In addition to balkanizing the legal landscape for a business model that naturally crosses state lines, this would leave American tech businesses vulnerable to a firehose of destructive lawsuits. Endless litigation would likely lead to the end of America’s global leadership in tech. 

The amendment also revokes tech companies’ liability protection based on “actual knowledge” of the existence of CSAM. This is an improvement over the “recklessness” standard in the earlier version of the bill, but it still creates perverse incentives that discourage tech companies from investigating CSAM. When knowledge triggers liability, companies will avoid learning about potential CSAM cases. This will lead to less, not more, action by tech companies to stop the spread of CSAM.

The ability to use privacy services provided by a private business is part of what makes our country great and distinguishes us from authoritarian countries like China, where the government is entitled to the private data of citizens. Efforts by the government to weaken encryption should raise red flags and inspire a continued effort to protect encryption.