
The Ethics of Encryption

Margaret Steen

L-R: Irina Raicu, Jonathan Mayer, Marshall Erwin, and David J. Johnson

Strong encryption is key to good data security. But how strong is too strong? Is there a point at which encryption jeopardizes security rather than preserving it? Should companies be allowed to create encryption that the government can’t crack?

A panel of experts discussed “The Ethics of Encryption” as part of the Business and Organizational Ethics Partnership at Santa Clara University’s Markkula Center for Applied Ethics.

The panelists included David J. Johnson, Special Agent in Charge of the San Francisco Division of the FBI; Marshall Erwin, a senior staff analyst at Mozilla and non-resident fellow at Stanford University; and Jonathan Mayer, a Ph.D. candidate in computer science and law school graduate at Stanford University.

The panel was moderated by Irina Raicu, director of Internet ethics at the Ethics Center. She pointed out that despite the differing perspectives on how to use encryption, “both sides are talking about protecting people.”

Mayer set the stage for the discussion by explaining the two types of encryption at issue: end-to-end encryption and authentication, in which one person sends a message to another and not even the service transmitting the message can tell what’s in it; and device encryption, in which the contents of a smartphone’s or tablet’s storage are encrypted by default. Device encryption complicates the work of law enforcement, since investigators may find a device but be unable to see its contents.
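To make the distinction concrete, here is a minimal sketch of the end-to-end model, written with the PyNaCl library; the library choice, key names, and sample message are illustrative assumptions rather than anything discussed by the panel. The point is that only the sender and the intended recipient hold the keys, so a service relaying the message sees nothing but ciphertext.

```python
# Minimal sketch of end-to-end encryption (illustrative only; uses PyNaCl/libsodium).
# The relaying service only ever handles ciphertext it cannot decrypt.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The service transmitting `ciphertext` sees only random-looking bytes.

# Bob decrypts with his private key and Alice's public key; the Box
# construction also authenticates that the message came from Alice.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```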

Johnson gave an overview of the law enforcement perspective: “The threat landscape today is much more complex than it was even a year ago. I need every tool I can get in my toolbox,” he said.

He noted that the FBI is “filled with extraordinary people who firmly believe in the rule of law and protecting people’s civil liberties.” As for encryption, Johnson noted that it’s “a great, valuable tool. Without question it should be used by individuals and businesses to protect their data, and by the government to protect its data. However, there have to be some limits on it, and it gets pretty complicated pretty quickly.”

It is very rare, Johnson said, that the FBI will try to get access to protected data without a court order. The issue is not whether to expand the scope of information the government has access to, but rather whether to help the government get access to information it is legally entitled to have.

The FBI gets court orders allowing it access to both real-time communication, such as telephone calls, and data that is stored on a particular device, Johnson said.

Some service providers, like telephone companies, are required to help law enforcement get the information ordered by a court. However, this is a case where technology has moved faster than the law. “We have a big population of providers out there that are under no obligation to provide the government with technical assistance” to obtain access to information, Johnson said. With both Apple and Google moving to default encryption in their operating systems, it’s becoming a more visible issue.

“There is information that is important to criminal and national security investigations that we will not get otherwise,” Johnson said. “Any solution would provide some relief to the government when it satisfied the legal requirements to get to that particular data. It also has to be done fairly,” with consideration given to cybersecurity, civil liberties, innovation and global competitiveness.

Erwin, who has worked in national security as well as at Mozilla, then discussed how Mozilla makes decisions about encryption. One key principle: “Individual privacy and security on the Internet are fundamental and must not be treated as optional,” he said.

Therefore, Erwin said, Mozilla strongly supports the use of encryption technologies and uses encryption to protect its own users' data, though it first considers whether it’s necessary to collect the data at all. There are also some situations in which encryption isn't necessary to protect individual privacy and security: performance data about the browser may be helpful to Mozilla, for example, but would be of little interest to other organizations. And the company is always trying to improve performance as well as privacy, making sure security measures don't make the product cumbersome to use.

As for law enforcement’s desire for access to encrypted information, he said, “When you grant law enforcement access, what you are ultimately talking about doing is creating a vulnerability in the system that allows them to have access.” This is fairly risky: “It’s very difficult to create a vulnerability in the system that can be exploited only by trusted actors.”

Mayer agreed that weakening security to facilitate government access to data would pose technological risks. He also argued that government should not be able to compel technology vulnerabilities or backdoors, on both law and policy grounds.

He said the legal system has not historically required people to collect or preserve information just because it may be helpful to the government. Just as people do not have a duty to act to help others, they have no legal obligation to help the government with its surveillance efforts. “The legal system has long treated affirmative acts very differently from negative acts,” he said.

As a matter of policy, Mayer said, “the best way to ensure trust is the strongest possible encryption.” Introducing vulnerabilities creates concerns about both free speech and human rights.

Erwin pointed out that in several recent terrorist attacks, it has become clear that the government had information beforehand that could have provided a warning, if it had been properly analyzed. “We do not suffer from a lack of information about terrorist activities – we have an abundance of information,” he said. To an audience member who wondered if encryption could pose an existential threat to law enforcement, Erwin said: “I’m skeptical that we’re there now. I think the trend line is about more data collection and more access.”

Another question for technology companies: If they build in access that they will grant to U.S. authorities, what about the governments of other countries?

“What if a demand from a U.S. agency is incompatible with local law in another country?” Mayer asked. “Moving to pervasive encryption opts a company out of that whole morass.”

Margaret Steen is a freelance author.

Feb 1, 2015