Authors: Jeremy Gillula, Staff Technologist, Electronic Frontier Foundation; Stewart A. Baker, Partner, Steptoe & Johnson LLP; Paul Rosenzweig, Red Branch Consulting, PLLC, and Senior Advisor, The Chertoff Group
Interviewer(s): Jonathan Masters, Deputy Editor
January 28, 2015
The move by major technology companies like Apple and Google to sell products with advanced encryption has pushed the debate over digital privacy and security to a critical stage. Some policymakers are pushing for new laws that would require tech manufacturers to ensure that government investigators could access suspects' digital information. Meanwhile, privacy advocates say such measures are unnecessary and may undermine security for all. CFR asked three experts to weigh in on how technology firms, in designing their products and services, should balance the privacy demands of their customers with the security concerns of police and counterterrorism agencies.
Apple's announcement in September that its iOS 8 mobile operating system would feature encryption by default has launched a spirited public debate over whether technology firms should be legally required to compromise the otherwise secure systems they market to consumers.
Law enforcement, namely the FBI, has answered with a resounding "Yes." They claim that as more data is encrypted, they are increasingly unable "to access the evidence [they] need to prosecute crime and prevent terrorism even with lawful authority." They call the process "going dark."
But the numbers don't back up these assertions. In 2013, encryption foiled only nine out of 3,576 federal and state wiretaps, according to the federal judiciary. It is a huge leap from one quarter of one percent to "going dark." Increasing the security of our digital systems won't stop law enforcement from prosecuting and preventing crime. Police have a wide variety of investigative tools at their disposal, and only an incredibly intelligent criminal could stymie every single one (and such criminals have already had access to strong cryptography for years).
Would introducing backdoors (secret access methods that investigators can use to overcome otherwise secure systems) make law enforcement's job easier? Of course. But there are lots of other tools that would make their job easier, and we've decided as a nation that these would violate our basic rights enshrined in the Fourth Amendment.
The problem is that backdoors also make criminals' jobs easier. There's no such thing as a system insecure enough for police to gain access, but secure enough to guard against criminals, malicious foreign agencies, and other bad actors. Computer science just doesn't work that way.
Indeed, we have examples of backdoors that led to major digital breaches: the hacking of Greece's cell phone system in 2006, a similar incident in Italy between 1996 and 2006, and the hacking of Gmail in 2010. Instead of protecting us, law enforcement is supporting policies that would make us and our private information less safe. Regrettably, they are trying to frame this debate as one of privacy versus security, when in reality we can and should have both.
Companies must reflect the values of the countries where they do business, at least if they want to stay in business. Unfortunately, in the most recent encryption debate, much of Silicon Valley has mistaken its own left-libertarian values for those of the world. In fact, surprisingly few people outside the Silicon Valley bubble want to live with the potentially dangerous consequences of giving unbreakable end-to-end encryption to everyone.
Original article: Debate Simmers over Digital Privacy