We’ve been hearing a lot of discussion about encryption these days. The government proposes putting “backdoors” in encryption algorithms so that law enforcement and security agencies can monitor communication between entities who pose a threat to “our” security. Clay Bennett won a Pulitzer Prize in 2002 for an editorial cartoon that expertly captures the security vs. privacy issue.

The security vs. privacy argument is a see-saw: the more security you want, the less privacy you have. It is not a “vice versa” situation, though; more privacy does not necessarily mean less security. Security advocates usually say, “If you’re not doing anything wrong, then you have nothing to worry about.” There are lots of flaws in this argument. The most common one: who gets to define “wrong”? Does “wrong” mean illegal activity, or does it include dissent, for example? A common definition of privacy is the “right to be left alone.” Bruce Schneier, in his essay “The Eternal Value of Privacy,” argued that the “real choice is privacy versus control.”
Encryption provides a way to hide something you send or store from unauthorized entities. It can be as basic as speaking a foreign language or as sophisticated as algorithms built on higher-order mathematics. The Navajo code talkers employed such an “encryption” method: communicating in a language the enemy could not understand. As with any process, it can be used for good or evil. You “break” this encryption technique by finding someone fluent in the language being used.
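The basic idea can be sketched in a few lines of Python. This is a toy illustration only (XOR with a repeating key is trivially breakable, nothing like a real cipher), and every name in it is hypothetical; the point is simply that the same shared secret that scrambles the message also unscrambles it, just as a shared language did for the code talkers:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy 'cipher': XOR each byte with a repeating key.
    Applying it twice with the same key restores the original."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"meet at dawn"
key = b"shared secret"

ciphertext = xor_cipher(plaintext, key)   # unreadable without the key
recovered = xor_cipher(ciphertext, key)   # readable again with it

assert ciphertext != plaintext
assert recovered == plaintext
```

Anyone holding the key (the “fluent speaker”) reads the traffic; anyone without it sees noise.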
Suppose you put your tax papers in a vault to protect them from unauthorized access, and you use a lock and key to gain access. A backdoor would be a master key for that lock. Common sense tells us that a) the master key needs to be guarded at all times, b) the person who holds the master key must not be evil, and c) the person who holds the regular key should know a master key exists.
In the 1990s, the federal government proposed a method, the Clipper chip, that would allow law enforcement and security agencies to decrypt encrypted information. The resulting uproar was instrumental in shooting the proposal down, but it also showed how little people understood about how encryption works. The Clipper chip was a “backdoor” way to decrypt a file or transmission. The arguments used by current administration officials are the same ones used 20 years ago.
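The escrow idea behind Clipper-style proposals can be sketched in a few lines. This is a minimal toy model, not the actual Clipper design: XOR stands in for real cryptography, and every name and key here is hypothetical. The message key is wrapped twice, once for the recipient and once under an escrowed master key, so whoever holds the master key can read any message without knowing any user’s key:

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for a real cipher (repeating-key XOR)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

MASTER_KEY = os.urandom(16)   # the escrowed "backdoor" key

def encrypt(plaintext: bytes, user_key: bytes) -> dict:
    msg_key = os.urandom(16)  # fresh key for this one message
    return {
        "ciphertext": xor_bytes(plaintext, msg_key),
        "wrapped_for_user": xor_bytes(msg_key, user_key),
        "wrapped_for_escrow": xor_bytes(msg_key, MASTER_KEY),  # the backdoor
    }

def decrypt_as_user(msg: dict, user_key: bytes) -> bytes:
    msg_key = xor_bytes(msg["wrapped_for_user"], user_key)
    return xor_bytes(msg["ciphertext"], msg_key)

def decrypt_with_master(msg: dict) -> bytes:
    # No user key needed: the escrowed master key unwraps everything.
    msg_key = xor_bytes(msg["wrapped_for_escrow"], MASTER_KEY)
    return xor_bytes(msg["ciphertext"], msg_key)

user_key = b"user-secret-key!"
msg = encrypt(b"tax records", user_key)

assert decrypt_as_user(msg, user_key) == b"tax records"
assert decrypt_with_master(msg) == b"tax records"  # no user key required
```

The sketch makes the vault analogy concrete: `MASTER_KEY` is the master key for every lock, and everything in the next section follows from the question of who guards it.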
So what’s the problem? Well, in the digital world, copies can be made without the owner’s knowledge. Any good hacker will try to get that “master” key and use it. It’s folly to assume a digital “master key”/backdoor would never be compromised. The 2011 RSA hack and the 2013 Carbon Black attack are examples of hackers going after “master” keys with success. A key reason to use encryption is to protect data at rest and in transit. I note that these methods, while straightforward, require effort and skill to complete successfully.
There are attack points that do NOT require knowing your encryption key. Why? The data is in the clear when it’s entered through a keyboard or being handled inside a program. If you write a tool to grab the data at those points, you get the data in the clear. Keystroke recorders capture all keystrokes and write the output to a file. Tom Wilson and I wrote a paper in 1996 showing how keystroke recorders could be embedded in an email attachment. This is nothing new. The first public reporting of this technique came in 1998, when the FBI used a keystroke recorder against a mafia don’s (Scarfo) computer. The recorder allowed them to make a copy of his private key (password) and read information from his encrypted files. The 1998 FBI Scarfo keylogger, 2001 Magic Lantern, 2009 CIPAV, 2012 Operation Torpedo, and 2013 Freedom Hosting tools were all very effective and didn’t require an “encryption” backdoor.
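The point above can be shown in a few lines. This is a hedged sketch, not a real keylogger: the function names are invented, and toy XOR stands in for a real cipher. What it demonstrates is the order of operations that defeats encryption without touching it: the capture hook sits before the encrypt step, where the data is still in the clear.

```python
captured = []  # the recorder's hypothetical log "file"

def read_keystrokes(text: bytes) -> bytes:
    # Stand-in for keyboard input; a keystroke recorder hooks in
    # right here, before any encryption ever runs.
    captured.append(text)
    return text

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Stand-in for any strong cipher (toy XOR here). Its strength is
    # irrelevant, because the recorder already has the plaintext.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

passphrase = read_keystrokes(b"my PGP passphrase")
ciphertext = encrypt(passphrase, b"some-fixed-key!!")

assert ciphertext != passphrase            # encryption did its job
assert captured == [b"my PGP passphrase"]  # but the recorder won anyway
```

This is exactly why the Scarfo-era tools worked: they never attacked the cipher, only the moment before it.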
The introduction of “backdoors” into any encryption algorithm destroys the algorithm as an encryption tool. The backdoors will become publicly known eventually.