I Don’t Believe We Will Ever Have A Privacy Utopia
Martin Hellman received the 2015 ACM A.M. Turing Award with Whitfield Diffie for inventing and promulgating both asymmetric public-key cryptography, including its application to digital signatures, and a practical cryptographic key-exchange method. He is Professor Emeritus at Stanford University. He shares his insights on cryptography, its evolution, and its impact.
If you look at the evolution of cryptography as a technique to secure digital assets, what, from your perspective, are the key inflection points/milestones that have transformed cryptography into the unique discipline that it is today?
What follows is a survey of the most important milestones in this field over the past century. In 1883 Auguste Kerckhoffs published two essays on “Military Cryptography.” The most important of his five principles for modern cryptography was his clear enunciation that the “general system” (e.g. a piece of military hardware or the Advanced Encryption Standard) must be considered public information. That is true even if, as in the case of military hardware, it is kept secret—because there is too great a danger that it will be compromised or captured. All security, he therefore concluded, must reside solely in the secrecy of the key. Ironically, this created an intellectual barrier to cryptographers comprehending the possibility of public-key cryptography. If all security must reside in the secrecy of the key, how can there be a public key? Of course, what Whitfield Diffie and I did with the introduction of public-key cryptography is to break the key into two pieces—one public and one secret. All security resides in the secrecy of the secret key.
The development of the telegraph and radio created a need for both cryptanalysis and stronger encryption, giving the field a much-needed jolt. World Wars I and II accelerated progress even more, as exemplified by Bletchley Park’s primitive computers for cryptanalysis. Claude Shannon’s work on cryptography at Bell Telephone Labs during World War II laid an important foundation for modern cryptography. Appearing initially in a classified 1945 report (some versions are dated 1946), it was declassified and published in the Bell System Technical Journal in 1949, a year after his more famous papers giving birth to information theory. This 1949 paper, given to me around 1970 by Prof. Peter Elias of MIT, was one of the key steps in my coming to work in the area; my PhD had been in information theory.
Horst Feistel’s 1973 Scientific American paper and IBM’s development of a first-rate cryptographic team outside the military were another key advance, both for the field and for me personally. I had worked at IBM from 1968 to 1969 and, though I didn’t work in cryptography, I was in the same department as Horst and had a number of discussions with him. This was a second key step in my coming to work in cryptography.
On March 17, 1975, the US National Bureau of Standards (NBS), now the National Institute of Standards
and Technology (NIST), published the proposed Data Encryption Standard (DES). Whit Diffie and I fought NBS over the 56-bit key size and secret design principles but failed to get a larger key or any information on the design principles. While losing that battle, we won the larger war since the current standard, the Advanced Encryption Standard (AES), has a minimum key size of 128 bits and was designed in a transparent manner.
In the Fall of 1974, Ralph Merkle took the CS244 course at UC Berkeley and proposed the privacy part of public-key cryptography as a term project. The professor liked his other proposal better, so Ralph dropped the course and proceeded on his own. He later submitted a paper to Communications of the ACM (CACM) which was rejected (see my article in the December 2017 issue of CACM), but he persevered, and it was finally published in the April 1978 issue. Whit and I were unaware of Ralph’s work until 1976, after we had independently discovered public-key cryptography. Even though his paper appeared over a year after ours, Ralph’s work has priority based on his submission date, while ours included digital signatures and a workable system for privacy.
In mid-1975, before we were aware of Merkle’s work, Whit and I came up with the concept of public-key cryptography but didn’t have a workable system. In November 1976, my paper with Whit Diffie, “New Directions in Cryptography,” was published in the IEEE Transactions on Information Theory, and it brought the concept of public-key cryptography to public attention. The paper also introduced what is now usually called “Diffie-Hellman Key Exchange,” which we devised in May 1976. It solved half of the public-key problem (privacy), but did not provide digital signatures (authentication).
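The exchange can be sketched in a few lines. This is a toy illustration with deliberately tiny numbers, not a usable implementation; real deployments use primes of thousands of bits (or elliptic curves), and all the values below are made up for the example:

```python
# Toy Diffie-Hellman key exchange (illustrative values only).
p = 23          # public prime modulus (toy-sized)
g = 5           # public generator (toy-sized)

a = 6           # Alice's secret exponent
b = 15          # Bob's secret exponent

A = pow(g, a, p)   # Alice publishes g^a mod p
B = pow(g, b, p)   # Bob publishes g^b mod p

# Each side raises the other's public value to its own secret:
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob   = pow(A, b, p)   # (g^a)^b mod p

# Both arrive at the same secret, g^(ab) mod p, while an
# eavesdropper sees only p, g, A, and B.
assert shared_alice == shared_bob
```

Security rests on the difficulty of recovering `a` or `b` from the public values, the discrete logarithm problem, which is easy at this toy size but believed intractable at real key sizes.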
In 1977, Ron Rivest, Adi Shamir, and Len Adleman at MIT developed the RSA public-key cryptosystem—the first fully functional public-key system (both privacy and signatures). They published their paper in the February 1978 issue of CACM. Around 1995 the Internet began to take off, creating a huge need for public-key cryptography.
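A toy sketch shows how RSA provides both halves of the problem. The primes and exponents below are textbook-small examples chosen for illustration; real RSA uses primes of 1024+ bits and padding schemes, which this sketch omits:

```python
# Toy RSA (illustrative values only; no padding, tiny primes).
p, q = 61, 53
n = p * q                  # public modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: d*e = 1 (mod phi)

m = 65                     # message encoded as a number < n

# Privacy: encrypt with the public key, decrypt with the private key.
c = pow(m, e, n)
assert pow(c, d, n) == m

# Signatures: the same keypair with the roles reversed —
# sign with the private exponent, verify with the public one.
sig = pow(m, d, n)
assert pow(sig, e, n) == m
```

The symmetry of the two assertions is the point: one trapdoor function gives both privacy and authentication, which neither Merkle's scheme nor Diffie-Hellman alone provided.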
The internet is insecure, yet credit card transactions, electronic banking, and many other applications require security.
Two years later NIST started developing what became the Advanced Encryption Standard (AES). As noted above,
AES was done right and replaced DES, which suffered from a marginal key size and opaque design process.
At some point in the future, quantum computers with thousands of qubits might become available and break the vast majority of public-key systems now in use. Research on “post-quantum cryptography” is under way to deal with this challenge.
Do you think Cold War terms like MAD have lost their relevance to nuclear deterrence? Can you amplify the “risk models” that can defuse the nuclear threat?
There is tremendous misinformation and “illogical logic” surrounding nuclear deterrence. My wife and I cover this extensively in our new book, A New Map for Relationships: Creating True Love at Home & Peace on the Planet. (See Chapter 8, “How Logical is Nuclear Deterrence?” starting on page 243.) As just one example, governments usually talk as if nuclear deterrence were essentially risk-free. Yet, even if it could be expected to work for 500 years before failing—a time frame that seems highly optimistic to most people—that would be as risky as playing Russian roulette with a newborn child. That’s because 1/6 of 500 years is 83 years, roughly that child’s life expectancy.
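The arithmetic behind the analogy can be checked directly. Treating the 500-year figure as an annual failure probability of 1/500 (an assumption made for this back-of-the-envelope sketch), the cumulative risk over an 83-year lifetime comes out close to the 1-in-6 odds of Russian roulette:

```python
# Back-of-the-envelope check of the Russian-roulette analogy:
# deterrence expected to fail once in 500 years implies roughly
# a 1/500 chance of failure in any given year.
annual_failure = 1 / 500
lifetime = 83   # years, roughly one sixth of 500

# Probability of at least one failure during that lifetime.
p_fail = 1 - (1 - annual_failure) ** lifetime
# p_fail is about 0.15, close to the 1/6 (about 0.17) chance
# of a single pull of the trigger in Russian roulette.
```

The compounding makes the cumulative figure slightly below 1/6, but the order of magnitude is the point of the analogy.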
What are “crypto wars?” Can you demystify this term?
I usually identify the first crypto war as starting in 1975, with Whit’s and my critique of the DES 56-bit key. It intensified the next year when we published “New Directions in Cryptography.” The US National Security Agency (NSA) basically maintained that our work was “born classified.” While they won the DES key size battle, we won the conflict over the right to publish our papers without government interference.
What I call “the second crypto war” occurred in the 1990s over “key escrow,” the Clipper chip (developed by the NSA to secure voice and data messages, it included what is usually termed “a back door,” but I call “a front door” since it was known to exist), and related attempts by the US government to gain access to encrypted information when it had a legitimate need to do so. The problem is that giving them access introduces security weak points, so there’s an unavoidable tradeoff between law enforcement/national security access and security against bad actors, including potential rogue elements within those two communities. The 1996 National Research Council “CRISIS report” (Cryptography’s Role In Securing the Information Society) helped defuse the fight by recommending that the government experiment with key escrow for its own uses and, if it could overcome the barriers, present the solution for consideration. It never did so. I served on that NRC committee, along with a former attorney general representing law enforcement’s interests and a former Deputy Director of NSA representing national security interests. The committee’s conclusions were unanimous. That report also recommended a considerable relaxation of export controls on cryptographic equipment and software. Relaxation along the lines of our recommendations occurred soon afterward.
What I think of as “the third crypto war” is largely a repeat of the second. The FBI’s insistence several years ago that Apple help it circumvent security on the iPhone used by the San Bernardino mass shooter was very similar to the second crypto war. It seems to me that some people in the government need to study their history.
If you look at security threats faced by enterprises, can they leverage cryptography for proactive security?
Yes, but poor implementation is a huge problem. As just one example, I frequently get emails from my bank and brokerage houses with links to their websites. I never use those links, since they could be clever phishing attacks. I always use URLs that I have stored on my own computer. The banks are basically teaching people to fall for phishing attacks.
What are your views on privacy? Do you think a utopian scenario is possible?
I don’t believe we will ever have a privacy utopia. There will always be tradeoffs, and not just with law enforcement and national security concerns. A single identifying number, like my Social Security number, makes my life much more convenient than if I had to memorize separate numbers for each website. But that also makes it easier for a crook to break into my accounts.