I Don’t Believe We Will Ever Have A Privacy Utopia

Martin Hellman received the 2015 ACM A.M. Turing Award with Whitfield Diffie for inventing and promulgating both asymmetric public-key cryptography, including its application to digital signatures, and a practical cryptographic key-exchange method.

Dataquest, Front Page

Martin E. Hellman, Professor Emeritus, Stanford University, shares his insights on cryptography, its evolution, and its impact.

If you look at the evolution of cryptography as a technique to secure digital assets, what, from your perspective, are the key inflection points/milestones that have transformed cryptography into the unique discipline that it is today?

What follows is a survey of the most important milestones in this field over the past century. In 1883 Auguste Kerckhoffs published two essays on “Military Cryptography.” The most important of his five principles for modern cryptography was his clear enunciation that the “general system” (e.g., a piece of military hardware or the Advanced Encryption Standard) must be considered public information. That is true even if, as in the case of military hardware, it is kept secret, because there is too great a danger that it will be compromised or captured. All security, he therefore concluded, must reside solely in the secrecy of the key. Ironically, this created an intellectual barrier to cryptographers comprehending the possibility of public-key cryptography. If all security must reside in the secrecy of the key, how can there be a public key? Of course, what Whitfield Diffie and I did with the introduction of public-key cryptography was to break the key into two pieces: one public and one secret. All security resides in the secrecy of the secret key.

The development of the telegraph and radio created a need for both cryptanalysis and stronger encryption, giving the field a much-needed jolt. World Wars I and II accelerated progress even more, as exemplified by Bletchley Park’s primitive computers for cryptanalysis. Claude Shannon’s work on cryptography at Bell Telephone Labs during World War II laid an important foundation for modern cryptography. Appearing initially in a classified 1945 report (some versions are dated 1946), it was declassified and published in the Bell System Technical Journal in 1949, a year after his more famous papers giving birth to information theory. This 1949 paper, given to me around 1970 by Prof. Peter Elias of MIT, was one of the key steps in my coming to work in the area; my PhD had been in information theory.

Horst Feistel’s 1973 Scientific American paper and IBM’s development of a first-rate cryptographic team outside of the military were another key advance, both for the field and for me personally. I had worked at IBM from 1968 to 1969 and, though I didn’t work in cryptography, I was in the same department as Horst and had a number of discussions with him. This was a second key step in my coming to work in cryptography.

On March 17, 1975, the US National Bureau of Standards (NBS), now the National Institute of Standards and Technology (NIST), published the proposed Data Encryption Standard (DES). Whit Diffie and I fought NBS over the 56-bit key size and secret design principles but failed to get a larger key or any information on the design principles. While losing that battle, we won the larger war, since the current standard, the Advanced Encryption Standard (AES), has a minimum key size of 128 bits and was designed in a transparent manner.
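To put those key sizes in perspective, a rough back-of-the-envelope comparison is possible; the attacker trying a trillion keys per second below is a hypothetical figure chosen only for illustration, not a real benchmark:

```python
# Brute-force keyspace comparison: DES's 56-bit key vs AES's 128-bit minimum.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

des_keys = 2 ** 56             # ~7.2e16 possible DES keys
aes_keys = 2 ** 128            # ~3.4e38 possible AES-128 keys
rate = 10 ** 12                # hypothetical: one trillion trials per second

des_hours = des_keys / rate / 3600                  # roughly a day's work
aes_years = aes_keys / rate / SECONDS_PER_YEAR      # astronomically long

print(f"DES exhausted in ~{des_hours:.0f} hours")
print(f"AES-128 would take ~{aes_years:.1e} years")
```

Each extra key bit doubles the search, so the 72-bit gap between the two standards multiplies the attacker's work by 2^72.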

In the Fall of 1974, Ralph Merkle took the CS244 course at UC Berkeley and proposed the privacy part of public-key cryptography as a term project. The professor liked his other proposal better, so Ralph dropped the course and proceeded on his own. He later submitted a paper to Communications of the ACM (CACM) which was rejected (see my article in the December 2017 issue of CACM), but he persevered, and it was finally published in the April 1978 issue. Whit and I were unaware of Ralph’s work until 1976, after we had independently discovered public-key cryptography. Even though his paper appeared over a year after ours, Ralph’s work has priority based on his submission date, while ours included digital signatures and a workable system for privacy.

In mid-1975, before we were aware of Merkle’s work, Whit and I came up with the concept of public-key cryptography, but we didn’t have a workable system until May 1976, and even then only for the privacy half of the problem. In November 1976, my paper with Whit Diffie, “New Directions in Cryptography,” was published in the IEEE Transactions on Information Theory, and it brought the concept of public-key cryptography to public attention. The paper also introduced what is now usually called “Diffie-Hellman key exchange,” which solved half of the public-key problem (privacy) but did not provide digital signatures (authentication).
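The exchange that paper introduced can be sketched in a few lines. The parameters here are toy-sized purely for illustration; real deployments use primes of 2048 bits or more (such as the RFC 3526 groups) or elliptic curves:

```python
import secrets

# Public, agreed-in-the-open parameters (toy-sized for illustration only).
p, g = 23, 5  # prime modulus and generator

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent
A = pow(g, a, p)                   # Alice sends this in the clear
B = pow(g, b, p)                   # Bob sends this in the clear

# Both sides compute the same value without ever transmitting it:
# (g^b)^a = (g^a)^b = g^(ab) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which is believed to be hard at realistic parameter sizes.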

In 1977, Ron Rivest, Adi Shamir, and Len Adleman at MIT developed the RSA public-key cryptosystem, the first fully functional public-key system (providing both privacy and signatures). They published their paper in the February 1978 issue of CACM. Around 1995 the Internet began to take off, creating a huge need for public-key cryptography.
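The RSA mechanism fits in a few lines of modular arithmetic. This is "textbook" RSA with toy primes, shown only to illustrate how one key pair supports both privacy and signatures; real keys use primes of roughly 1024 bits or more plus randomized padding such as OAEP:

```python
# Toy RSA key generation (requires Python 3.8+ for modular inverse via pow).
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi)

# Privacy: anyone encrypts with (e, n); only the key holder can decrypt.
m = 65                     # message encoded as an integer < n
c = pow(m, e, n)
assert pow(c, d, n) == m

# Signatures are the mirror image: sign with d, verify with public e.
sig = pow(m, d, n)
assert pow(sig, e, n) == m
```

The security rests on the difficulty of factoring n back into p and q; with toy primes like these, factoring is trivial, which is why real moduli are enormous.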

The internet is insecure, yet credit card transactions, electronic banking, and many other applications require security.

Two years later NIST started developing what became the Advanced Encryption Standard (AES). As noted above, AES was done right and replaced DES, which suffered from a marginal key size and an opaque design process.

At some point in the future, quantum computers with thousands of qubits might become available and break the vast majority of public-key systems now in use. Research on “post-quantum cryptography” is under way to deal with these challenges.

Do you think Cold War terms like MAD have lost their relevance to nuclear deterrence? Can you expand on your “risk models” that can defuse the nuclear threat?

There is tremendous misinformation and “illogical logic” surrounding nuclear deterrence. My wife and I cover this extensively in our new book, A New Map for Relationships: Creating True Love at Home & Peace on the Planet. (See Chapter 8, “How Logical Is Nuclear Deterrence?” starting on page 243.) As just one example, governments usually talk as if nuclear deterrence were essentially risk-free. Yet, even if it could be expected to work for 500 years before failing, a time frame that seems highly optimistic to most people, that would be as risky as playing Russian roulette with a newborn child. That’s because 1/6 of 500 years is 83 years, roughly that child’s life expectancy.
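The analogy can be checked with a small calculation, under the illustrative assumption that deterrence failure behaves like a constant-hazard process whose expected time to failure is 500 years:

```python
# If the expected time to failure is 500 years, the annual failure
# probability is roughly 1/500 under a constant-hazard assumption.
annual_failure = 1 / 500
lifetime = 83  # roughly a newborn's life expectancy, per the text

# Probability that deterrence fails at least once during that lifetime.
p_fail = 1 - (1 - annual_failure) ** lifetime
print(f"{p_fail:.3f}")  # ~0.15, close to Russian roulette's 1/6 = 0.167
```

The lifetime risk comes out near one in six, which is the point of the Russian-roulette comparison.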

What are “crypto wars”? Can you demystify this term?

I usually identify the first crypto war as starting in 1975, with Whit’s and my critique of the DES 56-bit key. It intensified the next year when we published “New Directions in Cryptography.” The US National Security Agency (NSA) basically maintained that our work was “born classified.” While they won the DES key size battle, we won the conflict over the right to publish our papers without government interference.

What I call “the second crypto war” occurred in the 1990s over “key escrow” and the Clipper chip (developed by the NSA to secure voice and data messages, it included what is usually termed “a back door,” but what I call “a front door,” since it was known to exist), and related attempts of the US government to gain access to encrypted information when it had a legitimate need to do so. The problem is that giving the government access introduces security weak points, so there’s an unavoidable tradeoff between law-enforcement/national-security access and security against bad actors, including potential rogue elements within those two communities. The 1996 National Research Council “CRISIS report” (Cryptography’s Role In Securing the Information Society) helped defuse the fight by recommending that the government experiment with key escrow for its own uses and, if it could overcome the barriers, present the solution for consideration. It never did so. I served on that NRC committee, along with a former attorney general representing law enforcement’s interests and a former Deputy Director of NSA representing national security interests. The committee’s conclusions were unanimous. That report also recommended a considerable relaxation of export controls on cryptographic equipment and software. Relaxation along the lines of our recommendations occurred soon afterward.

What I think of as “the third crypto war” is largely a repeat of the second. The FBI’s insistence several years ago that Apple help it circumvent security on the iPhone used by the San Bernardino mass shooter was very similar to the second crypto war. It seems to me that some people in the government need to study their history.

If you look at security threats faced by enterprises, can they leverage cryptography for proactive security?

Yes, but poor implementation is a huge problem. As just one example, I frequently get emails from my bank and brokerage houses with links to their websites. I never use those links, since they could be clever phishing attacks. I always use URLs that I have stored on my own computer. The banks are basically teaching people to fall for phishing attacks.

What are your views on privacy? Do you think a utopian scenario is possible?

I don’t believe we will ever have a privacy utopia. There will always be tradeoffs, and not just with law enforcement and national security concerns. A single identifying number, like my Social Security number, makes my life much more convenient than if I had to memorize separate numbers for each website. But it also makes it easier for a crook to break into my accounts.

