Los Angeles Times

Hiroshima’s lasting shadow on the world

- NICHOLAS GOLDBERG @Nick_Goldberg

The bombs dropped by the United States on the cities of Hiroshima and Nagasaki 75 years ago, in the final days of World War II, incinerated some 200,000 people, most of them civilians.

And they did much more than that. They also transformed the nature of war, raising the specter of Armageddon and ushering in the bizarre and terrifying nuclear age that defined the Cold War for nearly five decades.

After two back-to-back global wars, the world was used to death and destruction, but the devastation unleashed by the atom bomb in those early days of August 1945 was categorically different.

“A rain of ruin from the air, the like of which has never been seen on this earth” was how President Truman described the new U.S. war capability just hours after the Hiroshima bombing on Aug. 6, 1945. The co-pilot of the Enola Gay, which dropped the bomb that day from 31,500 feet, wrote in his personal log: “My God, what have we done?”

It took the Soviet Union only four years to develop an atomic bomb of its own, launching an unprecedented, ever-escalating arms race that was at the heart of the new Cold War. Both countries quickly graduated to hydrogen bombs, ultimately developing nuclear arsenals of tens of thousands of weapons, many of them with 1,000 times the power — or more — of the bomb dropped on Hiroshima.

But here was the strange paradox of nuclear weapons: Even as we built them, the overriding goal was to ensure they were never used.

After all, these were weapons of unprecedented power that could destroy cities and even countries, killing tens of thousands in a moment. We’d seen it happen in Hiroshima and Nagasaki, and no one wanted to see it again.

“Thus far the chief purpose of our military establishment has been to win wars,” wrote Bernard Brodie, an early nuclear strategist, in 1946. “From now on its chief purpose must be to avert them.”

But rather than stop building the weapons or destroy the ones they had, the superpowers decided, bizarrely, that the safest approach was to build ever-bigger nuclear arsenals.

That was counterintuitive, to say the least. But the official policy through the Cold War was one of “deterrence,” based on the idea that nuclear conflicts could best be averted if both sides were convinced that the consequences of attacking would be too horrendous to bear. You had to make it clear to your adversary that you could withstand his nuclear “first strike” — and that you could respond with a retaliatory second strike so devastating that it would be irrational for him to attack you in the first place.

Deterrence made a certain amount of sense on paper — and yet it was utterly absurd and enormously risky. Not only did it require an ever-growing stockpile of costly and hyper-destructive weapons that were not to be used, but it relied heavily on the rationality and restraint of world leaders. It presumed no misjudgments or misunderstandings.

No wonder the theory was known as Mutually Assured Destruction — or MAD.

Despite the deterrence talk, the U.S. never adopted a no-first-use policy, and there were plenty of generals and policymakers who believed a nuclear war could be fought and won. Over the years, they made secret plans for preemptive attacks. Tactical nuclear weapons were developed for use in limited wars.

As the Cold War dragged on, the nuclear “balance of terror” became part of the culture, as Americans (and Russians) adjusted to the knowledge that annihilation was an ever-present possibility. The Pentagon urged homeowners to build fallout shelters; schoolchildren were taught to “duck and cover” beneath their desks. In the late 1950s, more than 60% of American children said they’d had nightmares about nuclear war.

Movies like “Fail Safe” and “Dr. Strangelove” described the awful things that could go wrong; Bob Dylan wrote “Talkin’ World War III Blues.” The nuclear launch codes were carried in a briefcase by a military aide at the side of the U.S. president.

If Americans were frightened, they were only being reasonable. During the 1962 Cuban missile crisis, even President Kennedy believed the chance of a nuclear war with Russia was “between one-in-three and even.”

And yet today, 75 years after Hiroshima and Nagasaki, we are still here. Nuclear weapons have never again been used.

Beginning in the 1970s, treaties were negotiated by the superpowers to limit weapons growth. In the 1980s, Ronald Reagan and Mikhail Gorbachev negotiated the first pact requiring that existing nuclear weapons be destroyed. In the 1990s, the Soviet Union collapsed.

Today, the total number of warheads in the U.S. nuclear stockpile is approximately 3,800 — down from a peak of 31,255 in 1967, according to the Federation of American Scientists. Russia now has about 4,310 warheads in its stockpile.

But a multipolar world brings its own dangers. Nine countries now have nuclear weapons, including India and Pakistan, which are perpetually in conflict. Whether we can trust Kim Jong Un to behave rationally in a nuclear crisis is unclear. China, increasingly at odds with the U.S., is also, of course, a nuclear power.

The possibility of a nuclear accident remains, as does the danger of a terrorist group obtaining a weapon. The landscape of war itself is evolving, with increasing focus on cyberwar and artificial intelligence, each of which will have consequences for nuclear strategy.

Today there are other things to worry about, including the existential threat of climate change. But don’t kid yourself: The dangerous legacy of Hiroshima and Nagasaki lingers on 75 years after the dawn of the nuclear age.

