Eric Schlosser and the Illusion of Nuclear Weapons Safety

A new book explores the alarming threat of accidental nuclear detonations.

Frenchman Flat, near Las Vegas, Nevada, on May 25, 1953, a moment after history’s first atomic artillery shell was fired. (AP Photo)

In his gripping new book Command and Control, Eric Schlosser shows that the United States has often relied on what he calls “the illusion of safety” when it comes to nuclear weapons. Throughout the cold war, there was a real danger that nuclear weapons would detonate by accident. Schlosser’s centerpiece is the explosion in a Titan II missile silo near Damascus, Arkansas, in the early hours of September 19, 1980, which hurled the missile’s nine-megaton warhead several hundred feet into the air before it finally came to rest in a ditch. Around this incident he weaves the larger story of nuclear weapons accidents, a danger that has not ended with the cold war.

In a startling number of cases, bombs crashed and burned, and the conventional explosions spread radioactivity in the surrounding areas. Scholars and members of the public vaguely knew about such incidents, but dismissed them or took the fact that they didn’t result in nuclear explosions as evidence of how safe the weapons were. Schlosser’s view is that we were lucky, and his penultimate sentence sums up the current concern: “Every one of [the currently deployed missiles] is an accident waiting to happen, a potential act of mass murder.”

This is bad enough, of course, but it gets worse because missiles and warheads are much less vulnerable than the means of communicating with them: the command and control system that gives the book its title. To be credible, the threat of nuclear retaliation requires the adversary to believe that the state’s weapons will survive in significant numbers along with the command and control system. At first glance, this seems an undemanding requirement since multiple channels can be deployed. But what works well in peacetime can fail under the stress of a nuclear attack, especially one aimed at disrupting it. Phone lines, of course, would be destroyed, various radiation effects would disrupt radio communication, and the blast effects would blow away sending and receiving antennas. In peacetime the problem is different: sensitive radars and other sensors have sometimes produced frightening false alarms, briefly raising the threat of nuclear war.

Schlosser’s point isn’t only that perfect safety is impossible and that some accidents will happen. As he shows in rich and vivid detail, the ways to cope with these dangers create new ones. Putting leaders and weapons deep underground will increase their chances of survival but make command and control more difficult. The chances that enough of the system will survive intact to strike back could be increased by placing it on alert. For much of the cold war, American missiles were ready to fire and bombers were on “strip alert,” armed and ready to take off in fifteen minutes. In the period when fear was great but large numbers of missiles were not yet deployed, bombers were in the air, where they were even less vulnerable. But these measures required moving bombs around and keeping them ready to fire, which heightened the risk of accidents.

Even these measures might not be sufficient, however, leading both sides to consider the policy of “launch on warning,” which, while fairly safe where bombers were concerned, meant something very different for missiles. For them, and even for the less hair-trigger alert stances, the United States needed systems that would give unambiguous warning of an attack. These were developed with enormous technical skill and money, but they introduced their own dangers. Radars could and did interpret reflections from the moon as a missile attack. Tapes designed to train crews on how to react to sightings of missiles were inadvertently fed onto the main system. And computer chips failed, telling the operators that an attack was on the way. All these dangers implied their opposites—that a real attack might be mistaken for a malfunction. So it is not surprising that American military leaders were led to contemplate and at times urge that the United States be ready to strike before the Soviets could.

Once the Soviets developed secure retaliatory forces, the impulse to strike pre-emptively subsided. But this development generated new problems. Since an all-out war would destroy the United States, how could it credibly threaten to respond to a Soviet conventional attack in Europe? As early as the Eisenhower era, analysts and administration officials were preoccupied with this question. The dominant answer was to develop limited nuclear options and to plan on a controlled nuclear war. While this made a sort of twisted sense, it vastly raised the requirements for the invulnerability of weapons and, even more, command and control systems. Starting with Kennedy, every president called for plans to fight controlled wars, and by the end of his term believed that they were in place. They weren’t. Out of a desire to preserve its autonomy, an impulse to strike at all Soviet forces as quickly as possible, and an understanding that the command and control systems would not permit a controlled nuclear war, the military was never ready to do what the White House called for. As each new administration came in, officials were shocked to find how inflexible the war plans were, and when they left they congratulated themselves—incorrectly—for having rectified the situation. Of course, it would have taken two to fight a limited war. The Soviets did not accept the American rules of the game, nor did they possess the hardware and software necessary to conduct a limited war.

At each level, from the design of weapons, to the command and control system, to the nuclear strategies they were designed to serve, the parts interlocked so that solving one problem produced others. Most scholars have concentrated on nuclear strategy, with a smaller number paying attention to the command and control systems, with an even smaller number noting the danger of nuclear accidents. A great virtue of Schlosser’s book is that he reverses this emphasis, while keeping an eye on the interrelations among the three. He has read almost all the relevant secondary studies and done extensive research in declassified documents, many of which have hardly been used before. The story of the Damascus accident is riveting, entirely original and very much worth telling.

As Schlosser shows, the basic problem is that these weapons systems are not only complicated but complex—their parts, including the humans who manage them, interact in ways that are hard to understand and control, especially in unusual circumstances. The weapons look sturdy, but in fact they degrade and require frequent maintenance. It was just such normal housekeeping that set off the Damascus accident. The socket on a technician’s wrench slipped off during a routine task, ricocheted off several pieces of equipment and the silo wall, and in an unpredictable carom knocked a hole in the missile’s fuel tank. This had never happened before, and there were no plans for how it could be managed. Very intelligent and well-trained personnel made well-meaning decisions, but many of them were probably misguided. Even if they had made other choices, the missile might have been doomed. The Air Force had mandated the use of a different kind of wrench, but it was not immediately available and the team was running behind schedule. This pattern should not surprise us: a normal response to cumbersome regulations is to develop work-arounds and shortcuts. It is hard to imagine any organization without them, and so it would be foolish to say that we can solve these problems by making sure that everyone works by the book.

Even if we could do this, Schlosser shows that the problem is deeper than system safety. In fact, it is easy to design weapons so that they can never fire by accident or through unauthorized use. It is equally easy to design them so that they will always fire when the order is given. But it is extraordinarily difficult to meet both criteria at once—what is known as the “always/never” problem. Any safety feature will increase, at least slightly, the danger that the weapon will not go off when it should. A special safety feature was needed for the warhead of the missiles carried by the Polaris submarines because the standard measure had a major defect. But it was later discovered that this device rendered the warhead too safe—it could not fire at all. More frequently, however, the military privileged “always” over “never,” and also did not want to “waste” money on unnecessary safety devices. The result was that, despite a great deal of ingenuity, the weapons probably were both less likely to fire in war and less safe in peacetime than they seemed.
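The tradeoff can be made concrete with a toy reliability sketch (my illustration, not the book’s; every probability below is invented). Each independent safety interlock sharply reduces the chance that an abnormal signal detonates the weapon, but each also carries a small chance of failing “closed” and blocking an authorized launch.

```python
# Toy model of the "always/never" tradeoff. All probabilities are
# invented for illustration; none come from Schlosser's book.
# Each independent interlock can block an accidental detonation,
# but it can also fail "closed" and block an authorized launch.

def always_never(n_interlocks, p_abnormal=1e-3,
                 p_block=0.99, p_fail_closed=1e-4):
    """Return (P[accidental detonation], P[fires when ordered])."""
    # An accident slips through only if every interlock misses it.
    p_accident = p_abnormal * (1 - p_block) ** n_interlocks
    # An ordered launch succeeds only if no interlock fails closed.
    p_fires = (1 - p_fail_closed) ** n_interlocks
    return p_accident, p_fires

for n in (0, 1, 3, 5):
    acc, fire = always_never(n)
    print(f"{n} interlocks: P(accident)={acc:.2e}, P(fires on order)={fire:.6f}")
```

Under these made-up numbers, five interlocks cut the accident probability by ten orders of magnitude while shaving only about 0.05 percent off launch reliability. Real designers could not estimate their probabilities anywhere near this cleanly, which is exactly the point Iklé makes below.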

But how safe were they? There was simply no way to know. As Fred Iklé, a RAND analyst who went on to important positions in the government, said, “We cannot derive much confidence from the fact that no unauthorized detonation has occurred to date [because] the past safety record means nothing for the future.” Circumstances changed too much, too many new designs were introduced and too many new situations were possible to calculate, let alone eliminate, the risks. Weapons that were perfectly safe under normal circumstances might not be when bombers crashed or caught fire, bombs were accidentally dropped or planes ran off runways into the “igloos” that stored the warheads. In such “abnormal environments,” weapons were subject to physical stresses and deformations that were not entirely predictable. Wires pushed out of place could lead to short circuits that would circumvent safety features; hidden weaknesses would become manifest only in the most unusual circumstances.

Schlosser details the long struggle of the safety engineers to get the attention of high-level officials and to install devices that made the weapons much safer, at least against the problems they had seen or imagined. But these efforts yielded fruit only at the end of the cold war. Not only did the long campaign finally convince skeptics, but the disintegration of the Soviet Union meant that many weapons could be taken out of service and dangerous alerts could be curtailed. In an irony that typifies our nuclear dilemmas, however, the fact that nuclear war no longer seemed a pressing menace meant that the status and vigilance of those in charge of the weapons greatly decreased. Indeed, some warheads were actually misplaced.

If the danger of nuclear war and even nuclear accidents has gone down for the United States and Russia, it may have increased for India and Pakistan. Not only are these countries more bitter rivals than the superpowers were, but their stockpiles are increasing, their “always/never” dilemmas are sharper and it is likely that they have given even less priority to avoiding accidents. Schlosser’s unrelenting attention to nuclear dangers might lead the reader to expect him to endorse the popular proposals to abolish nuclear weapons. Instead, he understands (although only briefly discusses) the fact that a world without them would not be a world without the knowledge of how to build them, with the consequent instabilities caused by each state’s fear that its rival was about to restart deployments. So he settles for a more modest proposal to reduce the reliance on nuclear weapons, take the more dangerous ones out of circulation and place even greater priority on safety. An anticlimax, perhaps, but a position that I believe is the most sensible one.

In the end, however, the basic conundrum remains. On the one hand, the multiple assurances by authorities that we had nothing to worry about were, as in so many other areas, unfounded and deceptive. Numerous accidents went unreported to the public and often to officials who should have known about them. Hydrogen bombs were dropped on or crashed into Spain, Greenland and numerous locations in the United States. They were subject to fires and explosions, in some cases leading to the dispersion of uranium and plutonium.

On the other hand, none of these accidents produced a nuclear explosion. But does this show the strength of the safety mechanisms? Schlosser draws the opposite conclusion: that we were lucky. If things had been a bit different in many of these cases—had wires crossed one way rather than another, or had decay in a safety switch occurred in a bomber that crashed—bombs would have exploded. We can ask how close we came, which means thinking about what would have had to be different in order to produce this dreaded result. But we cannot be confident about where this way of thinking leads us. And that means we may have been a lot closer to disaster than most of us believed at the time.
