The Cold War did not end with the opening of the Berlin Wall, the reunification of Germany and the subsequent collapse of the Soviet Union. By the time of these events, it had already lost much of its earlier intensity. A skein of international agreements, some formal and explicit, others tacit and even denied, averted the dangers of unintended confrontations. More importantly, the populations on both sides of the Iron Curtain were disinclined to think that the risk of nuclear obliteration was worth incurring.
There was conflict between the blocs, conducted by proxy or through covert operations. Still, in 1973 the US and the Soviet Union did not allow their client states, Israel and Egypt, to drag them into war. Where the superpowers openly intervened, each suffered not only military defeat but immense loss of moral standing—the US in Vietnam and the Soviet Union in Afghanistan. The Soviet Union had earlier undermined its ritualized criticisms of American imperialism by invading Czechoslovakia in 1968 to terminate an experiment in democratic socialism. The US paid a similar price in 1973 by using the Chilean armed forces to destroy Chilean democracy.
In the late ’70s and early ’80s the superpowers recklessly stationed new nuclear missiles in Europe. Unrest in both halves of Europe eventually pressed the superpowers toward stabilizing the situation. Amidst this turmoil, the Iron Curtain in fact became more porous. Cultural and political elites on each side developed the familiarity that made crisis management possible. The Helsinki agreements of 1975 provided their signatories with a lesson in unintended consequences. Their provisions on human rights were casually accepted by the Soviet bloc governments as harmless rhetorical conceits. Few in the west thought these significant either (recall the photo of Kissinger dozing off at the ratification ceremony), yet they provided the moral legitimation for the movements that eventually ended one-party rule in Soviet Europe.
None of these developments was inevitable. Academics and journalists, bureaucrats and politicians, are far better at retroactive explanation than inspired prediction. If we move (or stumble) backward, however, we do find a moment of historical breakthrough in which much previously thought impossible was depicted as feasible.
Its major protagonist was John Kennedy, who in the spring of 1963 was aware of a considerable discrepancy between his very favorable position in domestic and world opinion—and his actual accomplishments. Perhaps he underestimated himself. He had, with the help of his brother Robert, avoided a nuclear conflict with the USSR in the Cuban Missile Crisis of 1962. Kennedy was very aware of how close he had come to helplessness. He and Khrushchev reached an agreement at the very last hour, each desperate to wrest control of the situation from his own armed forces. The terms of the agreement were kept secret (an undertaking by Kennedy not to attack Cuba in return for the withdrawal of the Soviet missiles). In the US, the event was interpreted as an American triumph in a battle of wills. “They blinked,” as the inimitably unimaginative Secretary of State Rusk put it. Kennedy sought ways to avoid such situations in the future.
Kennedy had a good sense of what bothered ordinary citizens—and of the distance between their concerns and those of their political leaders. The President was aware of the cultural and psychological devastation wrought by the threat of nuclear war. School drills (as if crawling under desks would help) and theological disputes over use rights to backyard bomb shelters were absurd symptoms of the pervasive nuclearization of the culture. The American (and other) peoples lived in pervasive anxiety. Death meant not only the end of one’s own or a family’s life; it meant the extinction of human continuity. A politician declared that if only one man were left alive on earth, he wished him to be an American. The choice of gender was not accidental. Spiritual primitivism and national narcissism drew upon the least sublime aspects of human nature.