For about a decade now, nuclear boosters have been telling us that a “nuclear renaissance” is underway thanks to the advent of cheaper, safer and faster-built “third-” and “fourth-generation” reactors. Their ranks have been swelled lately by green champions of nuclear power like George Monbiot, who has recently embraced nuclear energy as an alternative to fossil fuels in the quest to mitigate climate change. Anti-nuke activists like Helen Caldicott have responded with dire warnings of nuclear apocalypse and radiation-induced cancer (see their exchange on a recent episode of Democracy Now!).
But for all its moral urgency, this debate usually ignores the economics of nuclear power. It is economic factors like costs, supply chains, financing and profitability that will determine our future energy mix. And so far, the dollars and cents calculations for nuclear power just do not add up.
The argument for nukes gets even weaker when one considers the compressed time frame of climate change: carbon emissions must drop sooner and faster than the long, slow, costly process of building new nuclear plants would allow. The boosters of nuclear power, including greens like Monbiot, seem to forget that reactors don’t build themselves. They are built and operated by specific institutions under concrete economic circumstances, like the price of capital, special metals and insurance, and the availability of skilled labor. Once the economic arguments get to that level of specificity, the viability of atomic power falls apart.
Moreover, casting a nuclear renaissance as the panacea for climate change is dangerous because it threatens to delay the shift to clean energy. Continually pushing nukes has opportunity costs; every dollar, euro or RMB spent on nuclear power is one not spent on clean technology like wind, solar, hydro or tidal kinetics.
First, a bit of history: The initial wave of nuclear power reached its zenith after the Arab oil embargo of 1973. That political and economic shock sent many developed economies on a reactor construction spree. The logic here was fundamentally geostrategic, not economic: better to have power from nukes that operated at a loss and were subsidized by the rest of the economy than to have your whole economy collapse because you could not import oil. In particular, Japan and France went nuclear; France converted the majority of its electrical supply from fossil fuels to nuclear.
But these second-generation reactors, which make up the majority of the world’s current fleet of 443 reactors, soon proved to be prohibitively expensive and slow to build. After the Three Mile Island accident, hundreds of planned plants in the United States were canceled and construction around the world slowed. Bankruptcies associated with nuclear power rose, and investors began to turn away from it.
Even in France and Japan, the building of new reactors mostly halted. France became the most nuclear-powered country in the world in part because its system is fundamentally socialized; the various companies associated with the construction and operation of nuclear plants never had to turn a profit, and they managed to offload most of their debts onto the public. Japan’s reactors are also heavily subsidized.
In the US and the UK, cost overruns on nuclear plants helped bankrupt several utility companies. In the US, these losses helped usher in the debacle of energy deregulation in the mid-’90s that saw rising rates and power blackouts in California. When the UK began privatizing utilities, its nuclear reactors were so unprofitable they could not be sold. Eventually, in 1996, the government gave them away. But the company that took them over, British Energy, had to be bailed out in 2004 to the tune of 3.4 billion pounds.