Nuclear power pitfalls
HAYDEN WALLES ponders whether New Zealand should tread the uncertain path to nuclear energy.
Around the world nuclear power is being touted as a way for us to persist in our current energy habits without emitting clouds of greenhouse gases. Even in New Zealand its scattered supporters are growing bolder, arguing that the threat of climate change demands consideration of nuclear energy as a replacement for fossil fuels.
The inevitable squabble has emerged over the safety of nuclear power. Critics point to the likes of the 1986 Chernobyl accident, while proponents point to new designs intended to prevent such catastrophes. How do we know who is right? More importantly, is it even possible to tell?
Setting aside the foreseeable difficulties of nuclear power, like routine radiation exposure and waste disposal, let us focus on the unforeseen difficulties: the capacity of nuclear reactors to surprise us. This capacity means it may be impossible to estimate their safety.
Conceptually, a nuclear reactor is simple enough. Inside the core, nuclear fission of uranium (or another suitable fissile fuel) generates intense heat, which is drawn off by a primary coolant fluid. In a conventional reactor the fluid is just water, and this water (which is radioactive) is used to boil water in a secondary system (which isn't). The steam then drives a turbine that generates electricity. This isn't much different from the way a coal-fired plant works.
However, a nuclear reactor can cause a far greater catastrophe if it runs out of control. It may seem that we just have to take extra care with nuclear power; yet this may not be possible. Sociologist Charles Perrow addressed this point in his 1984 book Normal Accidents, in which he concluded that nuclear power systems are of a type particularly prone to run out of control unpredictably.
This is because the components of a reactor are tightly coupled, meaning that the operation of each part depends on the other parts doing their job, with no margin for error or delay. The fuel has to be cooled, that coolant has to be cooled and so on, or the system rapidly breaks down. Also, nuclear reactors are complex, Perrow’s term for a system with many apparently unrelated components that can unexpectedly interact.
The 1979 Three Mile Island accident, in which the reactor overheated and the fuel melted (a meltdown), began when the secondary coolant pumps stopped and automatically halted the turbine. The backup pumps failed to work (though nobody realised it) and an emergency valve designed to release pressure from the now-overheating core stuck open (again nobody knew, the control room indicator having failed). Primary coolant squirted out and uncovered the fuel.
It wouldn’t normally have been a big deal for the pumps to stop, but a chain of complex interactions led from this to a meltdown. Three Mile Island also illustrates Perrow’s point that adding safety systems can paradoxically make things worse. They add complexity, introducing possibilities for unexpected interaction. According to Perrow the only thing we can reliably predict about complex, tightly coupled systems is that things will go badly wrong in unpredictable ways.
Willem Labuschagne, a senior lecturer in computer science at the University of Otago, agrees and adds a further warning about our limitations. As a logician he is well aware that when faced with a lack of information we are often forced to reason defeasibly: to do our best with assumptions that hold most of the time but occasionally lead us to make bad decisions. For example, he says, we expect other cars to stop at a red light, and most of the time we are right. But we don't have all the information (the driver might be drunk, or the car's brakes might have failed) and sometimes we'll be wrong, with disastrous consequences. To use the road, we must accept such occasional mishaps.
The same is true of nuclear energy. A lot of the blame for Three Mile Island landed at the feet of the operators who, in hindsight, did some very stupid things. But Labuschagne points out that the operators had limited and faulty information about what was going on. Nobody could have understood everything that happened during the meltdown, not even the designers. The operators could only do their best with the information available and their own expectations, developed during more normal operation. But it wasn't normal operation, their expectations were flawed, and things got worse.
The situation may not be hopeless. Perhaps it is possible to build a reactor that can be comprehended by designers and operators, or that fails more slowly and manageably. Some nuclear proponents will argue that new designs, like pebble bed reactors, solve the safety problems of the previous generation. Unfortunately, it isn’t the problems you see that get you, but the ones that sneak up from behind. And the path to nuclear energy leads down a dark alley in a bad part of town. Dare we tread it?
Hayden Walles is a science writer from Dunedin.
Out of action: two cooling towers at Three Mile Island, in the United States, stand idle in this 1999 photo after a partial meltdown in 1979.