Nuclear power pitfalls

HAYDEN WALLES ponders whether New Zealand should contemplate treading the uncertain path to nuclear energy.

The Press - Perspective

Around the world, nuclear power is being touted as a way for us to persist in our current energy habits without emitting clouds of greenhouse gases. Even in New Zealand its scattered supporters are growing bolder, charging that the threat of climate change demands consideration of nuclear energy as a replacement for fossil fuels.

The inevitable squabble has emerged over the safety of nuclear power. Critics point to the likes of the 1986 Chernobyl accident; proponents point to new designs intended to prevent such catastrophes. How do we know who is right? More importantly, is it even possible to tell?

Setting aside the foreseeable difficulties of nuclear power, like routine radiation exposure and waste disposal, let us focus on the unforeseen difficulties: the capacity of nuclear reactors to surprise us. This capacity means it may be impossible to estimate their safety.

Conceptually, a nuclear reactor is simple enough. Inside the core, nuclear fission of uranium (or another suitable radioactive fuel) generates intense heat, which is drawn off by a primary coolant fluid. In a conventional reactor the fluid is just water, and this water (which is radioactive) is used to boil water in a secondary system (which isn't). The steam then drives a turbine that generates electricity. This isn't much different from the way a coal-fired plant might work.

However, a nuclear reactor can cause a far greater catastrophe if it runs out of control. It may seem that we just have to take extra care with nuclear power, yet this may not be possible. Sociologist Charles Perrow addressed this point in his 1984 book Normal Accidents, in which he concluded that nuclear power systems are of a type particularly prone to running out of control unpredictably.

This is because the components of a reactor are tightly coupled, meaning that the operation of each part depends on the other parts doing their job, with no margin for error or delay. The fuel has to be cooled, that coolant has to be cooled, and so on, or the system rapidly breaks down. Nuclear reactors are also complex, Perrow's term for a system with many apparently unrelated components that can interact unexpectedly.

The 1979 Three Mile Island accident, in which the reactor overheated and the fuel melted (a meltdown), began when the secondary coolant pumps stopped and automatically halted the turbine. The backup pumps failed to work (nobody realised), and an emergency valve designed to release pressure from the now overheating core stuck open (again, nobody knew; the control-room indicator had failed). Primary coolant squirted out and uncovered the fuel.

It wouldn't normally have been a big deal for the pumps to stop, but a chain of complex interactions led from this to a meltdown. Three Mile Island also illustrates Perrow's point that adding safety systems can paradoxically make things worse: they add complexity, introducing possibilities for unexpected interaction. According to Perrow, the only thing we can reliably predict about complex, tightly coupled systems is that things will go badly wrong in unpredictable ways.

Willem Labuschagne, a senior lecturer in computer science at the University of Otago, agrees and adds a further warning about our limitations. As a logician he is well aware that we are often forced to reason defeasibly when faced with a lack of information: to do our best with assumptions that hold most of the time but occasionally lead us to make bad decisions. For example, he says, we expect other cars to stop at a red light, and most of the time we are right. But we don't have all the information (the driver might be drunk, or the car's brakes might have failed), and sometimes we'll be wrong, with disastrous consequences. To use the road, we must accept such occasional mishaps.

The same is true of nuclear energy. A lot of the blame for Three Mile Island landed at the feet of the operators who, in hindsight, did some very stupid things. But Labuschagne points out that the operators had limited and faulty information about what was going on. Nobody could have understood all the things that happened during the meltdown, not even the designers. The operators could only do their best with the information available and their own expectations, developed during more normal operation. But it wasn't normal operation, their expectations were flawed, and things got worse.

The situation may not be hopeless. Perhaps it is possible to build a reactor that can be comprehended by designers and operators, or that fails more slowly and manageably. Some nuclear proponents will argue that new designs, like pebble bed reactors, solve the safety problems of the previous generation. Unfortunately, it isn't the problems you see that get you, but the ones that sneak up from behind. And the path to nuclear energy leads down a dark alley in a bad part of town. Dare we tread it?

Hayden Walles is a science writer from Dunedin.

Out of action: two cooling towers at Three Mile Island, in the United States, stand idle in this 1999 photo after a partial meltdown in 1979.
