Tuesday, June 24, 2008

A Primer on Nuclear Safety: 1.3.3 Heat, Water, Steam, Mistakes


Homer Simpson is wedded in the public mind to the issue of nuclear safety. We know that Homer is a reactor operator, and we know that Homer is going to make mistakes. Much of the Simpsons' humor comes from a wedding of the horrifying and the funny, and the jokes that link Homer's professional lapses with nuclear safety are perfect examples of that link. We know that Homer is going to make mistakes, and that his job means his mistakes can cause things to go horribly wrong. Somewhere in the back of their minds, most Simpsons viewers are probably aware that in the real world Homer would not get a job as a nuclear plant operator. Without that awareness, watching Homer operate the reactor would simply be horrifying, not funny. Nevertheless, in a back corner of their minds, people wonder if there might be truth in the Homer Simpson image. The Canadian cartoon showing Homer advising Canadian Prime Minister Stephen Harper on nuclear safety plays off the Homer Simpson nuclear worker joke. In a way this cartoon is the perfect expression of the manipulative nuclear safety disinformation campaigns of anti-nuclear groups like Greenpeace. Nuclear safety occupied the minds of some of the 20th century's finest scientists, and they fought for it with great integrity. Alvin Weinberg, who was fired for his stand on nuclear safety, was surely the antithesis of Homer Simpson.

Homer Simpson was perhaps a little more representative of Soviet nuclear workers. We have seen from the Soviet naval experience that reactor accidents can happen because of human error. How is it possible to manufacture a reactor with such fundamental flaws that cooling water fails to reach part of the core during the shakedown cruise? The answer is quite simple. Give the reactor manufacturer an order that the reactor must be completed by a given date no matter what. Contingencies then enter into the manufacturing process. A flu epidemic strikes the factory's staff, and a third of the workers are home sick for two weeks. The remaining staff does the best it can, but it has a deadline to keep, and the reactor goes out the door incomplete. A parts supplier fails to deliver a vital part that must be installed by a certain date if the reactor is to be completed on time. The part does not arrive, and the order is given to complete the reactor without it. A factory worker and his supervisor stay out late after work drinking. They both come to work the next morning very hung over. The worker fails to complete his assigned task for the day because of his hangover. The supervisor is too hung over to check his subordinate's work, and marks the tasks as complete.

There are ways for factory management to prevent such defective products from being manufactured. Just ask the Japanese. But not even the Japanese finish every project on time. The Japanese know that management and the factory workers must be well trained, and that both must be motivated to do their jobs properly. The Japanese believe that there must also be a top-to-bottom commitment to quality control. For the Soviets, the words "quality control" were the punch line of a joke about the follies of the capitalist system.

Of course, in order to build safe reactors, reactors must be designed to be safe. There is something basically wrong with the design of a reactor that goes critical if the pressure vessel lid is opened incorrectly. Even if there is a fundamental defect in the reactor design, once that defect is demonstrated through a criticality incident and a partial core meltdown, it is inexcusable for the design flaw not to be corrected. Yet we see the same design flaw leading to Soviet naval reactor accidents over and over. In contrast, the United States Navy has never had, in its entire history of naval reactor operations, a single incident of accidental criticality followed by a partial core meltdown.

The designers of the U.S. naval reactor system, people like Hyman Rickover and Milton Shaw, might have been a little crazy, but they were obsessed with safety, and they designed a safety system that worked in the Navy context. Unfortunately, they believed that they had solved the nuclear safety problem, and that belief was mistaken. United States reactor scientists understood the Navy's mistake, and there emerged during the 1960s a conflict between the Navy's view and the views of AEC scientists.

During the 1960s, nuclear safety researchers determined that major nuclear accidents could occur due to unlikely sets of occurrences, none of which by itself could trigger a major accident, but which in an extremely unlikely combination could lead to disaster. Unlikely is not impossible, and it is even possible to calculate the odds of those unlikely events happening and a major reactor accident taking place. This led to a new concept of nuclear safety, one that involved probabilistic risk assessment. Think of this in terms of a poker bet on drawing an inside straight.
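The combination-of-events idea can be sketched numerically. In the simplest probabilistic risk assessment model, if several independent failures must all coincide to produce an accident, the chance of the accident is the product of the individual probabilities. The failure probabilities below are purely hypothetical numbers chosen for illustration, not figures from any actual safety study.

```python
def joint_probability(probabilities):
    """Probability that several independent events all occur together."""
    result = 1.0
    for p in probabilities:
        result *= p
    return result

# Hypothetical, illustrative per-demand failure probabilities:
valve_sticks_open = 1e-3      # a relief valve fails to reseat
indicator_misleads = 1e-2     # a control-room indicator shows the wrong state
operator_misdiagnoses = 1e-1  # the crew misreads the situation

accident = joint_probability(
    [valve_sticks_open, indicator_misleads, operator_misdiagnoses]
)
print(f"Chance all three coincide: {accident:.0e}")  # prints 1e-06

# The poker analogy: odds of filling an inside straight on one draw,
# with 4 helpful cards among 47 unseen.
inside_straight = 4 / 47
print(f"Odds of drawing the inside straight: {inside_straight:.3f}")
```

The point of the analogy is that each individual failure is rare, but none is impossible; multiply enough reactor-years by a small per-demand probability and the "inside straight" eventually gets drawn.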

The Navy, in effect, said: we will never bet on drawing an inside straight. The scientists said: sooner or later an inside straight will happen. The Navy, in effect, wanted to place its safety bet on the proposition that its safety system was infallible. The scientists thought the Navy was mistaken. The result of this disagreement was something like a war between the Navy and the AEC scientific community. The Navy won all the battles, but lost the war over nuclear safety twice. The first time was when Dixy Lee Ray removed Milton Shaw from responsibility for nuclear safety. The second naval defeat over the safety issue took place at Three Mile Island.

The Three Mile Island accident was to prove the probabilistic theory of nuclear safety correct. The nuclear research community began to take a view of nuclear safety that focused on probabilities rather than certainties, and began to feel that safety approaches based on simple A-causes-B models of potential accidents failed to capture the whole picture. The Navy thought in terms of the A-causes-B theory, a theory that had worked for it. The Navy, in addition, thought that the safety concerns of the civilian scientists were damaging the future of a civilian nuclear industry. There is in fact little doubt that the AEC nuclear safety controversy of the 1960s and early 1970s, little understood by the public and largely ignored by historians, was to have profoundly adverse consequences for the future public reputation of nuclear safety.

Paradoxically, one of the effects of Milton Shaw's assault on the nuclear safety community was the shutting down of research by nuclear safety researchers like George Parker. What was lost when Parker's research was terminated was the knowledge that even serious reactor accidents might not lead to major human disasters.
