Safety Culture and Human Errors
One relatively prevalent notion in discussions of nuclear safety is that of safety culture. The International Nuclear Safety Advisory Group defines the term as “the personal dedication and accountability of all individuals engaged in any activity which has a bearing on the safety of nuclear power plants”. The goal is “to design systems that use human capabilities in appropriate ways, that protect systems from human frailties, and that protect humans from hazards associated with the system”.
At the same time, there is some evidence that operational practices are not easy to change. Operators almost never follow instructions and written procedures exactly, and “the violation of rules appears to be quite rational, given the actual workload and timing constraints under which the operators must do their job”. Many attempts to improve nuclear safety culture “were compensated by people adapting to the change in an unpredicted way”.
According to Areva's Southeast Asia and Oceania director, Selena Ng, Japan's Fukushima nuclear disaster is "a huge wake-up call for a nuclear industry that hasn't always been sufficiently transparent about safety issues". She said, "There was a sort of complacency before Fukushima and I don't think we can afford to have that complacency now".
An assessment conducted by the Commissariat à l’Énergie Atomique (CEA) in France concluded that no amount of technical innovation can eliminate the risk of human-induced errors associated with the operation of nuclear power plants. Two types of mistakes were deemed most serious: errors committed during field operations, such as maintenance and testing, that can cause an accident; and human errors made during small accidents that cascade into complete failure.
According to Mycle Schneider, reactor safety depends above all on a 'culture of security', including the quality of maintenance and training, the competence of the operator and the workforce, and the rigour of regulatory oversight. So a better-designed, newer reactor is not always a safer one, and older reactors are not necessarily more dangerous than newer ones. The 1979 Three Mile Island accident in the United States occurred in a reactor that had started operation only three months earlier, and the Chernobyl disaster occurred after only two years of operation. A serious loss of coolant occurred at the French Civaux-1 reactor in 1998, less than five months after start-up.
However safe a plant is designed to be, it is operated by humans who are prone to errors. Laurent Stricker, a nuclear engineer and chairman of the World Association of Nuclear Operators, says that operators must guard against complacency and avoid overconfidence. Experts say that the "largest single internal factor determining the safety of a plant is the culture of security among regulators, operators and the workforce — and creating such a culture is not easy".