Safety Culture and Human Errors
A prevalent notion in discussions of nuclear safety is that of safety culture. The International Nuclear Safety Advisory Group defines the term as “the personal dedication and accountability of all individuals engaged in any activity which has a bearing on the safety of nuclear power plants”. The goal is “to design systems that use human capabilities in appropriate ways, that protect systems from human frailties, and that protect humans from hazards associated with the system”.
At the same time, there is some evidence that operational practices are not easy to change. Operators almost never follow instructions and written procedures exactly, and “the violation of rules appears to be quite rational, given the actual workload and timing constraints under which the operators must do their job”. Many attempts to improve nuclear safety culture “were compensated by people adapting to the change in an unpredicted way”.
According to Areva's Southeast Asia and Oceania director, Selena Ng, Japan's Fukushima nuclear disaster is “a huge wake-up call for a nuclear industry that hasn't always been sufficiently transparent about safety issues”. She said, “There was a sort of complacency before Fukushima and I don't think we can afford to have that complacency now”.
An assessment conducted by the Commissariat à l’Énergie Atomique (CEA) in France concluded that no amount of technical innovation can eliminate the risk of human-induced errors associated with the operation of nuclear power plants. Two types of mistakes were deemed most serious: errors committed during field operations, such as maintenance and testing, that can cause an accident; and human errors made during small accidents that cascade to complete failure.
According to Mycle Schneider, reactor safety depends above all on a “culture of security”, including the quality of maintenance and training, the competence of the operator and the workforce, and the rigour of regulatory oversight. A better-designed, newer reactor is therefore not always a safer one, and older reactors are not necessarily more dangerous than newer ones. The 1979 Three Mile Island accident in the United States occurred in a reactor that had started operation only three months earlier, and the Chernobyl disaster occurred after only two years of operation. A serious loss of coolant occurred at the French Civaux-1 reactor in 1998, less than five months after start-up.
However safe a plant is designed to be, it is operated by humans, who are prone to error. Laurent Stricker, a nuclear engineer and chairman of the World Association of Nuclear Operators, says that operators must guard against complacency and avoid overconfidence. Experts say that the “largest single internal factor determining the safety of a plant is the culture of security among regulators, operators and the workforce — and creating such a culture is not easy”.