
Where Do We Go from Here?

As stated in the introduction, this chapter has revolved around three questions:

1. What would it take to improve the quality of risk analysis and emergency planning so that this terrible accident and the subsequent loss of public confidence can be avoided in the future?

2. Can a risk analysis paradigm be developed that incorporates the cultural conditioning of people and organizations responsible for nuclear energy?

3. Can a global safety culture be developed while still preserving the societal culture of host nations?

In Appendix C, I describe the Station Blackout scenario as quantified in NUREG-1150 for Unit 2 of the Peach Bottom Nuclear Power Plant, a General Electric boiling water reactor (BWR-4) unit of 1,065 MWe capacity housed in a Mark I containment. This report, published in 1991 [26], was an updated version of the same analysis published in 1975 [27]. This reactor is essentially the same as the reactor systems at Fukushima Daiichi Units 1–4. The dominant internally and externally initiated accident sequences leading to core-melt for Peach Bottom in NUREG-1150 consist of three station-blackout scenarios, and the timing of two of them matches the sequence of events at Fukushima Daiichi (the spent-fuel pools notwithstanding). And yet, despite the robustness of this analysis, the diesel generators at Fukushima Daiichi were not adequately protected from a large tsunami, in spite of the warnings discussed above.

We might conclude that the risk-as-analysis paradigm described in Appendix B works well when the system under consideration has adequate historical or actuarial data on failure rates and empirical data on public health and environmental impacts. Moreover, the system must be fairly well defined, have (assumed) fixed or rigid boundaries, and exhibit second-order or nonlinear effects that are (assumed to be) small. In terms of a nuclear power plant, as long as the plant functions within its design basis, or accidents occur within its design-basis envelope, we might call this “safe”.
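To make concrete the kind of actuarial input this paradigm presumes, the sketch below estimates a component failure rate from historical operating experience using the standard Poisson point estimate and confidence interval. The numbers (3 failures over 50,000 hours) are hypothetical illustrations, not values from NUREG-1150 or any plant record.

    # Sketch (hypothetical numbers): estimating a failure rate from
    # accumulated operating experience, assuming a Poisson failure process.
    from scipy.stats import chi2

    failures = 3              # hypothetical observed failures
    exposure_hours = 5.0e4    # hypothetical accumulated operating hours

    # Maximum-likelihood point estimate of the failure rate (per hour).
    rate_hat = failures / exposure_hours

    # Classical two-sided 90% confidence interval for a Poisson rate.
    alpha = 0.10
    lower = chi2.ppf(alpha / 2, 2 * failures) / (2 * exposure_hours)
    upper = chi2.ppf(1 - alpha / 2, 2 * (failures + 1)) / (2 * exposure_hours)

    print(f"point estimate: {rate_hat:.2e} per hour")
    print(f"90% interval:  [{lower:.2e}, {upper:.2e}] per hour")

Estimates of this kind are only as good as the operating history behind them, which is precisely why the paradigm weakens once events fall outside the recorded experience base.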

Because challenges to public health and safety resulting from beyond-design-basis events violate these assumptions, I believe a new paradigm for risk and ethical decision-making is required. And this brings me to the complex domain. Hence, it is useful to describe here some of the basic differences between the science and technology of the Industrial and Post-Industrial Ages. The key distinction we draw is between systems that are “complicated” and systems that are “complex”.

The paradigm within which Industrial Age technologies are understood is based on an Enlightenment worldview. As noted, this worldview is atomistic (reductionist), deterministic (cause and effect), and objectivistic (universal laws). In other words, the laws governing the behavior of these complicated systems can be:

• Understood by studying the behavior of their component parts,

• Deduced from cause and effect (a search for causal links or chains), and

• Determined independent of the observer, that is, only deduced from “objective” empirical observations.

The context within which our Post-Industrial Age Technologies and their underlying science are understood is based on a nonlinear worldview. This worldview gives rise to complex systems that are characterized by at least one of the following [28]:

Holistic/emergent—the system has properties that are exhibited only by the whole and hence cannot be described in terms of its parts,

Chaotic—small changes in input often lead to large changes in output and/or there may be many possible outputs for a given input, and

Subjective—some aspects of the system may only be described subjectively.

It is often said that for complex systems, “the whole is greater than the sum of its parts”. What this means is that there is an emergent quality (sometimes called an emergent property) that is not exhibited by the parts alone. Examples include electric power transmission grids, the disposal of high-level radioactive waste, and the response of social systems to severe natural phenomena. I believe that the new issues regarding national and international security also fall into this category. In each case, the system is simultaneously a whole and a part of a larger whole, a characteristic of complex systems.
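A minimal sketch of the “chaotic” characteristic listed above may help: the logistic map, a standard textbook example (not a model of any system discussed in this chapter), shows how two inputs differing by one part in a billion produce entirely different outputs after a few dozen steps.

    # Sketch: the logistic map, a textbook illustration of sensitivity to
    # initial conditions; not a model of any system discussed here.
    def logistic_map(x0, r=4.0, steps=50):
        x = x0
        for _ in range(steps):
            x = r * x * (1.0 - x)   # simple, fully deterministic update rule
        return x

    a = logistic_map(0.200000000)
    b = logistic_map(0.200000001)   # input perturbed by one part in a billion
    print(a, b)                     # the two outputs bear no resemblance

The update rule itself is trivially simple; it is the long-run behavior that becomes unpredictable, illustrating why cause-and-effect reasoning about the parts can fail for the whole.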

It should be made crystal clear that the impacts of human activities on both society and the environment (from the development of the steam engine to the development of the jet engine) have always been complex. In the past, however, the only undesirable consequences of an Industrial Age technology, such as a nuclear power plant, that were considered in a PRA were geographically local (public health effects out to one mile or 25 miles) or they were observable in “real” time (a hydrogen explosion). This gave the impression that the current risk paradigm is accurate because locality and observability were two characteristics of the impact. This lens is changing, and yet our practices are still based on the same paradigm. That is, a nuclear power plant accident has “global” impacts (an accident at one plant affects the operation of all plants) and manifests very quickly (e.g. loss of public confidence worldwide). In the case of disposal of radioactive waste, the undesirable consequences are almost imperceptible (e.g. the migration of high-level radioactive waste takes place over geological timescales or millennia). Moreover, these impacts may be temporally persistent and/or irreversible (e.g. the degradation of public welfare due to nuclear proliferation).

Thus, as a result of the complexity inherent in Post-Industrial Age Technology, societal and environmental impacts are no longer necessarily geographically local, perceptible in real time, or reversible. Rather, complexity can produce impacts that are geographically global (a malicious human act), imperceptible in time, manifesting either very quickly (on the Internet) or very slowly (high-level radioactive waste disposal), or irreversible (release of radioactivity due to a core-melt accident). We are like the driver of a modern automobile, cruising along on the Interstate (in a linear world), who is suddenly faced with “black ice”!

The impacts we have described above lead to unprecedented ethical issues, as reflected in the three questions above. Moreover, questions such as “What constitutes an acceptable risk, and why?” take on new meaning in the face of challenges to the ecology of life. There is a growing belief, as noted in Donald Rumsfeld's quote above, that not only is the future unknown, it is unknowable. Furthermore, because these complex ethical issues are arising so much faster than ever before, and because there has been little time to develop normative processes for decision-making, there is even greater ambiguity. The unknown-unknown looms large in the domain of risk-as-feelings.

What we are pointing to, for lack of a better description, is a Cultural Risk Analysis. This would entail making explicit the implicit cultural conditioning of individuals and organizations/institutions, and their relationship to the society in which they abide. Such a Cultural Risk Analysis would illuminate cases where the underlying societal culture runs counter to the demands of safety culture, as with nuclear power. If aspects of the societal culture are left implicit, they will not just underlie the safety culture, they will undermine it. If they are made explicit, it becomes possible for the safety culture to be designed and constructed in a way that accounts for, accommodates, or even overcomes the conflicts between the two cultures.

Such a Cultural Risk Analysis would then require an analysis of cultural conditioning, much the same way we analyze the machine. This would mean understanding how underlying assumptions, values, and beliefs come from culturally defined sources and not “objective facts”.[1] However, there is one major difference: people are “complex” emotional, mental, physical, and spiritual human beings. Humans are not “complicated” machines and so are not amenable to a linear, reductionist approach.

Human beings have emergent properties, namely feelings and thoughts that do not reside in any one part of the body. Humans may respond differently to the same stimulus on any given day. And there are no “closed form” analytical solutions to describe human behavior; it is, for the most part, subjective. Coincident with the development of these new complex technologies, there has been growing empirical evidence that in the realm of human decision-making, the emotional precedes the cognitive [29], and that motivation and intention derive from the unconscious-emotive and subconscious-mental [30]. These findings have found their way into such fields as Behavioral Economics [31] and Risk Perception [32], among others (an extensive literature review can be found in [33]). And a number of consulting companies have developed analytical methods in an attempt to quantify the “Risk Culture” of business organizations. In this case, the focus is on comparing the “self-interest” of individual employees with the corporate interest.

Developing a framework for a Cultural Risk Analysis, i.e. carrying out a cultural analysis, requires a paradigmatic shift in human consciousness similar to the one that took place in the Enlightenment. And this will be extremely difficult because it is a shift requiring both the rational (cognition) and the emotional (feeling). It will require both risk-as-analysis and risk-as-feelings; it will require both moral reasoning and emotional morals. As any good engineer knows (at least those who have taken my class), a redundant and diverse system has an order of magnitude higher reliability if the system is built of “AND” gates rather than “OR” gates.[2]
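The footnoted comparison can be written out as a short sketch, assuming independent sub-systems and the same hypothetical failure probabilities of 0.1 used in footnote [2]; the footnote's A + B for the OR case is the rare-event approximation of the exact value computed below.

    # Sketch of the AND/OR comparison in footnote [2], assuming independent
    # sub-systems A and B with hypothetical failure probabilities of 0.1 each.
    p_a, p_b = 0.1, 0.1

    # Redundant/diverse design: the system fails only if A AND B both fail.
    p_and = p_a * p_b                  # 0.01

    # Series design: the system fails if A OR B fails; A + B is the
    # rare-event approximation of this exact value.
    p_or = p_a + p_b - p_a * p_b       # 0.19, roughly 0.2

    print(f"AND (redundant): {p_and:.3f}")
    print(f"OR  (series):    {p_or:.3f}")   # about an order of magnitude worse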

Perhaps Thomas Kuhn [34] said it best, “…that is why a law that cannot even be demonstrated to one group of scientists may occasionally seem intuitive to others. Equally, it is why, before they can hope to communicate fully, one group or the other must experience the conversion we have been calling a paradigm shift.” And, “Just because it (a paradigm shift) is a transition between incommensurables, the transition between competing paradigms cannot be made a step at a time, forced by logic and neutral experience. Like the gestalt switch, it must occur all at once (though not necessarily in an instant) or not at all.”

  • [1] By “objective facts”, I mean empirical observation and data. Evolution and Global warming are two areas where cultural conditioning and scientific observations and data are in conflict.
  • [2] Consider a system, S, with sub-systems A and B. If A AND B must fail, the failure rate of S is AB. If A OR B must fail, the failure rate of S is approximately A + B. If the failure rate of A is 0.1 and that of B is 0.1, then AB is 0.01 and A + B is 0.2, rendering the AND configuration an order of magnitude more robust!
 