
Strategic Learning from Disaster

Dietrich Dorner - The Logic of Failure

To study decision-making in complex environments, Dietrich Dorner and his associates created a computer simulation in which participants took on the role of a small-town "mayor." The fictional town controlled a municipally owned watch factory and interacted with a number of constituents, including a bank, retail stores, medical practices, and restaurants. The "mayors" were allowed to control spending and make important strategic decisions over a simulated ten-year period.

Many of the 48 mayors ran their small towns into economic ruin. In these cases, unemployment ran rampant, home ownership dropped, businesses failed, and the citizenry was, as a result, quite unhappy with the leadership displayed by their mayor.

Other mayors, though, presided over towns that showed remarkable economic growth, with expanding housing, financially successful businesses, and a delighted citizenry.

As they examined the differences between the "good" and the "bad" mayors, the Dorner group found that success was quite predictable, varying according to the decision behaviors of the mayors. They found:

The good mayors made more decisions than the bad ones did over the ten-year period. Moreover, good mayors tended to make more and more decisions over time, while the bad mayors made a number of decisions early on and then tended to stick with the plans they had made early in the process. As the game rolled out, the good mayors continued to find more possibilities for influencing the fate of their towns than the bad mayors did.

As the authors note, "a town is a complex system of interlocking economic, ecological, and political components." It is impossible to make a decision about one aspect of municipal or economic life without affecting some other part of the system. The good mayors were able to see the town as a complex system and were better at recognizing the cause-and-effect relationships among variables. Moreover, they anticipated possible unintended consequences of their decisions. When decisions did not produce the anticipated effect, the good mayors were able to "tweak" their initial decisions and nudge the town back onto the hoped-for economic track, rather than "staying the course" toward economic ruin.

Two essential elements emerged as researchers studied the difference between the good and bad mayors. First, the good mayors seemed to understand that they were dealing with a complex system and sought to balance the complex relationships among a variety of variables. They sought to understand the dynamic underpinnings of the town's economy and the dynamic relationships among its variables.

Second, the good mayors tended to ask more "why" questions. The good mayors tested their hypotheses more often than did the bad mayors. Bad mayors tended to take events at face value, and were less introspective about the ways their own decisions and behaviors had affected the fate of their town. For the bad mayors, "to propose a hypothesis was to understand reality; testing the hypothesis was unnecessary. Instead of generating hypotheses, they generated 'truths.'"

* * * * * * *

The Chernobyl disaster provides an excellent example of decision-making about complex systems gone awry. Most remember that a nuclear reactor exploded at the Chernobyl facility in Ukraine, then part of the USSR, on April 26, 1986. Less widely recalled is the fact that the disaster was caused entirely by the human error of highly experienced and well-educated technicians.

A nuclear reactor is designed to generate intense heat in order to create steam that is then converted to electrical power. The nuclear reactivity must be contained within certain limits so that the reactor does not overheat. The balance between the degree of reactivity necessary to create power and the degree of reactivity that can send a reactor out of control is regulated by what are called control rods. To reduce reactivity, rods are inserted; to increase reactivity, rods are withdrawn. At Chernobyl, there were 211 rods available for use in Reactor 4. A widely understood rule of thumb held that there should never be fewer than 15 rods inserted into the core.
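As a rough sketch only, the rod rule can be pictured with a toy Python model. The counts of 211 rods and a 15-rod minimum come from the text; the linear "reactivity index" and the function names are invented for illustration and are not actual RBMK reactor physics.

```python
# A deliberately simplified toy model of control-rod behavior.
# The numbers 211 and 15 come from the account above; the formula
# below is an illustrative assumption, not real reactor physics.

TOTAL_RODS = 211      # rods available in Reactor 4
MIN_INSERTED = 15     # rule of thumb: never fewer than this inserted

def reactivity_index(rods_inserted: int) -> float:
    """Crude proxy: the fewer rods inserted, the higher the reactivity.
    Scaled so that all rods inserted -> 0.0 and none inserted -> 1.0."""
    if not 0 <= rods_inserted <= TOTAL_RODS:
        raise ValueError("rods_inserted out of range")
    return 1.0 - rods_inserted / TOTAL_RODS

def within_safety_margin(rods_inserted: int) -> bool:
    """Check the 15-rod minimum described in the text."""
    return rods_inserted >= MIN_INSERTED

if __name__ == "__main__":
    for rods in (211, 30, 15, 8, 6):
        print(f"{rods:>3} rods inserted: "
              f"reactivity ~{reactivity_index(rods):.2f}, "
              f"within margin: {within_safety_margin(rods)}")
```

Running the loop shows the margin check failing once fewer than 15 rods remain inserted, the territory the engineers entered on the night of the accident.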

On that day in April 1986, a series of experiments was planned for the purpose of identifying improvements to existing safety systems. To conduct the experiments, engineers wanted to reduce the reactor's output to 25% of capacity. As it happened, though, the reactor was cut down to 1% of capacity because of a miscalculation by an engineer. Running at such low capacity was known to create instability in the reactor, and the engineers were anxious to bring the process back up to the safe 25% level.

The group was anxious to get started with their study. To allow the reactor to "heat up," the engineers began a series of over-corrections and removed all but 6 to 8 of the rods, ignoring well-established safety rules and putting the nuclear process into a state well below safety standards.

With the benefit of hindsight, we can see that the engineers did something we have all experienced in one way or another: they over-steered their vehicle. We all know not to over-correct when a wheel slips while driving over ice, because the over-steering can send a car on an even more dangerous course than the one that prompted the adjustment. Likewise, a ship's captain must not over-steer when adjusting course, because once set in motion, the forces of physics cannot easily be undone.
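The same over-correction dynamic can be sketched abstractly. The toy Python loop below (all gains, targets, and numbers are illustrative assumptions, not reactor data) shows how corrections that only take effect after a delay settle when they are cautious but swing ever more wildly when they are aggressive.

```python
# Toy feedback loop illustrating over-correction, not reactor physics.
# The operator reacts to the current reading, but each correction only
# takes effect on the next step, as in steering a car or a ship.

def simulate(gain, target=25.0, steps=12):
    output = 1.0        # start far below the target (cf. the 1% mishap)
    correction = 0.0    # correction currently "in the pipeline"
    history = []
    for _ in range(steps):
        error = target - output          # what the operator sees now
        next_correction = gain * error   # how hard they decide to push
        output += correction             # the *previous* push lands now
        correction = next_correction
        history.append(round(output, 1))
    return history

if __name__ == "__main__":
    print("cautious corrections (gain 0.5):  ", simulate(0.5))
    print("aggressive corrections (gain 1.5):", simulate(1.5))
```

With the cautious gain the output overshoots the target slightly and then settles; with the aggressive gain, each correction arrives too late and too strong, and the oscillations grow instead of dying out.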

Getting the reactor heated back up seemed to take an inordinately long time and, perhaps out of simple impatience, the engineers decided to perform their experiments even though the reactor had only returned to the 7% level. They knew that they were working with a system with dangerously low stability, but they were an award-winning group of well-qualified experts. Relying on their intuition and experience, the group continued, perhaps because they believed they were collectively too smart for anything bad to happen. To begin their experiment, the engineers shut off a steam pipe to observe the effects on other elements of the system.

Quite suddenly, the reactor began to react to the series of adjustments the engineers had made. With so many rods removed, the reactor was getting very hot very fast. Imagine the panic these men experienced as they watched the reactor go out of control. Quickly, they attempted to reinsert the ever-critical rods. Unfortunately, the pipes that received the rods had bent because of the heat and pressure, and the rods could not be pushed in. The reactor exploded within two minutes of the beginning of the experiment.

The immediate explosion took two lives.

On 26 April 1986 at 01:23:44 a.m. (UTC+3), reactor number four at the Chernobyl plant, near Pripyat in the Ukrainian SSR, exploded. Further explosions and the resulting fire sent a plume of highly radioactive fallout into the atmosphere and over an extensive geographical area. Four hundred times more fallout was released than had been released by the atomic bombing of Hiroshima.[1]
The plume drifted over extensive parts of the western Soviet Union, Eastern Europe, Western Europe, Northern Europe, and eastern North America. Large areas in Ukraine, Belarus, and Russia were badly contaminated, resulting in the evacuation and resettlement of over 336,000 people. According to official post-Soviet data,[2] about 60% of the radioactive fallout landed in Belarus.