Meltdown
The airline industry studies every crash in great detail to learn the lessons only failure can teach. Lessons are emerging for other industries in which failure has potentially catastrophic consequences. In Meltdown, Chris Clearfield and András Tilcsik examine the common factors of situations as diverse as a bank failure, a surgical mishap and a nuclear reactor meltdown.
Three stand out:
The systems involved are highly complex -- not just complicated, but so multi-factored and densely interactive that only computer models can come close to making sense of the cause-and-effect relationships among their ‘parts’ (organs, employees, machine components, financial flows, and so on).
The components are tightly coupled, or closely connected, so that activity or change in one component quickly ripples through the whole system like a falling chain of dominoes: one small touch, and the failure cascades.
The humans involved must rely on intermediate technologies for information about what is happening, but often receive so much information, so fast, that they cannot process it into a sound judgment about what to do.
These situations interest me for two reasons: they grow more common as more systems become complex (a dam regulating water level, for example, used to be a simple system, but dams are now highly complex systems that require computer management through sophisticated instrumentation), and they illustrate the tension between concrete reality and our abstraction from that reality, a tension that can sometimes cause a dangerous disintegration. For the most part, these systems work well. When the data does not correspond to the reality, however, or when a human operator is too overwhelmed by data to construct a response that adequately corresponds to the reality, meltdowns occur.
What do you do when you face a crisis of confusion? Research shows that most people start by following procedure (like good habits, good training in protocols and procedures is a great help). Then, if that doesn't work (and without correspondence between reality and action, it may make things worse), they either focus narrowly on, and act on, one reason, factor, or solution (thus missing information outside that narrow frame of focus), or defer to authority (thus depriving the person in charge of a full spectrum of perspectives). This is all interesting, but what really caught my attention is the way this research has helped prevent meltdowns.
Communication is of the utmost importance. We may make better checklists, do more thorough training, add more fail-safe instruments, and run more simulations, but we must restore the capacity of the human actors to act in correspondence with reality in the totality of its factors. For that we must create a context that takes human beings into greater account. We must carefully craft verbal structures (even script and rehearse them) through which the full complement of human whole-knowing, supportive social engagement, and effective communication can occur.
No simulator can predict every possibility. When faced with an emergency, you – the pilot, the surgeon, the nuclear plant manager – need your humanity and the humanity of the wondrous beings around you to cope with reality adequately. Time after time, disaster has been averted by well-trained human teams supported in crisis by a strong matrix of good protocols, good communication, and awareness that the situation’s pressure to collapse them into human doings can and must be consciously resisted. The exercise of one's own judgement, one's own authority, and one's own whole-knowing must be cultivated, invited, and shown to be of critical importance. The tendency to relinquish free agency to expertise, power, or authority must be overcome if lives are to be saved. This sounds like good advice for maintaining full spectrum freedom under pressure: weave a context that supports human being.
These thoughts originally appeared in Full Spectrum Freedom.