The human factor in accidents

Author

Jason Collins

Published

June 17, 2015

The passage below is from a neat article on how mistakes can save lives.

CRM [crew resource management] was born of a realisation that in the late 20th century the most frequent cause of crashes wasn’t technical failure, but human error. Its roots go back to the Second World War, when the US army assigned a psychologist called Alphonse Chapanis to investigate a curious phenomenon. B-17 bombers kept crashing on to the runway on landing, even though there were no apparent mechanical problems with the planes. Rather than blaming the pilots, Chapanis pointed to the instrument panel. The lever to control the landing gear and the lever that operated the flaps were next to each other. Pilots, weary after long flights, were confusing the two, retracting the wheels and causing the crash. Chapanis suggested attaching a wheel to the handle of the landing lever and a triangle to the flaps lever, making each easily distinguishable by touch alone. Problem solved.

Chapanis had recognised that human beings’ propensity to make mistakes when they are tired is much harder to fix than the design of levers. His deeper insight was that people have limits, and many of their mistakes are predictable effects of those limits. That is why the architects of CRM defined its aim as the reduction of human error, rather than pilot error. Rather than trying to hire or train perfect pilots, it is better to design systems that minimise or mitigate inevitable human mistakes.

In the 1990s, a cognitive psychologist called James Reason turned this principle into a theory of how accidents happen in large organisations. When a space shuttle crashes or an oil tanker leaks, our instinct is to look for a single, “root” cause. This often leads us to the operator: the person who triggered the disaster by pulling the wrong lever or entering the wrong line of code. But the operator is at the end of a long chain of decisions, some of them taken that day, some taken long in the past, all contributing to the accident; like achievements, accidents are a team effort. Reason proposed a “Swiss cheese” model: accidents happen when a concatenation of factors occurs in unpredictable ways, like the holes in a block of cheese lining up.

James Reason’s underlying message was that because human beings are fallible and will always make operational mistakes, it is the responsibility of managers to ensure that those mistakes are anticipated, planned for and learned from. Without seeking to do away altogether with the notion of culpability, he shifted the emphasis from the flaws of individuals to flaws in organisation, from the person to the environment, and from blame to learning.

The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress.

I recommend reading the full article. Among other things, it has a lot of interesting material about mistakes in medical settings.