Just Culture: Are we sustaining a false belief?

Continued from part 1

None of this is just rhetoric on the part of those who want to keep operators out of jail. There is a large and growing research base on why accidents happen, and the balance of scientific opinion is clearly in favor of the systems view. Accidents happen not because individual people make errors. Accidents happen because entire systems break down. Once you open up the system in which an accident happened, and probe just a little deeper than the first errors and problems on the surface, you will discover much more. Many factors, all necessary and only jointly sufficient, have to align for an accident to happen.

It is not that a few people doing stupid things cannot be sufficient for an accident to happen. Rather, people doing stupid things is not necessary at all. Accidents happen when everybody is doing their job, their job as they and their organization see it at the time. Harmful consequences can occur even when everybody is following the rules. Accidents happen because the systems in which they happen are exposed to risk, uncertainty, competition and resource scarcity. To people working inside those systems, it is not a matter of doing stupid things; it is a matter of doing normal work in what seems to everybody to be a perfectly normal organization.

Let us now look at the second problem. Firing and punishing people does not create progress on safety. It does nothing to prevent such accidents from happening again. In fact, it may even push up the risk: firing and punishing people can increase the chances of having an accident again. Here is how it works. When we fire or punish people, we sustain a false belief: that the system is basically safe, and that with a few bad apples removed, it will once again be safe. But systems are not basically safe. There is no inherently safe system: systems are made safe by the people who operate them. All systems are compromises between safety and economy, between doing the job safely and doing the job at all; all systems are exposed to risk, to competition, to resource scarcity. All systems are subject to communication problems and political or managerial priorities. There is no perfect system, and no perfectly safe system. Systems are safe only by virtue of the people who work inside them.

When we punish a few individuals inside that system, a number of well-documented adverse consequences follow. Most importantly, the people who work inside the system become afraid. They will stop reporting the safety problems they see. They will not mention any errors or problems or incidents, for fear of being connected to them and losing their jobs or being otherwise punished. A system in which no communication about safety issues exists is a risky, dangerous system. It is a system that does not learn from its own operations. It is a system that cannot and does not improve. It is a system that is much more likely to have an accident than one in which people feel free to talk about safety problems. But as soon as people get punished for “errors”, they and their colleagues shut up. Such a closed, fearful system is a dangerous system, not a safe one.

Threatening operators with cover stories such as “they should be disciplined in carrying out their work” or “they should have an ethical awareness” is equally counterproductive. It entirely misses the point of what makes systems dangerous or safe. What makes systems safe is an awareness of the safety threats that are out there (this requires people to feel free to talk about them). What makes systems safe is an awareness of where the boundaries of safe performance lie, and how close the operation is to those boundaries (this, too, requires people to feel free to talk about the errors and problems they encounter). What makes systems safe is the realization that it is the entire system that succeeds, and the entire system that fails. Not individual heroes or anti-heroes.

What makes systems unsafe is punishing people. What makes systems unsafe is the illusion that systems would be inherently safe, if it weren’t for a few bad apples.

This article was written in 2004, as a response to the outcome of the Linate accident. Sadly, the text remains as relevant today as it was 15 years ago…