How to Avoid Catastrophe
What are near misses?
Near misses are often unremarkable small failures that permeate day-to-day business but cause no "apparent" harm.
People are hardwired to misinterpret or ignore the warnings embedded in these failures, and so they often go unexamined.
If conditions were to shift, these near misses could erupt into chaos and crisis.
When disaster happens, numerous poor decisions and dangerous conditions have contributed to it.
With near misses, we overlook the warning signs. Rather than raising alarms and prompting action, each near miss lets us move on because nothing bad happened.
We accept the fact that nothing went wrong as evidence that we are making the correct decisions.
Multiple near misses normally precede every disaster and business crisis.
Most of these near misses are ignored or misread. Our cognitive biases conspire to blind us to them.
Two particular cognitive biases cloud our judgment.
1. Normalization of deviance – the tendency over time to accept anomalies, particularly risky ones, as normal.
Things we become too comfortable with become normalized.
Therefore, what should be perceived as dangerous can feel safe in our minds because no dangerous event has ever occurred.
2. Outcome bias – the tendency to focus on results more than on the often unseen, complex processes that produced them.
Near misses should be instructive failures whose lessons leaders can apply to improve operations and ward off catastrophe.
However, when people observe successful outcomes and fail to recognize or learn from near misses, it is not simply a matter of not paying attention.
Roots of crisis
When people observe a successful outcome, their natural tendency is to assume the process that led to it was fundamentally sound, even when it was not.
Organizational disasters rarely have a single cause.
They are initiated by unexpected, seemingly unimportant small latent human errors, such as:
technical failures
bad business decisions.
These latent human errors align with enabling conditions to produce a significant failure.
Enabling Conditions are factors in the environment that contribute to an event happening.
Latent errors often exist for long periods of time before they combine with enabling conditions to produce a significant failure.
Whether an enabling condition transforms a near miss into a crisis normally depends on chance.
Thus, it makes little sense to try to predict or control enabling conditions.
Instead, companies should focus on identifying and fixing human errors before circumstances allow them to create a crisis.
Because latent errors are normalized by bias, near misses become increasingly acceptable. Further, deviances caused by the near misses are also normalized.
Remember: These latent errors underlying a crisis exist long before the crisis happens.
These deviances are cognitively ignored because of our outcome bias. The latent errors only become apparent when a crisis gains momentum.
Only when coupled with the right enabling conditions does a latent error trigger a crisis and let it erupt.
Recognizing and preventing near misses
Research suggests there are seven strategies that can help organizations recognize near misses and root out the latent errors behind them.
1. Heed high pressure
The greater the pressure to meet performance goals, the more likely people are to discount near miss signals or misread them.
Classic cases of normalization of deviance are exacerbated by political pressure.
Pressure can create an atmosphere that increasingly accepts less than expected performance.
Research shows that when people make decisions under pressure, they tend to rely on heuristics, or rules of thumb.
Thus, they are more easily influenced by biases in high pressure work environments.
People who are more easily swayed by outcome bias are:
more likely to normalize deviance
more apt to believe that their decisions are sound.
2. Learn from deviation
Research shows that decision makers clearly understand the statistical risk represented by deviation, but become increasingly less concerned about it.
It is important that leaders seek out operational deviations from the norm or from specific rules and examine whether their reasons for accepting or tolerating the associated risk have merit.
The questions to ask: Have we always been comfortable with this level of risk? Has our policy toward this risk changed over time?
3. Uncover root causes
When leaders identify deviations, their reflex is to correct the symptom rather than its cause.
Leaders should create an intentional process for reporting near misses.
Reporting mistakes and near misses should be encouraged so the lessons can be teased out and applied.
4. Demand accountability
Even when people are aware of near misses, they tend to downgrade their importance. One way to counteract this is to hold leaders responsible for their assessments of near misses and to require them to justify those assessments.
5. Consider worst case scenarios
People tend not to think through the possible negative consequences of near misses unless they're expressly advised to do so.
Research shows that examining events closely helps people distinguish between near misses and successes.
Research also suggests people will often adjust their decision-making accordingly.
6. Evaluate projects at every stage
When things go badly, managers conduct post-mortems to determine causes and prevent recurrences.
Research suggests this is too late.
When things go well, however, few managers do a formal review of the success.
Because near misses can look like successes, they often escape review.
When leaders critically examine projects while they are under way, they can counteract bias and are more likely to see near misses.
A technique called the pause-and-learn process typically uncovers near misses that have previously gone undetected.
7. Reward owning up
Noticing and attending to near misses requires people to be motivated to expose them.
In many organizations, employees have good reason to keep quiet about failures.
Conclusion
Two forces conspire to make learning from near misses difficult:
normalization of deviance, and
outcome bias.
When leaders do not recognize these biases, they tend not to grasp the significance of near misses.
Organizations often fail to expose and correct latent errors even when the cost of doing so is small.
They miss the opportunity to improve and learn from these small mistakes.