
Blinkered vision?

Andrew McGregor and Barry Hughes consider the role of cognitive biases in incidents and incident investigations.

(This story was first published in the Jan/Feb 2012 edition of Safeguard magazine.)

To trust or not to trust – intuition

In his 2005 book Blink, Malcolm Gladwell brought to the public’s attention a large body of scientific work suggesting that our minds often engage in a form of thinking that is frequently accurate despite being rapid-fire: subconscious reactions that constitute a distinct category of thinking.

While Gladwell emphasised the utility of a fast-deciding intuition, we suggest that it is the source not only of errors that cause incidents, but also of compromises in their investigation.

Using applied cognitive psychology for safer thinking  

This article aims to introduce the reader to some practical implications of cognitive psychology. It suggests that an awareness of our susceptibility to fast rather than more analytical thinking, together with a willingness to think about our thinking, can not only help us to be safer, but also help our investigations to be more comprehensive, fair, holistic and preventative.

The problems with thinking too fast

Psychologists associate fast, intuitive cognition with terms such as heuristics or cognitive biases: collectively, they constitute a mode of thinking that is fast precisely because it is shallow.

Of course, fast, intuitive decisions are sometimes the only ones to make: we have too little information or time, or lives are at stake, or contributing variables are complex. Sometimes these decisions can be as accurate as more deliberate ones.

But sometimes such decisions are disastrous. And while it is accepted that accidents frequently arise from such thinking, we argue that investigative processes can be undone by precisely the same modes of thinking that caused the accident, even when time or resources are not so limited. Incident investigators are people, too.

Thinking twice

Nobel Laureate Daniel Kahneman, in his 2011 book Thinking, Fast and Slow, has shown that fast decisions emerge (from what he calls System 1) rapidly and automatically, and that it takes an act of will and conscious oversight to engage a more rational thinking mode (System 2). His work is valuable precisely because it speaks to the human condition, not just the error-prone, the unskilled or the clumsy.

This research is remarkably applicable to a wide range of contexts. Wray Herbert, in his book On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits, has focussed on heuristics: those intuitive rules of thumb that produce an answer when knowledge is deficient, or time is limited, or perhaps social pressures are more intense.

Is anyone prepared to take a hot air balloon ride? If the answer is a fast and emphatic ‘No’, this is more likely because of the recent disaster in which eleven people died than through any more deliberate consideration of ballooning safety. Familiarity is a product of intuitive, rapid reaction processes: that disaster is fresh in our memory, emotionally charged and easily retrieved. Why think more deeply when the answer – or at least an answer – has already come to you?

Who among us has not seen an accident and then, and only then, claimed that it was bound to happen sooner or later? This is hindsight bias, and it is particularly relevant in incident investigations. These types of heuristics also exist in our work practices.

People draw upon the familiarity heuristic to develop cost-effective efficiencies and time savings. But this can also cause mistakes. The mental shortcuts that save time and money can blind good people to seemingly obvious hazards which, if missed, can make them look stupid and negligent.

Taking 2

Sometimes techniques and measures are put in place to address these biases. An effective and simple de-biasing technique commonly used in industry is the catchphrase “Take 2”: stop and think before acting. This self-talk helps shift our minds from the relatively fast and intuitive System 1 mode to the slower and more deliberate System 2, and encourages us to think more deeply and prepare for more complicated analysis.

Navigating bias intelligently

But investigation processes do not have any tried and proven de-biasing techniques. Herbert warns: “We have a powerful bias for sticking with what we already have, not switching course. Unless there is some compelling reason not to, we let our minds default to what’s given or what has already been decided.”

System 1 heuristics can lead an investigator, even one experienced within the industry or work practice under investigation, to miss the possibility that a rule, code or standard is deficient or at fault or needs reviewing.

Instead, an individual accident participant is more likely to be judged at fault, particularly if he or she succumbed to a familiarity heuristic which, in hindsight, looks foolish. This is another instance of hindsight bias: the inclination to see past events as having been predictable or foreseeable.

Confirmation bias often works in tandem with hindsight bias. It is the tendency to actively seek out and interpret evidence as supporting or confirming a desired outcome rather than challenging it.

Bias awareness 

Without an awareness of heuristics and biases, the investigator is likely to interpret the evidence at hand in favour of an established system, and at the expense of the individuals involved.

Unless we remind ourselves that these are human traits to which even investigators are susceptible, it may be difficult to justify the time and expense of digging deeper to see whether other factors were also involved. If these are missed, important risks may not be identified, and the accident or incident is likely to recur.

Missing the bias

When the system misses an opportunity for correction or improvement, Kahneman calls this ‘an illusion of validity’. In Andrew McGregor’s experience, approximately 50% of the accidents and incidents that he investigates are partly attributable to established systems or processes, or have previously been poorly investigated in similar contexts.

Banishing hindsight for a safer future

Sidney Dekker, in his 2002 book The Field Guide to Human Error Investigations, has suggested a simple and effective de-biasing process to help investigators address the risk of hindsight bias. He maintains that in order to prevent accidents properly, one has to seek to understand what was going on in the mind of the accident participant at the time, and to do this consciously without the benefit of hindsight. Otherwise someone else is likely to repeat the mistake.

If investigators are more aware of the downside of intuitive, System 1 judgments, and become aware of occasions when they are relying on such judgments, investigation results are more likely to be realistic and accurate, and the recommendations more effective in preventing future accidents.

 If only our friends in white wigs and suits could see it that way.

 About the writers:

Andrew McGregor is a consulting mechanical engineer, experienced project manager and a commercial pilot with international training and experience in air accident investigation. He is a director of Prosolve Ltd, a forensic engineering practice.

Barry Hughes lectures in cognitive psychology at the University of Auckland, where he is also a member of the Research Centre for Cognitive Neuroscience.

What do you think?

Does intuitive thinking support a flawed safety culture? Is this straight or crooked thinking?
Please post your comments below. Let's hear it from you.

 

 

 

