
Turnaround Management Association


Human Factors: The Dangers of ‘Confirmation Bias’

By Tyrone Courtman, Restructuring Expert

Some of you may be aware that I have a passion for all things aviation, which stems from my childhood dream to become an airline pilot… that was until I saw the light, qualified as a Chartered Accountant and became an expert in financial distress and corporate restructuring! But my interest remains.

One of the more sobering aspects of that interest is the detailed review of reports produced by the UK’s Air Accidents Investigation Branch (AAIB), which analyse the sequence of events culminating in an air accident. Their purpose is simple: to share the investigators’ professional analysis and conclusions, in the hope that the causal factors which led to the accident can be avoided in the future. One of the common themes in pilots’ decision making is confirmation bias, and I think there are many parallels, and lessons to be learnt, which readily translate into the world of business and of management teams in crisis.

So what is it? Confirmation bias describes our tendency to seek out and trust information that confirms what we already think, believe or want to believe, and to avoid or discount information that goes against those beliefs or wishes. This can help to explain the seemingly irrational behaviour that pilots often display, especially when they have a strong motivation to depart or continue a flight. Confirmation bias can strike first in the selection of what information to access, or whether to access any outside information at all. A pilot who desperately needs to make a trip may decide that “it doesn’t look that bad” or “it seems like it’s improving” or “it looks like it’s better just east of here,” and then not access any information sources that might negate that opinion. This could explain a pilot who:

  • Takes off in miserable conditions without getting weather information from any of the many available sources.
  • Fails to check icing reports even though the temperature is such that icing would be possible or even likely.
  • Takes off on a short high-elevation runway over gross weight without checking the weight and balance of the airplane or referencing the takeoff performance charts.

Even a pilot who seems to be carefully gathering and assessing all available information in a dispassionate manner may actually be heavily influenced by confirmation bias. The first problem is that, on a difficult weather day, the amount of information available is often more than a pilot can easily process. When this happens, our brains naturally start looking for something in all that information to help us make sense of what is going on. Research shows that we are more likely to notice information that we agree with or consider positive than to focus on information that is contrary to our beliefs or goals.

This analysis readily translates to management teams challenged by difficult trading conditions: teams that perhaps fail to recognise and acknowledge technological changes in the market in which the business operates, or that pin their hopes on next year’s business plan, incorporating double-digit revenue growth as a route to improving profitability, against a backdrop of the business never previously having achieved those growth rates and without any meaningful change in approach to their products or market.

Confirmation bias can also affect how we gather the information we need. Studies have shown that a person will frame questions so that an affirmative answer supports what they want to do. A pilot who has just received a weather report with a forecast for marginal conditions might say, “So, there’s a chance we could make it.” While the answer might be “Yes, there is a chance you could make it,” in reality that chance may be minuscule. The pilot in the grip of confirmation bias will focus on the positive answer and will not try to establish what the odds of making it actually are. To put it simply, we tend not to ask a question if we think we might not like the answer.

How the information is presented can also make a big difference in the decision a pilot makes. We tend to give a lot more weight to the information we receive early in a communication over information that is received later. Thus, in a weather report that begins with “skies are clear” and ends with a forecast for “low ceilings and visibility in blowing snow,” the fact that the weather is clear will have much more influence on the pilot’s decision than the forecast for low ceilings and visibility because it was presented first. A pilot who is told, “The forecast is for low ceilings and visibilities in blowing snow even though it is clear at this time,” would be more likely to make a conservative decision based on the forecast. It is believed that we start to form a hypothesis as the first information is received, and then are likely to discount information received later that might disprove that hypothesis.

On top of all this, we simply prefer pleasant thoughts to unpleasant thoughts. In flying, anything that supports our ability to make it to the destination on time is a pleasant thought, and anything that would cause it to be difficult or impossible to reach the destination on time is an unpleasant thought. Experiments have demonstrated that people typically demand more evidence to support an unpleasant outcome than a pleasant one. This explains why a pilot will easily accept the most tenuous bit of information that will allow him to continue on his way but will require much more evidence before recognizing that he should turn around.

Another problem is that the cost of taking the conservative response is often known and certain, whether it involves the cost of a rental car and a hotel room, the loss of a million-dollar opportunity or merely the embarrassment of not being able to make the flight, so these costs are very real to the pilot. On the other hand, the costs associated with pressing on in the face of evidence that it might not be wise to do so are uncertain. To make matters even worse, confirmation bias can also contribute to overconfidence in our abilities as pilots, so we are likely to think unreasonably that we can handle whatever conditions we might encounter. This in turn diminishes the likelihood that we will acknowledge that there could be negative consequences to proceeding on in the face of difficult conditions.

In a business context, failure for the owner-manager is not an option. Quite apart from the blow to the ego, failure will often result in the loss of an income stream, the crystallisation of personal guarantee liabilities, perhaps even the loss of their home. How many times have you struggled to realign the overconfidence of management to the realities of their predicament?

When you combine all the various aspects of confirmation bias, it is little wonder that we find pilots taking off or continuing a flight when it should be obvious that the runway is too short, that they don’t have enough fuel to reach the destination, or that the weather is beyond their capabilities; or management failing to acknowledge the challenges facing a business, failing to do anything about them, and carrying on when all the signs around them are that their chosen course is not working. Even more discouraging, these are traps of human nature, so we usually aren’t even aware when they are at work. Their impact has been felt in all aspects of society, from the financial crisis to military defeats to a major oil spill. Because confirmation bias is so strong and so pervasive, it takes strong tools to neutralise or counteract it. And that’s where you come in.

The first line of defense against confirmation bias is simply to be aware that it exists. Keep track of situations in which you find yourself or others making decisions that appear to be influenced by confirmation bias. Be especially alert whenever management are strongly motivated to reach a certain goal, or have a lot riding on a particular outcome. Listen to the words you and others use for questions or answers to see if they are biased toward a desired answer. Use aggressive skepticism to force yourself to consider the negative side of what might happen if things don’t work out as you hoped.

For pilots, the answer may be that you could die. Fortunately, the consequences in business are often not so dire, but if you find yourself tempted to continue on a particular course, grasping at a tiny positive hope while ignoring overwhelming risks, ask yourself whether the outcome you are seeking is worth destroying your business and your own financial position. Sometimes in military or search-and-rescue operations the answer may be “yes,” but in civilian life that is seldom the case. Finally, make a pact with yourself and your board that, if there is ever any question about how to proceed, you will take the most conservative course until you can come up with a definitive answer one way or the other.

In summary, people are prone to believe what they want to believe. Seeking to confirm our beliefs comes naturally, while it feels strange and counterintuitive to look for evidence that contradicts them. This explains why flawed opinions survive and spread. Yet disconfirming instances are far more powerful in establishing the truth, precisely because disconfirmation requires actively looking for evidence that would disprove a belief. The take-home lesson is to set out your hypothesis and then look for instances that prove you wrong. This is perhaps a true definition of self-confidence: the ability to look at the world without the need to seek out instances that please your ego.

Tyrone Courtman is a restructuring expert based in the Midlands in the UK, and formerly head of Restructuring at PKF Cooper Parry LLP.
