

An Executive’s Guide To Cognitive Bias in Decision Making: A Career Intelligence Officer Provides Context on Fighting Bias in Judgement

This is part of a series providing insights aimed at corporate strategists seeking competitive advantage through better and more accurate decision-making. The full series is available at our special section on Decision Intelligence.

Cognitive biases and the errors in judgement they produce are seen in every aspect of human decision-making. I learned as a young intelligence officer that failure to keep these biases in check leads to flawed intelligence and can contribute to operational failures. Later, in industry, I witnessed first-hand how these cognitive biases cause problems in corporate decision-making. Companies that better understand these biases can optimize decision-making at all levels of the organization, leading to better performance in the market. Companies that ignore the impact these biases have on corporate decision-making put themselves at risk.

Here is a simple example: a very common human cognitive bias is to prefer that things do not change. This preference may be fine when stability is key to the organization’s strategy. But when change is needed, this bias can result in stagnation of business processes, failure to respond to customer needs, failure to innovate, and loss of market share to more agile, innovative organizations. This bias towards the status quo is well known, and leaders in corporations around the world know the importance of taking action to improve agility and make sure phrases like “we’ve always done it this way” do not become a corporate mantra.

There are many other biases that deserve attention to ensure your corporate strategy and operations are optimized. Many are actually more dangerous than the human desire for the status quo, but they can be harder to spot unless they are consciously checked for.

Below I provide my view of the most dangerous cognitive biases that threaten both your personal and organizational decision-making. These are the ones that will bite you.

Confirmation Bias: In the Intelligence Community this is considered the most deadly and dangerous bias. In the commercial world it can be the most costly to business. Confirmation bias results from a natural tendency to search for information that confirms our beliefs, or to interpret data in a biased way that confirms what we already think. The bias also causes us to discount information that might contradict our views. You can think of this tendency as having an internal “yes man” in your head who tells you that you are right about everything. This bias can be very strong, especially when emotional issues are at stake.

I saw many examples of this at play in my career in intelligence. Perhaps the most deadly intelligence failure of the last 20 years was driven by this bias: the official assessments of the status of Weapons of Mass Destruction (WMD) in Iraq. Everyone involved had plenty of reason to believe Saddam Hussein still possessed WMD. This led to horribly wrong national intelligence estimates and to the Director of the CIA famously telling the President that he was so confident Iraq had WMD that it was a “slam dunk.” A congressionally mandated study would later identify many reasons for the failure, including “an analytical process that was driven by assumptions and inferences rather than data.” Too frequently our analysts fell prey to confirmation bias.

You can probably find examples of this bias in your own organization as well. Confirmation bias contributes to overconfidence when evaluating your strategies, plans, proposals, or fundraising. It also contributes to poor resource decisions. Mitigating this bias is hard and usually requires multiple strategies and approaches, including training the organization to keep an open mind and be receptive to new information. It also requires leaders to ask good, neutral questions when seeking insights.

The Mirror Image Bias: In the Intelligence Community, and among the senior political leadership of the nation, there is a tendency to think that adversaries are similar to us, including that they would use the same logic and come to the same conclusions we would. This mirror image bias sets up decision-makers for surprise. The same bias can apply to corporate decision-makers when considering the potential decisions of a competitor or analyzing the buyers in various markets. The way to mitigate this bias is through better awareness of the true nature and culture of an adversary or competitor when analyzing their likely actions.

Attribution Bias (Why They Did It): In our mind’s search for clarity we frequently tell ourselves that we know why an organization or individual did something. In the Intelligence Community, answering the question of why a country or its leader did something is a frequent task. The problem is that when there is no data we may base an assessment on assumptions rather than evidence. That assessment is usually, in my experience, wrong. When details and data are found later, it usually turns out that adversaries made their decisions for many reasons, and the assumptions that were made may or may not have had anything to do with it. In the business world this bias contributes to bad decision-making by implying that you and your team have more insight into your competitor’s next move than you really do. The attribution bias can cause conclusions that do not reflect reality and impact both current and future decisions. Counter this bias by asking yourself how you formed an opinion on why an organization or individual did something. Are you basing your assessment on fact?

Cyber Amnesia Bias: There is a natural human tendency to pay more attention to what is right in front of us. The analogy often used is the “alligator closest to the boat.” There are dangers all around us, and the visible ones get the most attention. In cyberspace this bias is magnified to the point of causing significant errors in judgement. Cyberspace, our interconnected IT, is invisible. Humans cannot tell what is going on there without instrumentation and translation of data into forms we can understand. When danger becomes visible through a breach we take action, but soon, since cyberspace is invisible, we develop amnesia about the threat and act as if nothing could ever go wrong again. I have searched for ways to mitigate this bias for years and have to report that every method I have tried has fallen short. Things that can help include workforce education on cyber threats and study of cyber history and current cyber operations. Still, be aware that there will be a strong tendency in your workforce to think the threat to you in cyberspace has gone away. It is up to you to keep awareness up.

Anchoring Bias: This describes the tendency to rely too heavily on the first piece of information received when making decisions. Whatever is presented first can become an anchor that future judgements are based on. The easiest examples of this bias are in our daily shopping or in financial negotiations. If you are told a used car will be a certain price, then negotiations will proceed from that price, and the seller will want that anchor to be high. But these simple transactional examples are not our biggest worries with this bias.

In the Intelligence Community I have seen anchoring bias prevent assessments from considering a full range of options. I have also seen this bias shape broad discussions on budgets and technology decisions. For example, a senior leader in the technology domain once declared that all agencies would transition to a single common desktop for all analysts. Many of us saw this as irrational and illogical and had no intention of supporting it, but his ardent statement became an anchor that all further discussion was tied to. His anchoring of discussions this way added friction for those of us trying to do the right thing.

In industry, the anchoring bias also applies to business strategies, executive decisions, and planning assumptions. Consider an executive who tells his workforce he wants to consider opening a new branch in Tokyo, Seoul, or Singapore and wants to assess which is best. Already the workforce will have an anchoring bias towards Asia overall, perhaps with good reason. But the boss said Tokyo first; does that become a mental anchor? Mitigating this bias starts with using caution in any decision to ensure you are not simply assuming that the first thing you learned will always hold true.

Pattern Detection Bias: Humans have an ability to see patterns in data where none exist. This is well documented among gamblers, who may believe they see patterns in lotteries, card games, roulette, or slot machines; this is known as the gambler’s fallacy. But it also applies to anyone who works with data. This bias towards pattern detection is a key theme in Nassim Nicholas Taleb’s book “Fooled by Randomness”, in which Taleb presents example after example of times when humans overestimated causality when what they were seeing was really just randomness. It is human nature to view the world as more explainable than it really is. Examples include almost every explanation for stock market moves you hear on the news at night. Today a Bloomberg report read: “Stocks dropped and the dollar rose after the Federal Reserve minutes signaled tempered optimism about growth in the second half of 2020.” Do you really believe that is why stocks dropped? The reporter saw a correlation. Who knows, maybe there is one. But if you are basing decisions on information like that, you had better check the facts to be sure.

The point here is that correlation is not causation. Business leaders should ensure the information they are presented is not tainted with this bias by asking the right questions of staff and not being afraid to ask to see the data. With the rise of big data analytics in industry, finding ways to mitigate this bias is more important than ever. Data scientists know to test models and theories and know it is important to evaluate conclusions against historical and similar data. Executives should do the same, or should ensure their team is checking to confirm that the patterns being presented are really supported by the data. Even then, we should all maintain an awareness that we can be, as Taleb might say, fooled by randomness.
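To make the randomness point concrete, here is a minimal sketch in Python (an illustration added for this guide, not drawn from Taleb or any study cited here) that generates pairs of completely independent random walks and counts how often they appear strongly correlated. The function names and the 0.5 correlation threshold are arbitrary choices for the illustration, and the sketch assumes Python 3.10 or later for statistics.correlation.

import random
import statistics

def random_walk(steps: int) -> list[float]:
    # A random walk: each step is an independent +1 or -1 move.
    level, walk = 0.0, []
    for _ in range(steps):
        level += random.choice((-1.0, 1.0))
        walk.append(level)
    return walk

def spurious_correlation_rate(trials: int = 1000, steps: int = 250,
                              threshold: float = 0.5) -> float:
    # Fraction of trials in which two unrelated walks look "strongly" correlated.
    hits = 0
    for _ in range(trials):
        r = statistics.correlation(random_walk(steps), random_walk(steps))
        if abs(r) >= threshold:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    rate = spurious_correlation_rate()
    print(f"Unrelated random walks showed |r| >= 0.5 in {rate:.0%} of trials")

Run it a few times: a meaningful share of completely unrelated series will clear the threshold on chance alone, which is exactly why a reported correlation, like the Bloomberg example above, should be tested against more data before it drives a decision.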

The Power Bias: In any hierarchy, positions of power become apparent. This could be military rank in the DoD, SES/GS grades in civilian government, or VP/SVP/EVP/CEO titles in the corporate world. Individuals in these positions are used to making decisions and expecting them to be carried out, and they almost always bring a good bit of confidence and even optimism to the role. This all feeds a bias towards assuming they are always right. When they are right, that is good. When they are wrong, this can cause organizational failure. Ways to mitigate this risk include making sure all involved know it is ok to question the approach being taken and giving the workforce other feedback mechanisms.

Special Knowledge Bias: This is a variant of the power bias where the decision-maker has a tendency to believe individuals who have access to unique insights. In government this may be someone who has access to classified information. In industry it may be someone who has access to what the CEO is thinking or what the strategic planners of the corporation are considering. I have seen this bias multiple times, including times when it accidentally contributed to having my own decisions more rapidly adopted. Here is an example: upon returning to government to be the CTO of the Defense Intelligence Agency, I had to make many decisions, some of which would resolve large disputes between competing camps. There were a couple of times when I noticed people gave special credence to my views not because of my senior position, office title, or my technical insights, but because they believed that since I had spent time as an executive in industry I must have special knowledge or skills. In industry, people frequently joke about the consultant who must know a lot because they traveled from out of town to be here. These are all examples of special knowledge bias. To keep this bias from inappropriately influencing your decisions, remember not to judge people on what special knowledge you think they have, but on what special knowledge you know they have.

Framing Bias: This is our tendency to accept the boundaries or trajectory of a discussion because of the explanation given to us by authoritative figures. It is an influence we allow others to have over our thinking because of the way they presented information to us. This bias is different from most others because it involves entities that want to manipulate the discussion, but it plays right into our own bias to believe.

As an example, in the late 1990s I was part of a team investigating what would be known today as the first “Advanced Persistent Threat.” It was a sophisticated campaign backed by a well-resourced adversary that was conducting a major series of computer intrusions against US military, DoE, and academic computers. Law enforcement was investigating these crimes. My job was not law enforcement but intelligence, which allowed me to use different analytical methods and make assessments rather than build a court case. I made an assessment that the Russian intelligence services were behind this. A senior DoD decision-maker pushed back on my assessment, arguing: “So you are declaring that Russia has conducted an act of war by attacking our systems. You better be careful with your assessment; you are putting us on an escalation ladder!” This caused an immediate framing of the discussion. For a brief moment his framing gave me pause, but it was a framing I had to reject. As an operational intelligence officer it was my job to make assessments of adversary activity. Deciding how the nation would react should not taint my assessment.

This type of framing of problems to shape decisions also occurs frequently in industry. Testing whether problems are being framed the right way takes a good dose of skepticism and an ability to quickly assess the credibility of a person and whether they are serving their own parochial interests. It also helps to be able to shift a discussion outside the boundaries framed by others to ensure your interests are being addressed (and to ensure your analysis is not being tainted).

Halo Effect and Horn Effect: Another natural human bias is to form an impression of a person and then tend to stick with it over time. This also applies to organizations, products, or brands. That on its own is not a bad thing. The problem is when the halo effect causes decisions to be made for the wrong reasons. This effect is at play in business, of course. It could manifest itself in promoting a person into a job they are not really qualified for, giving a supplier a task they really cannot deliver on, or making other decisions based on the wrong weighting of observations. The horn effect is the opposite of the halo effect but rests on the same basic idea: individuals believe that negative traits are interconnected. If an observer dislikes one aspect of something, they will have a negative predisposition towards its other aspects.

Self-Serving Bias: It is perfectly natural for humans to need to maintain and even enhance self-esteem. But at times this can distort cognitive processes. When this bias kicks in it can result in a propensity to credit accomplishments to our own capacities and endeavors and to attribute failure to outside factors or others. When unchecked, this bias can cause people to dismiss the legitimacy of negative criticism, exaggerate their own accomplishments, and disregard their own flaws and failures. This bias has been known to affect behavior in the workforce and to impact organizational decision-making. In some cases it can be mitigated by training and education. Organizational activities may also help reduce its occurrence.

Status Quo Bias: This is the preference some have for the current situation. There are some who love to do things the same way even when the world calls out for change. This bias is so prevalent and widely known that it is surprising how frequently it is still at play. Managers and leaders in organizations around the globe deal with it through a variety of management and leadership techniques which you are no doubt already familiar with. Perhaps the most important way to mitigate this bias is to be ready to call people out when you think it is at play. They should be able to respond with logic and facts.

Concluding Recommendations:

There is no single inoculation that will prevent these biases from impacting your own or your organization’s decision-making. However, studying these biases will keep them front of mind and will most certainly improve your personal decision-making and ability to lead. Also consider ways to get this information deeper into your organization. Depending on your needs, you may want to provide training on cognitive biases to your entire executive team and your workforce as well.

Cognitive biases are just one of many components of decision-making. Additional study can be done on mental models for decisions (our favorite is the OODA Loop of course!).

References and Recommended Reading: 

My first exposure to cognitive biases was in the intelligence community, which had benefited from the writings of career intelligence analyst Richards J. Heuer. His writings were updated in a 1998 book titled “Psychology of Intelligence Analysis,” which is available as a free PDF on the CIA website. Part of the power of Heuer’s articles and books was his ability to identify the most relevant research on how the mind works and then write about it in ways contextualized for the intelligence community. His insights into cognitive biases and other very human thought traps stayed with me through every part of my career and today drive how I approach work in service to OODA clients.

His book was largely based on the work of Daniel Kahneman and Amos Tversky, who had been publishing a series of papers on cognitive bias. Their 1974 article in Science, titled “Judgment Under Uncertainty: Heuristics and Biases,” is still totally relevant today. It was later expanded into a larger book by the same title. Kahneman continues to publish on these and other related topics of cognitive decision-making. His book Thinking, Fast and Slow includes a copy of the article he wrote with Tversky that started this entire field of research.

In 1973 Paul Slovic published “Behavioral Problems of Adhering to a Decision Policy.” The research behind the paper made it clear that humans will frequently be more confident in their assessments just because they have more data, even if the data means nothing!

Nassim Nicholas Taleb has written extensively on the need for clear thinking. All his books are worth digesting, but one in particular deals with key cognitive biases like the very human tendency to see patterns where there are none:  Fooled By Randomness: The Hidden Role of Chance in Life and in the Markets.

Dr. Gary Klein has spent time both in academia and the corporate world applying research into how people develop expertise in chaotic situations. (His research was key to insights I published on the topic of Intuitive Intelligence in the Defense Intelligence Journal.) For more of his insights see: The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work.

The bias of Cyber Threat Amnesia has been examined at CTOvision (see Avoiding Cyber Threat Amnesia). It was further explored in A Fierce Domain by Jason Healey. This book also gives excellent methods for mitigating cyber threat amnesia, including study of the critical episodes of cyber conflict.

Speaking of books, one way to keep mitigating your own biases is to maintain an active reading program of your own and encourage others on your team to do the same. For a quick review of books we are certain will be of high interest to our readers, we recommend Matt Devost’s yearly list of Book Recommendations.

 

Additional Resources:

A Practitioner’s View of Corporate Intelligence

Organizations in competitive environments should continually look for ways to gain advantage over their competitors. The ability of a business to learn and translate that learning into action, at speeds faster than others, is one of the most important competitive advantages you can have. This fact of business life is why the model of success in air-to-air combat articulated by former Air Force fighter pilot John Boyd, the Observe – Orient – Decide – Act (OODA) decision loop, is so relevant in business decision-making today.

In this business model, decisions are based on observations of dynamic situations tempered with business context to drive decisions and actions. These actions should change the situation, meaning new observations, decisions, and actions will follow. This all underscores the need for a good corporate intelligence program. See: A Practitioner’s View of Corporate Intelligence

Optimizing Corporate Intelligence

This post dives into actionable recommendations on ways to optimize a corporate intelligence effort. It is based on a career serving large-scale analytical efforts in the US Intelligence Community and applying principles of intelligence in corporate America. See: Optimizing Corporate Intelligence

Mental Models For Leadership In The Modern Age

The study of mental models can improve your ability to make decisions and improve business outcomes. This post reviews the mental models we recommend all business and government decision makers master, especially those who must succeed in competitive environments. See: Mental Models for Leadership In The Modern Age

OODA On Corporate Intelligence In The New Age

We strongly encourage every company, large or small, to set aside dedicated time to focus on ways to improve your ability to understand the nature of the significantly changed risk environment we are all operating in today, and then assess how your organizational thinking should change. As an aid to assessing your corporate sensemaking abilities, this post summarizes OODA’s research and analysis into optimizing corporate intelligence for the modern age. See: OODA On Corporate Intelligence In The New Age

Useful Standards For Corporate Intelligence

This post discusses standards in intelligence, a topic that can improve the quality of all corporate intelligence efforts and do so while reducing ambiguity in the information used to drive decisions and enhancing the ability of corporations to defend their most critical information. See: Useful Standards For Corporate Intelligence

In Business, Like In War, Data Is A Weapon

Broadly speaking, a weapon is anything that provides an advantage over an adversary. In this context, data is, and always has been, a weapon. This post, part of our Intelligent Enterprise series, focuses on how to take more proactive action in use of data as a weapon. See: Data is a Weapon

Fine Tuning Your Falsehood Detector: Time to update the models you use to screen for deception, dishonesty, corruption, fraud and falsity

The best business leaders are good at spotting falsehoods. Some joke and say they have a “bullshit detector,” but that humorous description does not do justice to the way great leaders detect falsehoods. Bullshit is easy to detect. You see it and smell it, and if you step in it, it is your own fault. In the modern world, falsehoods are far more nuanced. Now more than ever, business and government leaders need to ensure their mental models for detecting falsehood are operating in peak condition. For more see: Fine Tuning Your Falsehood Detector: Time to update the models you use to screen for deception, dishonesty, corruption, fraud and falsity

About the Author

Bob Gourley

Bob Gourley is an experienced Chief Technology Officer (CTO), Board Qualified Technical Executive (QTE), author, and entrepreneur with extensive past performance in enterprise IT, corporate cybersecurity, and data analytics. He is CTO of OODA LLC, a unique team of international experts that provides board advisory and cybersecurity consulting services. OODA publishes OODALoop.com. Bob has been an advisor to dozens of successful high-tech startups and has conducted enterprise cybersecurity assessments for businesses in multiple sectors of the economy. He was a career Naval Intelligence Officer and is the former CTO of the Defense Intelligence Agency.