In August, Bob Gourley had a far-ranging conversation with Sir David Omand. One of the most respected intelligence professionals in the world, Omand is also the author of the book How Spies Think: Ten Lessons in Intelligence. His career in intelligence began shortly after graduating from Cambridge in 1969, when he joined the UK’s GCHQ (Government Communications Headquarters). He would later become the director of GCHQ. He also served as the first UK Security and Intelligence Coordinator, the most senior intelligence, counter-terror, and homeland security position in the UK.
In May of 2020, Bob’s OODAcast conversation was with Carmen Medina, who served 32 years in senior positions at the Central Intelligence Agency, most of which focused on one of the hardest tasks in the community, that of analysis. Carmen rose to lead the strategic assessments group for the agency, then was deputy director of intelligence, the most senior leadership position for analysis at the agency, and one of the most senior positions in the entire intelligence community. She also spent time as the director of the Center for the Study of Intelligence, where she oversaw ways to teach, mentor, and improve analysis for the community.
We continue our effort to underscore certain patterns and themes found throughout the OODAcast library of over 80 conversations with leaders and decision-makers, on topics such as leadership, empowering a team, finding the right people, clear decision-making while operating in a low-information environment, situational awareness, the qualities and best practices of a true leader, the nature and history of intelligence, the importance of Joint Ops/Intel operations, the future of intelligence, and the future of cyber threats and cyber espionage.
These two conversations hit on topics such as disinformation, cognitive bias, cognitive traps, biased decision-making during the pandemic, strategic action, the paradox of warning, and decision-making.
Bob and Sir David begin by discussing the framework Sir David developed to capture the essence of how intelligence analysts and operational decision-makers can deal with the modern information environment through perception and analysis. He calls this the SEES model. SEES stands for:
Situational Awareness: A baseline understanding of the situation gained through observation.
Explanation: Contextualizing facts so they can be better understood.
Estimation: The formal methods used to articulate what is known and what may happen in the short term.
Strategic Notice: Early warning of possible future developments, so decision-makers are not caught unprepared.
“My first lesson in intelligence is we need the humility to recognize that our knowledge of the world is always fragmentary. It is incomplete, and it is sometimes wrong. So do the best you can…this is an art, not a science.”
Sir David Omand: S-E-E-S, that is what the good analysts do, they are seeing. So, the first S in SEES is situational awareness, and that is accessing data about what is happening on the ground or in cyberspace. We all have factual questions that we start with – what, when, and where – and before we start arguing about what to do about some situation, let us nail down some reliable situational awareness. Now, this is where the intelligence analysts can say, yes, but be careful: your choice of where to look for evidence can distort the picture. You see, this is what happens with internet filter bubbles, where we are only accessing information that has already been filtered to suit our worldview. This is where you can fall victim to rumors on the internet, for example, and sometimes to deliberate deception. So, my first lesson in intelligence is we need the humility to recognize that our knowledge of the world is always fragmentary. It is incomplete, and it is sometimes wrong. So do the best you can. Be very careful that you are not being swept along by fake news or deception or rumor, but just be conscious that this is an art, not a science.
Bob Gourley: You also make a point in the book: be careful, do not buy in a hundred percent and lock yourself into what the intel analyst told you yesterday – because they may come back with a change.
Sir David Omand: Absolutely. A lesson in intelligence is that the facts we have established, our situational awareness, are dumb by themselves. Even with the best artificial intelligence and the best data science, data needs explaining. Let me give you a simple example from the defense lawyer’s world. You have a young accused man whose fingerprints were found on the fragments of a bottle that was thrown at a police patrol. So, he is in court and the prosecution says, well, we have the fingerprint evidence, it is incontrovertible, it shows he is guilty. And the defense lawyer stands up and says, no, no, this bottle was seized from outside the accused’s house. It was out there in a box waiting for recycling. The mob charged past and picked up the bottle. His fingerprints are, of course, on the bottle, but for a reason that is innocent. And so the same forensic fact, the fingerprint, has two different explanations.
That is what lawyers do: they ask, what is the best explanation consistent with the available facts? During the Second World War, British intelligence found that these lawyers, the high-grade barristers, were the best people to put in charge of crafting deceptions to fool the German high command. They were experts in presenting a case consistent with all the information available to the Germans, but carefully marshaled to lead them away from the truth and towards the desired deceptive conclusion.
“…getting the explanation right is really important. And what you must do is you test your alternative hypotheses against the data, and you look – this is the point that some people hesitate over – you look for the explanation with the least evidence against it. Not necessarily the one with the most in its favor…”
The second output of the SEES model, the first E of SEES, is Explanation. Why? How? What do I mean? Why do we think that Russia was responsible for the NotPetya cyber-attack? How did it end up costing global business over $10 billion and nearly destroying Maersk, the world’s largest shipping company? So, you need to explain what has happened. In that case, the most probable explanation is yes, it was indeed the Russians, and they launched this attack through some tax software used by Ukrainian companies. It escaped into the wild and did all this damage, but they did not intend to do that damage.
Now, if the explanation our analysts had produced had said, yes, this was a deliberate sabotage attempt on global capitalism from Russia, you would have had a quite different response to Russia. So, in the world of international affairs, getting the explanation right is really important. And what you must do is test your alternative hypotheses against the data, and you look – this is the point that some people hesitate over – you look for the explanation with the least evidence against it. Not necessarily the one with the most in its favor, because if you look hard enough, you can always find some evidence that supports an argument, however crazy. That is what QAnon does. But it should only take one solid piece of contrary evidence to show anyone whose mind is open that an explanation is inadequate. Sadly, in 2003, in the run-up to the invasion of Iraq, we looked too hard for evidence that supported what we thought was the case, and which turned out not to be – the infamous stockpiles of weapons of mass destruction. But you can always find stuff – evidence – if you look hard enough. The key is to look all around for the information.
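Sir David’s rule of preferring the explanation with the least evidence against it can be made concrete in a few lines of code. The sketch below is purely illustrative: the hypotheses, evidence, and consistency judgments are invented, and the scoring is a bare-bones version of the competing-hypotheses style of reasoning he describes, not a tool from the book.

```python
# Hypothetical sketch: choose the explanation with the least evidence against it.
# The hypotheses and consistency judgments below are invented for illustration.

evidence = [
    "fingerprints found on the bottle fragments",
    "bottle had been left outside the house for recycling",
    "no witness saw the accused at the scene",
]

# For each hypothesis, record whether each piece of evidence is
# consistent (True) or inconsistent (False) with it.
hypotheses = {
    "the accused threw the bottle": [True, False, False],
    "someone else took the bottle from his recycling": [True, True, True],
}

def evidence_against(consistency):
    """Count the pieces of evidence that argue against a hypothesis."""
    return sum(1 for ok in consistency if not ok)

for name, consistency in hypotheses.items():
    print(f"{name}: {evidence_against(consistency)} piece(s) of evidence against")

# Pick the least-contradicted explanation, not the most-supported one.
best = min(hypotheses, key=lambda h: evidence_against(hypotheses[h]))
print("Least-contradicted explanation:", best)
```

Scoring by contrary evidence rather than supporting evidence mirrors the point above: one solid piece of contrary evidence should be enough to knock out an explanation, however much support it appears to have.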
So, nailing down an explanation is really hard, but it is important. And therefore, the second E in SEES is Estimation. So, if you have some decent facts, the data you have managed to get on the situation, and if you have a good explanation that you think stacks up, you can use that to try to estimate how events might work out, particularly in the shorter term. So, this is a kind of modeling. It is what the scientists are doing with COVID-19 every day as we speak. You know, they have data, they have models of transmission and of how the disease operates, and they are providing governments with estimates based on these assumptions: this is what we think is probably going to happen over the next month or whatever.
And it is what is called in the trade Bayesian inference, after the 18th-century cleric from Tunbridge Wells in England, Thomas Bayes. Some of your listeners will have heard of Bayes’ theorem. You work scientifically back to infer causes from observation. You have the facts on the ground, and you can work out from that what situation is likely to have caused those facts to be in that place. The Reverend Bayes’s first demonstration was, as it were, a billiard table: you have a pattern of balls on the billiard table, you ask what the starting position was, and you can estimate that from your observation. So, you use this data about the world to explain the recent past, and you try to give – particularly governments, but equally companies – estimates of how things are going to unfold.
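Bayes’ theorem is the machinery behind this working back from observation to cause: the probability of each candidate cause is updated in proportion to how well it explains what was observed. A minimal sketch follows; the two hypothetical causes and the prior and likelihood numbers are invented purely for illustration.

```python
# Minimal Bayesian update: infer which cause best explains an observation.
# The priors and likelihoods below are invented for illustration only.

priors = {"cause A": 0.5, "cause B": 0.5}        # P(hypothesis) before the observation
likelihoods = {"cause A": 0.8, "cause B": 0.2}   # P(observation | hypothesis)

# Bayes' theorem: P(H | D) = P(D | H) * P(H) / P(D)
p_observation = sum(likelihoods[h] * priors[h] for h in priors)  # P(D)
posteriors = {h: likelihoods[h] * priors[h] / p_observation for h in priors}

for h, p in posteriors.items():
    print(f"P({h} | observation) = {p:.2f}")  # cause A: 0.80, cause B: 0.20
```

Repeating the update as new observations arrive, with each posterior becoming the next prior, is one way to picture how such estimates get revised over time.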
You have opened a new branch in a foreign country, and you have some information about what is going to happen to the market over the next month or so. But the important point here is that these are probabilistic estimates. They are not predictions. Certainly, British intelligence analysts always try to avoid using the word prediction. Nobody can tell the future. There are no crystal balls. But within limits, you can estimate things, particularly if there is what is known in the trade as path dependency: you have a sound explanation, and that is not going to change overnight. But there are, again, all sorts of problems and traps you can fall into if you are not careful. It is crucially dependent on the assumptions that you make, and what if you get your assumptions slightly wrong? So, making the assumptions public and being prepared to argue about them is an important part of that kind of modeling, just to round it off.
“A bias that I think has affected us during this COVID-19 period is this bias I think almost all of us have, which is to just assume that worst-case scenarios do not happen.”
Bob Gourley: Carmen, you write about thinking and how we think, and you have been doing that for years, ever since you transitioned out, but just recently you have been writing about how we think in the context of the coronavirus. And I want to ask for a little more context around this concept of cognitive traps, right? What did you mean by that?
Carmen Medina: I think that we all have, for lack of a better word, a neurological disposition. I mean, we are learning a lot about how our brains are wired, and then our experiences affect how we think. And so, we all have tendencies, and it is very important to recognize that nobody can be objective; the idea, the goal, of a perfectly objective analyst is just fallacious. It is a dream. The best we can get is to try to achieve objectivity about our own biases, you know, to come to understand them. So, I have a bias: I am an optimist by nature, and it is very hard for me to accept pessimism. For example, something I got wrong: I just did not see how Yugoslavia was going to break up in the early 1990s. I could not figure out why people who had McDonald’s would fight a war.
That was definitely my optimism bias showing up. A bias that I think has affected us during this COVID-19 period is one I think almost all of us have, which is to just assume that worst-case scenarios do not happen. You know, worst-case scenarios are unlikely. And in reality, the impact of a particular situation is independent of its probability. So, there are two variables that are independent of each other. And yet in common thinking, and not just in common thinking but among policymakers, it is very normal to think, oh, well, that is the worst-case scenario, so it is not going to happen. And that is what happened with COVID-19. People could see what was happening in China, and then Iran, and then Italy, and for reasons that completely escape me, they assumed it would not occur in the United States. In retrospect I cannot really explain it, but we lived through it, so we know that happened, right?
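Carmen’s observation that impact and probability are independent variables is worth a worked comparison. The sketch below uses entirely hypothetical probabilities and loss figures; it is only meant to show why a low-probability, high-impact scenario can still dominate the risk calculation.

```python
# Illustrative only: probability and impact are separate axes of risk.
# The probabilities and loss figures below are hypothetical.

scenarios = {
    "likely but mild disruption": {"probability": 0.50, "impact": 1_000},
    "unlikely worst-case event":  {"probability": 0.02, "impact": 1_000_000},
}

for name, s in scenarios.items():
    expected_loss = s["probability"] * s["impact"]
    print(f"{name}: expected loss = {expected_loss:,.0f}")

# Output: 500 vs 20,000. The "it is the worst case, so it will not happen"
# shortcut ignores that the rare scenario carries forty times the expected
# loss in this made-up example.
```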
I recall being at a conference in mid-January, January 16th, in Minnesota, and there was a contingent of attendees there from Seattle. And I was thinking, golly, if or when the virus comes to the United States, Seattle is going to be a likely initial port of entry because of the Asian American connection. And of course, it was the Lunar New Year, so there was a lot of travel going on. And when I looked at what was happening in Wuhan, and that the Chinese government was closing down what is essentially its Chicago, right? It is a major industrial city – closing it down and shutting down Lunar New Year, right? Stopping the celebrations. I was thinking, wow, why would a government do something like that? For the Chinese Communist Party, whose legitimacy depends upon economic growth, it just made no sense to me. This is one of my favorite heuristic devices as an intelligence analyst: that actions reveal intentions and motivations – that when someone does something, they are doing it for a reason, and you should try to figure out what the most likely reason might be.
Bob Gourley: That really resonates with me. It really does. And I just wish I were better at that personally. And I wish our policymakers were better at that. Additionally, you write about the fact that sometimes even when you have all the information and all the evidence you come to the wrong conclusion. Can you explain what you mean by that?
Carmen Medina: It is an interesting phenomenon, because you would think that scientists would be trained in the appropriate use of evidence. But we know, for example, that the scientific community is full of this replication problem, right? Someone will conduct one experiment and think it has some kind of universal meaning, and then nobody else can replicate it. So, when you draw a definitive conclusion, you really have to have a strong body of evidence. And the corona crisis shows that everyone, not just non-experts but scientists and politicians, is prone to cherry-pick the evidence that suits their bias. And you know, I loved when people would say, oh, it is no more dangerous than the flu, only 500 people have been killed. And I am going, well, it has just gotten started. I mean, there is this element of time as it unfolds, right? We are going to have more casualties. Even that Stanford study, the first iteration of the one that people point to as not as bleak as so many of the others: the first time I read it in mid-March, it predicted that only 500 Americans would die. And I remember reading that at the time and thinking it is just bogus; there is no way it is going to be just 500 Americans. And yet a lot of people, a lot of experts hoping that it would be true, glommed onto it, right?
“My fourth lesson in intelligence is if you devote sufficient effort to acquiring the strategic notice and you use it, you do not have to be so surprised when the surprise happens.”
Sir David Omand: The final S in SEES is quite different, because my experience is that you get really focused on producing these estimates, telling people how things are likely to unfold, how the world is likely to unfold – and then something completely unexpected comes and hits you on the back of the head. There is another way of giving the decision-maker or the businessman or the government information which is quite useful, and that is the final S, which is strategic notice. And that is working the other way around: not starting with today and yesterday and trying to work forward, but starting with the future and trying to work back to the present. So, what are the developments in the future that you can think of which, were they to come about – they might not, but if they did – would land us in some kind of difficulty? Or perhaps there are opportunities: would we be able to seize them? I will give you an example.
Suppose China is the first to develop a workable quantum computer at scale. Nobody has done it. I mean, Google has made some great progress, but they are not really quite there yet. Suppose China found a completely novel way of doing it and they did it. What would they do with it? Would they give it to the bureau of state security, who would then have the ability to decrypt virtually all our internet communications? Major financial flows around the world, our defense communications – they are all based on algorithms that use public-key encryption, which would be susceptible to that. So that is an example where it may not happen, but you can envisage that it might happen. And the message for the policymakers is: you had better put some money into developing some quantum-resistant algorithms, so that if this happens, we are not caught out. And I am glad to say that the NSA in the U.S. and GCHQ in the UK are busy working on this as we speak.
But it is just an example; you can think of all sorts of things in the future. Unfortunately, a pandemic like COVID-19 was not really one of those unimagined surprises. We had strategic notice; both governments had strategic notice, but probably we did not do enough to prepare for it. So, my fourth lesson in intelligence is: if you devote sufficient effort to acquiring strategic notice and you use it, you do not have to be so surprised when the surprise happens. You will not be able to predict exactly when it is going to happen. You cannot predict when a volcano is going to explode or the San Andreas fault is going to suddenly open up, but you know that it might happen. You can put an estimate on, you know, a one-in-a-hundred-year type event, and then you can take some sensible precautions. So, you put all those together: the first three letters of SEES working from yesterday and today forwards, and this final strategic notice working back from the future.
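The “one-in-a-hundred-year type event” framing can be made concrete: even a small annual probability compounds over a planning horizon. A minimal sketch, assuming a 1% annual probability and independence between years (both are illustrative assumptions, not figures from the conversation):

```python
# If an event has probability p in any given year, and years are assumed
# independent, the chance of at least one occurrence over n years is
# 1 - (1 - p) ** n.  The 1% figure below is illustrative.

annual_probability = 0.01  # a "one-in-a-hundred-year" event

for years in (10, 30, 50):
    at_least_once = 1 - (1 - annual_probability) ** years
    print(f"Chance of at least one occurrence in {years} years: {at_least_once:.0%}")

# Roughly 10% over a decade, 26% over 30 years, 39% over 50 years:
# unlikely in any single year, yet well worth some sensible precautions.
```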
Bob Gourley: All right, well, I like the SEES model a lot, because you capture all these lessons in a very succinct way that is relevant not only at the strategic level for governments but for businesses and for individuals. SEES, I appreciate that.
“And the other thing I would just add, given how humans are behaving, you know, a lot of people are compliant, but a lot of people clearly object to any rulemaking by authorities on how they should conduct their lives. I think that has implications for the cyber community, right? A lot of people are skeptical about expertise and professional advice.”
Bob Gourley: Carmen, I just know that the average business leader or CEO would really appreciate your perspective on something you talk about, which is the streetlight effect.
Carmen Medina: Well, the streetlight effect, that is an old joke, and it is usually told like this: there is a drunk person on their hands and knees under a streetlight at night, looking for something. A policeman finds them and asks, “What are you looking for?” And this person says, “I lost my car keys.” And the policeman asks, “Is this where you lost them?” And the person says, “No, but it is the only place I can see.” So, it is just another reflection on this problem of evidence: sometimes we just take the evidence that is available to us and use only that to make a decision. At a minimum, you need to go through the thought exercise of asking yourself, okay, this is all the information I have, what percentage of reality does it represent? Right? Someone might say it represents 10% of reality, someone else might say 50%, but just asking yourself that question, I think, will give you some perspective on the streetlight effect.
Bob Gourley: Yes. The streetlight effect is just so relevant for business people, because they will look to their internal databases, and if that is all you are doing at this particular time of crisis, you need to be expanding your horizon, right? Another one you write about is the paradox of warning.
Carmen Medina: Oh yes. That is a tragedy, right? So, the paradox of warning is that essentially, if you are a warning analyst, you are hoping to be wrong. In other words, by warning, you hope that the policymaker will take action that will prevent the bad thing from happening, right? But then if you do take the action and the bad thing does not happen, people will say, well, you were wrong, you should not have warned anyway. So that is the paradox of warning. And you are seeing that right now [in May 2020] with this disease, where sadly the best-case models have proven to be wrong, because we already have more than 60,000 deaths and we are easily headed in the U.S. to 100,000, I think, by sometime in June. But there are still people who are going to say it was not so bad, because we have not had 2 million deaths yet. And that is the paradox of warning. Hopefully, we will not get anywhere near 2 million deaths, because we will have taken appropriate actions. But once you take appropriate actions, the person who gave the warning is at risk of being labeled wrong.
Bob Gourley: You know, another thing I see again and again in the cybersecurity world is that exact effect: a CEO will take appropriate action, like standing up a security operations center and funding a great CISO activity, and that group will do vulnerability management and secure the organization. And then the CEO will be biased to think, oh, well, nothing bad has happened, so I am spending too much on security.
Carmen Medina: Exactly. Right. Yeah. It is a funny thing, because if you want a resilient organization, you cannot spend 100% of your time and money on just executing your plan, because by definition then you are not resilient; you do not have any kind of strategic reserve of capability. And yet, it is one of the things this crisis is teaching us: this drive to be just-in-time, absolutely efficient, can have a catastrophic cost when a worst-case scenario happens. And the other thing I would just add, given how humans are behaving: a lot of people are compliant, but a lot of people clearly object to any rulemaking by authorities on how they should conduct their lives. I think that has implications for the cyber community, right? A lot of people are skeptical about expertise and professional advice, and even during the pandemic they have remained skeptical about expertise and professional advice. So that should give cyber professionals pause about the motivations and the thinking of the average person in their business.
Bob Gourley: Right? This does give me pause. And it also raises another issue, going back to your discussion about actions revealing intentions, right? If you look at what the cyber actors are doing right now, there were some in the community who hoped that maybe they would back off a little bit. Well, the criminals are doing what criminals do: they are getting more aggressive, they are doing cyber fraud and regular fraud. I have seen DDoS attacks against organizations that are trying to make vaccines. It is counterintuitive, but they are just attacking and attacking and attacking. So, to me, your point about actions shows you something about intent.
Carmen Medina: It is a really good point, right? Their malevolent destructive motivations have become evident.
OODAcast – Sir David Omand on Leveraging How Spies Think In Our Business and Personal Lives
OODAcast – A Conversation With Expert Practitioner of Analysis Carmen Medina
Sir David’s Book: How Spies Think: Ten Lessons in Intelligence
Carmen’s blog posts:
Thinking in the Time of the Coronavirus Part One
Thinking in the Time of the Coronavirus Part Two
Carmen’s book: Rebels at Work: A Handbook for Leading Change from Within
Other recent OODAcast thematic posts
OODAcast 9/11 Perspectives
Decision-Making Inside the CIA Counterterrorism Center Before, During, and After 9/11
A CIA Officer and Delta Force Operator Share Perspectives on 9/11