We continue our effort to underscore certain patterns and themes found throughout the OODAcast library of close to 100 conversations with leaders and decision-makers on topics such as leadership, empowering a team, clear decision-making while operating in a low-information environment, the qualities and best practices of a true leader, the future of intelligence, the future of cyber threats, the cybersecurity marketplace, innovation, exponential technologies, and strategic action.
In Part I of this conversation, we highlighted previous generations of great technology leaders, with Porter recounting several situations in her early career where she learned from role models like George Heilmeier and Dr. Tether. A core theme of the conversation is the importance of quickly assessing challenges that are not being addressed and then developing the ability to articulate what needs to be done and how to do it quickly. This approach is very consistent with the famous “Heilmeier Catechism,” which ended up producing a wide range of DARPA breakthroughs. Porter and Tether each discussed the framework at length, with Tether sharing a particularly interesting personal anecdote from his early career.
In Part II, Dr. Tether shares the origin stories of DARPA, NASA, and the “Heilmeier Catechism.” Dr. Porter shares her experience at NASA, IARPA, and the DoD. The role of Congress is discussed in a variety of contexts, as is courage as the primary quality of a leader. Finally, Dr. Porter discusses why Marcus Aurelius’ Meditations is her go-to book when the heat is on and she needs a stoic reminder to stay true to herself as a leader.
In Part III, as the final question in both conversations, Bob made a point of asking both Tony and Lisa for their perspectives on the future of cybersecurity. Those conversations follow.
“The problem is that we have made it so people don’t have any way of protecting themselves, because we deliberately made it that way. And they like it that way up until the time when they get violated.”
Bob Gourley: I just want to say, I really appreciate all this. And I wanted to conclude with a hard question. I figure if I can’t ask you an incredibly hard question, who am I going to ask? I would love your thoughts about cybersecurity – any thoughts you have on this topic at all. But something that really concerns me is the fact that we invest so much in advanced technology, not just in government but commercially, only to have it stolen by adversaries that want to leverage our work and the creation of intellectual property for their benefit.
So we have this issue of needing to protect our intellectual property. Additionally, we’re building this world where everything is interconnected, so it makes it easier for the adversaries to use digital means to come right up next to us and threaten our personal privacy – or lock things down with ransomware or steal information for their financial gain. And it is really a concerning world we live in. And I just wanted to ask your thoughts on that dynamic.
Dr. Tether: Well, it is the problem – and the problem is that we created it. We created this world with networking and all, and we deliberately created it so that when a person wants to go get some information, he doesn’t have to know how to make it happen. It happens. And then there are people who actually get to know a little bit more, and so rather than it being a random thing, they can kind of help steer it to a location. And so we have created a network which is open to somebody doing mischief. Now, we can try to come up with all kinds of ways to solve that, but when people come up with ways to protect it, what they are really doing is putting up barriers.
Right. They are closing it down. They are not letting it be the kind of network where you don’t have to know what you’re doing – it’ll go figure it out for itself. And I don’t think we’ve gone so far now that we can’t reverse this. Maybe one way to address it is to figure out: what is it that we really want to protect? And then put a wrapper around that, where you don’t get in unless somebody gives you permission. It makes it a little bit harder. That means you have got to get permission, and that is a process that takes time.
It just can’t happen at the speed of light. So you do that. But then the question is, well, what is not so important that you let people get in at will? My home? Well, I do not want people poking around in my home, looking at cameras and seeing what the hell’s going on. So you can bring yourself to the conclusion that what we really want to do is find a way to have my home be one of those places that you just don’t get into without permission. If I’m not around to give you permission, it will just take longer for you to get in.
It might even make my wife mad because she might forget the code word or something like that. There is all this downside to doing it, but I don’t have any other really good answer – except to go sort of “Back to the Future,” right? Just ask: what is the problem? The problem is that we have made it so people don’t have any way of protecting themselves, because we deliberately made it that way. And they like it that way up until the time when they get violated.
Gourley: Right.
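Tether’s “wrapper” – a boundary around the few things we really want to protect, where nobody gets in without explicit permission – is essentially default-deny access control. A minimal Python sketch of that idea follows; the allow-list and the protected home-camera resource are illustrative assumptions, not anything specified in the conversation.

```python
# A sketch of Tether's "wrapper" idea: a default-deny gate around a protected
# resource, where access requires an explicit, previously granted permission.
# The allow-list and the protected resource below are illustrative assumptions.

ALLOWED = {"alice", "bob"}  # identities explicitly granted permission

class PermissionDenied(Exception):
    pass

def require_permission(func):
    """Wrap a resource so callers without explicit permission are refused."""
    def wrapper(requester, *args, **kwargs):
        if requester not in ALLOWED:  # default deny: no entry without a grant
            raise PermissionDenied(f"{requester} has no permission")
        return func(requester, *args, **kwargs)
    return wrapper

@require_permission
def get_home_camera_feed(requester):
    # Stand-in for the thing we "really want to protect."
    return "camera frames for " + requester

print(get_home_camera_feed("alice"))  # permitted
# get_home_camera_feed("mallory")     # would raise PermissionDenied
```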
“There are ways to do it, but that means we have to slow down a little bit. We can no longer be going a hundred miles an hour… maybe we can only go 60 miles an hour? There’s a consequence. But somebody needs to start thinking about that.”
Tether: So I don’t have an answer, I guess. I really don’t have a good answer. I think we have got to recognize that that is the problem. And I don’t have any solution – other than the difference being that we go back [to when we] were less flexible.
Gourley: I think you’ve hit on a key point here, which is: there’s not going to be a solution. This is going to be with us indefinitely. It is an infinite game. We are going to continuously be under cyber-attack, just like crime is going to be with us indefinitely, and law enforcement will always be an issue in the real world and in cyberspace. We need to understand there’s always going to be a threat.
Tether: Well, take our cars – we’re making our cars so that they are going to drive themselves. By the way, there’s no question in my mind that that can be done, because obviously, once again, we did prove that, right? We had the [DARPA] contest, and people showed they could do that in traffic with other cars. The problem we’ve got is: can somebody get in and take over your car? I don’t know why not. There is really nothing there stopping that from happening.
And so maybe we can figure out a way that when we send out a signal, we put a code on that signal so that when it reflects back to us, we know we were the ones that started it. So that [signal] goes through right away and others don’t. There are ways to do it, but that means we have to slow down a little bit. We can no longer be going a hundred miles an hour with the car so flexible that it takes care of itself. Maybe we can only go 60 miles an hour? There’s a consequence. But somebody needs to start thinking about that.
Gourley: Right.
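Tether’s suggestion of putting “a code on that signal” so the car recognizes its own transmissions is, in essence, message authentication. A minimal Python sketch using an HMAC tag is below; the key handling and message format are illustrative assumptions rather than any real vehicle protocol.

```python
# Tether's "code on that signal" is, in effect, message authentication: tag what
# you send so that only your own signals verify on the way back. Key handling and
# message format here are illustrative assumptions, not a real vehicle protocol.
import hashlib
import hmac
import os

SECRET_KEY = os.urandom(32)  # known only to the vehicle and its owner

def tag_signal(payload: bytes) -> bytes:
    """Append an authentication code so only our own signals verify later."""
    mac = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return payload + mac

def verify_signal(message: bytes) -> bool:
    """Accept the signal only if the attached code matches what we would produce."""
    payload, mac = message[:-32], message[-32:]
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(mac, expected)  # constant-time comparison

signed = tag_signal(b"steer:+2deg")
print(verify_signal(signed))                          # True: our signal goes through
print(verify_signal(b"steer:+45deg" + b"\x00" * 32))  # False: forged command rejected
```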
“…just knowing the problem is not [the only thing. It is also] what are your ideas for solving this problem?”
Tether: And that is the only answer I know: you are going to have to put some restrictions on it – and restrictions mean you are not going to be as fast and flexible as you were before. And this is a DARPA-hard problem, by the way. This really is a DARPA-hard problem: how do I come up with a system which allows only me to get in – or me and the people that I have explicitly allowed to get in – and yet not lose the performance that I love so much today? That’s a DARPA-hard problem. And I must believe there is somebody in DARPA trying to solve that problem. I don’t know who it is, but I must believe there is somebody doing that. It is such an obvious thing that somebody had to come to it, and I’m sure they got the money to go after it – well, if they have any ideas. I mean, just knowing the problem is not [the only thing. It is also] what are your ideas for solving this problem?
“I’ve never seen a secure system nor have you, nor has anyone else.”
Gourley: That’s great. I wanted to ask about security, because you have spent a lot of time in engineering and design, and you’ve worked with a lot of folks in government. I know you have seen insecure systems and you have also seen secure systems, and I would love any context you can provide on your approaches to securing systems.
Porter: So I would say, Bob, I’ve never seen a secure system nor have you, nor has anyone else. That is the fallacy that I think we’re all falling into. Fortunately, I think people are now really waking up to this. So that is the first thing. Way back when I started IARPA (the Intelligence Advanced Research Projects Activity) – you might remember CNCI (the Comprehensive National Cybersecurity Initiative)? The whole national cybersecurity…
Gourley: …Melissa Hathaway…
Porter: …yes. And the billions of dollars that accompanied that. The whole question they were asking was: how do we build secure systems? And I was having arguments with a lot of people, saying: that’s not the question. That’s not it. The question is, how do we operate effectively and resiliently in systems that are inherently not secure? I wasn’t the only one asking – I don’t want to imply that I’m a brilliant person; there were other people asking this – but our voices were kind of suppressed versus those who wanted that easy button you could press that says: now I have a system that is secure. I don’t care how much money it costs me, just tell me what I need to do, and I can press the easy button, and now everything is perfect. Life doesn’t work that way.
“We are seeing NSA come out publicly and say, hey, Zero Trust is really the way we need to think.”
We all know that, but we all seem to want it, to the point where we allow it to drive us to make decisions that ultimately are not good. All of this is to say, what that led to is an approach where, if you can build secure systems, great – now I have an approach where I believe I can trust my system. But once I have trust in my system, I’ve introduced a vulnerability, because I believe it is trusted. And then Edward Snowden comes along and reminds you why that is not the approach. So nowadays I think you’re seeing that the community – at least the networking community, the cyber community – has really learned from this. They’ve said, we have got to embrace this notion of zero trust. I’m a huge advocate of this philosophy. It doesn’t come from cybersecurity; it comes from the intelligence community and how they operate when they operate well.
Which is: hey, I’m not going to trust, because trust is a vulnerability. I cannot simultaneously say that I am implementing zero trust and that I have a trusted network – that means I don’t understand zero trust. I must embrace the reality that I can operate without trust. But that means I must approach things differently, recognizing that I am either already penetrated or I can and will be penetrated. How do I operate in that situation? Well, now that means I must think more in terms of resilience versus prevention – resilience to the fact that ultimately things are going to happen. That doesn’t mean you throw out prevention. It doesn’t mean you do “stupid.” It doesn’t mean you voluntarily work with a person or an entity that you know is going to try to hurt you.
But it does mean that even if I think I can trust you because you’re a good person, I must stop myself and say, well, what does that mean? It doesn’t mean they aren’t vulnerable. It doesn’t mean there isn’t somewhere in their supply chain where they’ve been breached. It doesn’t mean that everybody on their team doesn’t have something going on. So I’ve got to be smarter and employ a zero-trust philosophy.
And by the way, Bob, this is important because it’s not just the network. We are seeing the network community – we are seeing NSA – come out publicly and say, hey, Zero Trust is really the way we need to think. Great. But it’s not just networks. It’s foundries. People love this notion of trusted foundries. Why do you think that putting a perimeter around your foundry makes it secure?
Porter: All it does is prevent you from accessing the cutting edge of the state of the art. And we’ve put ourselves in quite a box in the DoD and the IC by embracing this notion of a trusted foundry, which is not provably more secure and, more importantly, has prevented us from accessing the state of the art. So therefore we’re not secure. Supply chains – same point. First, you are not going to have everything built in the United States. You just can’t. So at some point something must come in, whether it’s the raw material or whatever it is, something is going to come in. And so, instead of saying I’m going to make everything secure by building a wall and not letting, quote, “anything” in, I’m instead going to recognize that there are inherent insecurities.
And I’m going to try to design architectures and approaches that are data-driven and flexible. That allows me to take a risk-based approach, to recognize that I’m always making that trade, and to do it with my eyes open.
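Porter’s argument that “trust is a vulnerability” translates, in zero-trust terms, into evaluating every request on its own evidence rather than on network location. The short Python sketch below illustrates that shift; the request fields and policy rules are illustrative assumptions, not any particular product’s or NIST’s specification.

```python
# Zero trust in miniature: every request is evaluated on its own evidence
# (identity, device posture, sensitivity of the resource), and being on the
# "trusted" internal network grants nothing. Fields and rules are illustrative.
from dataclasses import dataclass

@dataclass
class Request:
    user_token_valid: bool        # freshly verified credential
    device_healthy: bool          # patched, attested device
    mfa_completed: bool           # step-up verification for sensitive assets
    resource_sensitivity: str     # "low" or "high"
    from_corporate_network: bool  # deliberately ignored below

def authorize(req: Request) -> bool:
    """Per-request decision: no standing trust, no network-location shortcut."""
    if not (req.user_token_valid and req.device_healthy):
        return False
    if req.resource_sensitivity == "high":
        return req.mfa_completed  # high-value data requires extra, per-request proof
    return True

print(authorize(Request(True, True, True, "high", from_corporate_network=False)))  # True
print(authorize(Request(True, True, False, "high", from_corporate_network=True)))  # False: location does not help
```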
“…the thing you got to believe and understand, I should say, is you are not ever going to be perfectly secure. Life is not like that.”
Gourley: Well, let me try this on you, then. Let’s say the approach should be: yes, use best practices to mitigate as many vulnerabilities as you can. Raise your defenses, but assume breach…
Porter: <affirmative>
Gourley: …assume that you’ll be surprised. So build in detection, automate that as much as you can, and seek to protect your critical data separately. That way, if an adversary inside your system tries to attack it, you get even more opportunities to detect attacks and detect breaches (the movement of data out).
Porter: I’m not saying, of course, that you tear down the first line of defense, so to speak. You just must recognize that that is not doing very much for any kind of determined adversary. So it doesn’t mean you don’t do it because, of course, anything that you can already implement that allows for some protection is fine. But this notion that somehow now you are protected is what’s gotten us into trouble.
And so, if you look at what NIST has done in putting out a zero-trust architecture document – which says, okay, this is our first step at educating the community on how to think about this – you will see, and they highlight it themselves, that it is not perfect. They acknowledge there are areas that still must be figured out. Nobody who advocates zero trust, including myself, would say that it is the solution that makes you secure.
Porter: That is the whole point. But the thing you got to believe and understand, I should say, is you are not ever going to be perfectly secure. Life is not like that. I mean, people who are not technical should recognize the analogy here. <Laugh> The only time you’re secure against anything bad happening to you is when you are already dead. Well, what the heck is the point of that? So, similarly, if we operate any kind of complex system, of course bad things can happen and probably will. And so the question is resilience. How do you ensure that when the bad stuff happens, you’ve architected a system that allows for resilience? That mitigates whatever the bad actor is doing? That confines what the bad actor is doing to a local area and, to your point, segments what they have access to from most of the rest of the data? Make the data as distributed as possible, encrypt as much as possible – all these tools, but with a mindset of: I don’t think I’m making it perfect; I’m making it resilient. It’s a very different approach.
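Porter frames resilience as confining what a bad actor can reach and noticing quickly when data starts moving out, rather than assuming prevention will hold. The Python sketch below illustrates two of the tools she names – segmentation of critical data and detection of bulk outbound movement – with the segment names, threshold, and alert mechanism as illustrative assumptions.

```python
# Two of the resilience tools Porter names: segment critical data away from most
# hosts, and watch for the "movement of data out" even after a breach has happened.
# Segment names, the egress threshold, and the alert mechanism are illustrative.
from collections import defaultdict

SEGMENTS = {
    "public":   {"press_release.txt"},
    "critical": {"design_docs.enc", "source_escrow.enc"},  # stored encrypted, access-logged
}

EGRESS_THRESHOLD_BYTES = 50_000_000  # per-host daily budget; tune to normal traffic
egress_by_host = defaultdict(int)

def segment_of(filename: str) -> str:
    """Look up which segment a file lives in (default to the most restrictive)."""
    for seg, files in SEGMENTS.items():
        if filename in files:
            return seg
    return "critical"

def can_read(filename: str, host_clearance: str) -> bool:
    """Confine a compromised host: only cleared hosts ever reach the critical segment."""
    return segment_of(filename) == "public" or host_clearance == "critical"

def record_egress(host: str, nbytes: int) -> None:
    """Detect bulk outbound movement rather than assuming the perimeter held."""
    egress_by_host[host] += nbytes
    if egress_by_host[host] > EGRESS_THRESHOLD_BYTES:
        print(f"ALERT: {host} exceeded outbound budget ({egress_by_host[host]} bytes)")

print(can_read("design_docs.enc", host_clearance="public"))  # False: the breach stays local
record_egress("workstation-7", 60_000_000)                   # prints an alert
```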
Part II: DARPA, NASA, IARPA, DoD, Courage, Leadership and Aurelius’ Meditations
Tony Tether On Technology Leadership and Lessons Learned From DARPA
Slides Dr. Tether Reviewed in this OODAcast are at this link: Dr. Tether Presentation
Lisa Porter On Innovation, Technology, Security and Lessons in Leadership
George H. Heilmeier, a former DARPA director (1975-1977), made all who came to DARPA (with a new idea or project request) answer a set of simple-to-understand questions that are still in use today. These simple questions, now called Heilmeier’s Catechism or Heilmeier’s Rules, were not always simple to answer, especially if an idea was not firmly rooted. They are:

1. What are you trying to do? Articulate your objectives using absolutely no jargon.
2. How is it done today, and what are the limits of current practice?
3. What is new in your approach, and why do you think it will be successful?
4. Who cares? If you are successful, what difference will it make?
5. What are the risks?
6. How much will it cost?
7. How long will it take?
8. What are the mid-term and final “exams” to check for success?
Deep Tech, the “Valley of Death” and Innovative Technologies for the Warfighter
It should go without saying that tracking threats is critical to informing your actions. This includes reading our OODA Daily Pulse, which will give you insights into the nature of the threat and risks to business operations.