People, Culture, Organizations, Cybersecurity, and Technology

We continue our effort to underscore patterns and themes found throughout the OODAcast library of over 80 conversations with leaders and decision-makers, on topics such as leadership, empowering a team, clear decision-making in low-information environments, the qualities and best practices of a true leader, the future of intelligence, the future of cyber threats, the cybersecurity marketplace, innovation, exponential technologies, and strategic action.

In December 2020, OODA CEO Matt Devost had a conversation with Masha Sedova, an award-winning people-security expert, speaker, and entrepreneur focused on helping companies transform employees from a risk into a key element of defense. She has been a part of our OODA Network for years, including speaking at our legacy FedCyber event, where she introduced the behavior-based and gamified cybersecurity training and awareness she put in place at Salesforce. She is the co-founder of Elevate Security, delivering an employee-risk management platform that provides visibility into employee risk while motivating employees to make better security decisions. In addition, Masha has been a member of the Board of Directors for the National Cyber Security Alliance and a regular presenter at conferences such as Black Hat, RSA, ISSA, Enigma, OWASP, and SANS.

In May 2021, OODA CTO Bob Gourley had a conversation with Bryson Bort, the Founder of SCYTHE, a start-up building a next-generation attack emulation platform, and GRIMM, a boutique cybersecurity consultancy. He is widely known in the cybersecurity community for helping advance concepts of defense across multiple critical domains. He is the co-founder of the ICS Village, a non-profit advancing awareness of industrial control system security. Bryson is also a Senior Fellow for Cybersecurity and National Security at R Street and the National Security Institute and an Advisor to the Army Cyber Institute.

“What does the employee element of security look like?”

Masha Sedova: A lot of security is people attacking people with a whole bunch of technology in the middle. And when you pull back that far, you begin to realize it’s as much of a human problem as it is the bits and bytes and the tech stacks that we’re looking at. There is an adversary who has intention, motivation, and a goal, and then there is their target on the other end. And you can play with this ecosystem and help secure it when you understand both players. I got to do some of that work looking at and understanding the adversary when I was working in the defense sector. And then in 2012, I had the opportunity of joining Salesforce, where I got to build and run the security engagement team.

And in that capacity, I got to focus on all aspects of insider threat, both intentional and unintentional, and on finding ways to get employees to make more thoughtful security decisions, be proactive about detecting and reporting attacks, adopt secure development practices as engineers, and even secure the ecosystem for Salesforce customers by getting them to adopt security features. And in that work, I became obsessed with the question of what it would look like if our employees wanted to do security, instead of having to. What if security wasn’t just a checkbox or compliance exercise? Because frankly, that’s not effective for anybody. How do we actually move the needle in a way that makes our employees part of the defense of our networks? I started focusing on that question from a totally different lens. It brought me to the fields of behavioral science, behavioral psychology, positive reinforcement, and gamification – trying to understand how and why we make decisions as employees and as human beings, and applying that to security.

Fast forward a little bit: I built a world-class program at Salesforce that got international recognition, got to teach a class at Black Hat, and was speaking to Salesforce customers. And it made me realize that this was a huge unsolved problem in the security space, a blind spot for almost all of us, right? The human element is the soft underbelly of security. When ignored, it is our Trojan horse; but when unpacked and empowered, it is a huge element of strength for organizations, helping them understand how to defend themselves against attackers. And so I started Elevate Security almost four years ago now with the intention of helping companies quantify employee risk.

And then once they’re able to quantify it and understand which employees are making good and bad decisions on their network, we give them a suite of tools to help with those interventions: following up with scorecards or tailored pieces of training, or even leaning into adaptive access based on people’s strengths and weaknesses. So I have been obsessed with the question: what is the employee element of security? What does it look like? How do we strengthen it, understand it, and make it part of the equation, not a forgotten piece of security defense?

“There are a lot of components related to motivation. And the way you tap into motivation is through engagement…”

Matt Devost: So it seems like a deliberate word choice with security engagement, right? Choosing to engage the users, rather than dictate or lecture to them. Can you step us through the genesis of that approach? Because I know it was unique – even though we’d been focused on these issues for nearly 20 years, it was a unique framework to bring to the problem.

Sedova: Yeah. So the secret sauce of my work is the element of motivation. I am going to back up a little bit and give people a framework from behavioral science. For anybody to do anything – a habit or a behavior – three things need to exist. You need to have the ability to do it. You need to have the motivation to do it. And you need a prompt, something to remind you to do it. There is a lot of literature on this equation, and security teams for the longest time have only been focused on ability: I told you phishing was bad, why do you keep clicking on it? We just keep telling people more and more about how bad phishing is, and they keep doing it. We fail to recognize there is a whole pillar that we need to be focused on, which is motivation.
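The three-part framework Sedova describes (it closely matches what behavioral-science literature calls the Fogg Behavior Model, B = MAP) can be sketched in a few lines of code. This is a hypothetical illustration, not anything from Elevate Security's product: the class name, thresholds, and the multiplicative motivation-ability trade-off are all assumptions made for the sake of the example.

```python
# Illustrative sketch of the behavior model: a behavior happens only
# when motivation, ability, and a prompt are present at the same time.
# Names and thresholds here are invented for illustration.

from dataclasses import dataclass

@dataclass
class SecurityBehavior:
    name: str
    motivation: float  # 0.0-1.0: does the employee want to do it?
    ability: float     # 0.0-1.0: can they do it easily?
    prompted: bool     # was there a reminder/trigger?

    def likely_to_happen(self, threshold: float = 0.5) -> bool:
        # No prompt, no behavior, regardless of motivation and ability.
        if not self.prompted:
            return False
        # Motivation and ability trade off: both must be high enough
        # in combination to cross the action threshold.
        return self.motivation * self.ability >= threshold

# Traditional awareness training raises ability but ignores motivation:
trained_only = SecurityBehavior("report_phish", motivation=0.2, ability=0.9, prompted=True)
engaged = SecurityBehavior("report_phish", motivation=0.8, ability=0.9, prompted=True)

print(trained_only.likely_to_happen())  # False
print(engaged.likely_to_happen())       # True
```

The point of the sketch is the one Sedova makes in prose: pushing `ability` ever higher (more training) cannot compensate for near-zero `motivation`, which is the pillar security teams historically ignore.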

Some of our employees don’t care about security. Some of them don’t know that it’s part of their responsibility and their job. They think it’s the security team’s job to do that. And so there’s not an intrinsic desire to do it. They don’t understand how security applies to their work. There are a lot of components related to motivation. And the way you tap into motivation is through engagement, which is why that word came into being, because when we can motivate somebody to action and also give them the ability, the results we start seeing are incredibly notable. And if any of you who are listening reflect on your own lives: when you do something because you want to, because you’re motivated to do so, the output is not just more meaningful but, frankly, of higher quality than when you’re being told to do it.

“…the same human being that swims in two different types of water needs different types of communication and needs different types of motivation and engagement.”

Devost: How did you find that transition from the public sector to the private sector? Were there any interesting findings or barriers that you encountered as you transitioned from one to the other?

Sedova: Yeah, I think about this a lot. It was actually a very hard transition. There were a lot of skills I spent time developing inside the public sector that I didn’t find translated really well to the private sector. And it took me a little bit of time to figure out that even though I had a background and skill set unlike other people’s in my work environment, that was a strength and not a weakness. It is not just the move from the public to the private sector: in addition to a degree in Computer Science, I also have an AA in Liberal Arts. One of the things I’ve realized is that I get to see connections between things that my peers (who had a more traditional approach to security or started in the private sector) didn’t see.

So I’ll give you one example, and that is an understanding of cultures and security cultures. The security culture in the public sector is so different than in the private sector, especially in tech. In the public sector, the reason you do security well is that there is an enemy out there. You have secret information. There are lives at stake if you mess this up. And there is a sense of mission and purpose associated with security.

That is a very different culture from when you move into Salesforce or a fast-growing tech startup, where it is move fast and break things, and your whole developer organization is from India, China, or Russia, and they’re your golden children. So you have a totally different mindset in which you must frame security behaviors and security culture: “everybody does this together because we care about the company.” We care about what we’re building. We care about the well-being of our customers. And for our customers to succeed, they need to trust us.

And for us to gain the trust, we need to treat their data with the utmost amount of respect, privacy, and security. So it’s two very different frameworks that ultimately get to the same outcomes, right? Don’t click on phishing, report suspicious activities, and handle sensitive data well.  But it makes you realize that the same human being that swims in two different types of water needs different types of communication and needs different types of motivation and engagement. And that transition was probably the best teacher that I could have ever asked for this lesson.

“The security culture of “no” creates its own version of an insider threat – not the insider threat that we always talk about…”

Bob Gourley:  I’ve heard you talk before about the state of attack and defense and we would love your comments on that. The cybersecurity field overall.  What is our situation?

Bryson Bort: Okay. This is where I get to rant, right? Yeah. Rant time! Okay. First up, on the solution side: too many nerd solutions to nerd problems with nerd results. I’m tired of it. I’m sorry. There is a reason that so much money has gone into this. And by the way, I’m not saying I’m the smartest guy; there are way smarter people than me who have been a part of this solution. And the problem is that you are serving at the behest of an operational concern. If you work at a company, whatever that company does is the primary purpose of that company – and security ain’t it!

You have got to speak in business terms. You must talk in plain English, in layperson’s terms, and not look down on users. Do not look down on the manager who doesn’t get security. It is your fault, not theirs. Sorry, okay. I’m so frustrated by this stuff. And I’m frustrated on the technical side, where these technical solutions solve part of the problem and everyone’s like, “look at the greatest piece of this.”

Your largest risk area is people. It is the users. The users that use computers are the surface area, not the computers by themselves. If you’re looking at a computer by itself, you’re missing that sitting right next to it is the bigger threat: an employee, any time you tell them “no.” The security culture of “no” creates its own version of an insider threat – not the insider threat that we always talk about, where somebody is coming in and stealing company data.

But it’s the person who says, “I really need to do this to get my job done, because that’s how we make money and stay in business, and the security organization is not making it easy for me. So I’m going to figure out my own way.” The second greatest hackers in the world are kids, right? It’s the same thing. You put in a parental control. You put in a restriction. I don’t care if it is technical or non-technical, they will social engineer you: they will avoid, they will lie. They will do whatever they need to do to get the thing they want done. So when you are that culture of no, and when we are only looking at part of the problem, we’re not understanding the problem realistically at all.

“…in security, and almost every kind of culture, we lean too heavily on negative reinforcement…”

Devost: I think back to your keynote at our FedCyber event – that was probably five or six years ago now, it’s been a long time – and it was one of the most popular keynotes. The session where you talked about changing the game and changing the incentives around security and insider threats was incredibly well-received. So if you had to reverse that: if you were going back into the public sector, what lessons would you take with you from the private sector that you think would be most important or most valuable to replicate?

Sedova: That is a great question. There’s one thing I have learned that is true across all sectors. What I did in the tech sector was lean heavily into gamification – badges and rewards and leaderboards – which would not work in the public sector. The public sector appreciates challenge coins, which again don’t translate directly, but you could do something like that. But there is one rule, one framework, that I found works across sectors. And that is the difference between positive reinforcement and negative reinforcement. Negative reinforcement is when we do something to avoid a bad outcome. I put on sunscreen because I don’t want to get sunburnt. I don’t particularly enjoy putting on sunscreen, but I’m trying really hard not to get to that painful outcome. Positive reinforcement is the exact opposite, in which I know something good will happen to me because of this action.

So we train our pets like this. We train our kids with potty training. When you go to the bathroom outside, if you are a dog, then you get a treat. So I’m going to adjust my behaviors to get that treat, right? In security, and in almost every kind of culture, we lean too heavily on negative reinforcement: change your password or you’ll get locked out. Take your training on time or we will cut down or shut down your access. Don’t click on phishing links or you’re going to get escalated to HR. There are just a lot of opportunities to fail, right? At best, the good news is that you never hear from security. And usually, when you hear from them, it’s an “Oh God, what do they want? I really would like to stay off of their radar.”

“Security doesn’t have to be the bad news bear all the time.”

We can take all of those “asks” that we have of our employees and flip them around to positive frameworks. Gamification is one way: say, “Hey, if you report phishing three times in a row, you are going to be top of the leaderboard.” But it doesn’t have to look like gamification. It can look like kudos and an acknowledgment from the security team, CCing your manager and saying, “You have an employee who’s doing a great job at keeping our company secure. Thank you.” If you can do that systemically and measure it – which is a whole other conversation we should get into – you can actually tie it into bonuses and rewards and recognition: things that people can take home and get financial benefits from. You can also use it to elevate values in your organization.

One of the best examples I’ve seen is providing a slide to a CEO for an all-hands, saying: “These are people who have gone above and beyond and detected tricky attacks in our organization. They’ve detected malware instances that, had they been deployed, would have caused ransomware in our organization, but they took the right actions instead” – where you highlight what the correct behavior is – “and we wanted to give them a shout-out, because while they are in sales and marketing and engineering, they have done their job in protecting this company. So thank you.”

Now I have access and visibility to things because of security, because of good actions, which makes me want to keep doing it. It’s a huge way of flipping this mindset around – and we can do it with almost anything we ask of our employees. So if you have your annual security training coming up, instead of punishing late behavior, donate a dollar to a charity for every employee who completes it before the due date. I’ve seen this work incredibly well, getting 90% of the company in before the compliance due date without any nagging.

So this is something we can learn from so many other industries and, frankly, from our own parenting skills, and apply to security. Security doesn’t have to be the bad news bear all the time.

“I’m looking forward to what comes out of the US government taking lead on this. But I do want to note…do not look past the amount of work that private industry has been doing independently…”

Gourley:   Is there anything else on the state of cybersecurity today?  Anything else that should be top of mind?

Bort:  Do you want to talk ransomware?

Gourley: Yes. What is the state of ransomware? Is that another one where there will be an end to this challenge – a patch to our antivirus that will detect all this, knock it out of the enterprise, and we will have won?

Bort: Ransomware. So Chris Krebs and I, at RSA in February 2020, got up on stage as part of ICS Village and the Cybersecurity and Infrastructure Security Agency (CISA). You can tell I have worked with them because I can just rattle that name off. We called out the coming scourge of ransomware, and, oh boy, did it happen in 2020.

And it is continuing, of course, in the news in 2021. Why? Because it’s lucrative. They are making money hand over fist. DarkSide (I can’t tell if it was the developers or the operators, which, by the way, are two different things) has been attributed $90 million in cash in hand. They ransomwared dozens of companies. The Colonial Pipeline was the fourth attack in six months on US energy infrastructure. It’s just that Colonial Pipeline got the attention, because it turns out that if you affect consumers at the gas pump in a hydrocarbon economy, you’re going to get attention. The other ones – eh, hospitals. We will read about those in news articles, but nobody’s really cared and done anything about it. The gas one, the Colonial Pipeline? Now it has gotten serious attention. So I’m looking forward to what comes out of the US government taking the lead on this. But I do want to note, as I’ve said several times in the past: do not look past the amount of work that private industry has been doing independently, without government, to fight off a lot of what’s happening with these kinds of gangs.

“Even if a malware detonation was prevented, you still have a log that an employee had the intention to do so.”

Devost: One of my favorite quotes, and I use it in almost every presentation that I give, is the famous one from Jack Welch, the CEO of GE: “You get the behavior that you measure and you reward.” You’ve talked a lot about rewards and positive reinforcement and flipping the script. But if I’m in an organization, how do I measure that? How should I be thinking about progress against those objectives?

Sedova: That is a great question. And frankly, this is one of the reasons I think human risk – employee risk – is the Achilles heel of security: we have failed to measure it so far. We have accepted compliance-driven measurement, which only asks how many people have completed a standard training. That’s not actually a measure of risk in our organization. It is a measure of what people know, or at least can guess on a quiz question – not what employees do. So what you need to be thinking about is measuring the behaviors that are associated with risks in your organization, and the employees that exhibit good or bad behaviors. Let me unpack that, because there are a lot of words there that might be new to people listening.

So let’s say the risk that I care about is ransomware, which seems to be top of mind these days.

So there are a couple of behaviors associated with that risk that employees would need to exhibit for it to happen. An employee would need to click on a phishing email. They would need to download malware. They would need to execute it. There are obviously permutations and combinations of this. And if they were suspicious at all, they would need to not report it. There is potentially one more: they might have access to network infrastructure that they do not need – over-access – creating a wider blast radius. So the behaviors that I’m trying to measure are: who in my organization is clicking on links?

I really recommend looking at simulated phishing, which is helpful, but also at real-world phishing – a dataset from a Proofpoint-like tool. Who’s reporting? Even if they click and are compromised, who then detects and reports it? Who tries to download and execute malware on your machines? There is an incredible source for this in your endpoint logs. Even if a malware detonation was prevented, you still have a log that an employee had the intention to do so.

“…we can also course-correct when people are off based exactly on what they’re doing, not on what they know…that is really where the gap is in our capabilities right now…”

These employees are probably not malicious, but they were unwitting enough to try it that technology had to step in. And from that, you are able to start creating, quite frankly, a user reputation score: how good is an employee – call him Steven – at these types of behaviors that ultimately introduce risk? And these data sets already exist in even relatively mature enterprises.

You are already investing a lot in your tech stack. I was just talking to someone the other day who said that most Fortune 1000 companies have between seventy-five and a hundred different security tools already in place. Many of those have logs that intersect with human decisions: who is trying to create change controls in your organization without prior approval, or who has not cleared their backlog of known vulnerabilities.

Who is slow in finding and fixing bugs? That is your data set. You actually have a lot of these logs around. Who is being more proactive and who is not? What that gives you insight into is fundamentally a risk map, a heat map of your organization: where are you likely to stumble upon employees who are going to introduce risks to your environment in a way that you would have to clean up after the fact? And when you’re able to do that, you’re also able to tie this back to what we just talked about and give employees very individual course-correcting feedback. You can say, “Hey, Steven, you’re doing a really good job at reporting things when you detect them, but you are clicking on phishing links five times more than anyone in your department.

It’s time for us to talk about how we can level those skills up, right? Do you want to sign up for more frequent phish test practice? Would you like to take training? Can we talk about what tools you might need to ensure that you are detecting external emails more effectively?” So there are ways we can reinforce when people are doing great – with kudos in the all-hands – when we can measure it. And then we can also course-correct when people are off, based exactly on what they’re doing, not on what they know. Because the difference between Steven having a perfect score on a quiz for annual security training and Steven ignoring red alert banners in his browser and inbox and proceeding anyway is a world of difference. And that is really where the gap is in our capabilities right now: thinking about knowing versus doing, and measuring those two things.
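The "user reputation score" Sedova describes can be sketched as a simple aggregation over existing log sources. This is a hypothetical illustration only – the event names, weights, and scoring scheme below are invented for the example and are not Elevate Security's actual model.

```python
# Hypothetical sketch: aggregate per-employee security events from
# existing logs (phishing simulations, email gateway, endpoint
# protection, change control) into a simple weighted score.
# Event names and weights are invented for illustration.

from collections import defaultdict

# Negative weights = risky behavior; positive = protective behavior.
EVENT_WEIGHTS = {
    "clicked_phish_link": -3.0,
    "attempted_malware_execution": -5.0,  # blocked by endpoint tooling
    "unapproved_change_control": -2.0,
    "overdue_vulnerability_fix": -1.0,
    "reported_suspicious_email": +2.0,
    "completed_training_early": +1.0,
}

def reputation_scores(events):
    """events: iterable of (employee, event_type) tuples."""
    scores = defaultdict(float)
    for employee, event_type in events:
        scores[employee] += EVENT_WEIGHTS.get(event_type, 0.0)
    return dict(scores)

log = [
    ("steven", "clicked_phish_link"),
    ("steven", "clicked_phish_link"),
    ("steven", "reported_suspicious_email"),  # he does report well
    ("amara", "reported_suspicious_email"),
    ("amara", "completed_training_early"),
]

scores = reputation_scores(log)
# Steven: -3 -3 +2 = -4.0 -> candidate for course-correcting feedback
# Amara:  +2 +1 = +3.0 -> candidate for kudos in the all-hands
print(sorted(scores.items(), key=lambda kv: kv[1]))
```

Even a toy aggregation like this surfaces the distinction Sedova emphasizes: Steven both clicks and reports, so the feedback can acknowledge the good behavior while course-correcting the risky one, based on what he does rather than what he knows.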

“Backup. Backup. Backup. Backup…up there as one of the worst feelings in the world is going to a backup in an incident  – and realizing you didn’t have the backup you thought you had, right? Test your backups.”

Gourley: Would you give me a recommendation for, let’s say, a company that has a small IT department, maybe even a CIO? And I hope this company is not just going to wait for the government to do something. What can a mid-size business do to mitigate the risk of ransomware to their organization?

Bort: So I’m going to give two answers. The first is overtly self-serving, and the second is going to be the technical implementation for backup. First: SCYTHE is a modular post-access malware platform. We can safely emulate any kind of malware, including ransomware. And we have received a lot of interest in that this year. It’s funny because, from SCYTHE’s perspective, that was just a small feature that we happened to have – oh yeah, we also do that. Except now, since it’s become such a part of the consciousness, that’s actually the primary driver for folks coming to us, because nobody else can safely do that at scale with the ability to adapt it to each campaign. For the DarkSide one, within two business days, we had an emulation plan shared with the community.

We are going to launch an offering shortly where companies will be able to download those emulation plans off the website themselves and immediately run them locally for a very reasonable price. I’m not kidding – we’re not going to be charging a lot for that. So you can actually see what will happen with your defensive stack against ransomware. All right, taking off the sales hat. Backup. Backup. Backup. Backup. Breaches are going to happen. It is going to happen. And assuming you have not tested and improved with us, what can you do? You can make sure that you can recover. Have a great backup strategy that has two prongs. One: a local, hot backup, so that you can immediately recover – recognizing, of course, that those are going to be a target.

Ransomware operators know that backups are the biggest thing that will keep you from paying them. They are going to try to get to those. So the thing that makes it convenient for you to quickly and easily back up and restore – they are going to be gunning for it. The second prong, which is of course harder, is the same thing we have seen in business continuity planning for a long time: periodic offsite backups. You are not going to be doing those every day; whether it’s weekly or monthly, it’s something that gives you a chance to come back. Compare that against the cost – I mean, the latest ransomware payment that a company just paid was $40 million. Colonial was a…

Gourley:  …CNA Insurance.  $40 million…

Bort: Yeah. CNA Insurance was $40 million. Colonial was $4.4 million. Compare those costs and then make that business decision. And the final piece of advice, back on backups, is: test your backups. There is no worse feeling in the world – well, there might be a few, but up there among the worst feelings in the world – than going to a backup in an incident and realizing you didn’t have the backup you thought you had, right? Test your backups.
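Bort's "test your backups" advice can be made concrete with a small verification routine: after creating a backup archive, restore it into a scratch directory and compare file checksums against the originals. This is a minimal sketch using only the Python standard library; the paths and function names are illustrative, not a recommendation of any specific backup product.

```python
# Minimal sketch of backup verification: create a tar.gz backup,
# restore it to a temporary directory, and confirm every file's
# checksum matches the original. Paths and names are illustrative.

import hashlib
import tarfile
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(src: Path, archive: Path) -> None:
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src, arcname=src.name)

def verify_restore(src: Path, archive: Path) -> bool:
    """Restore into a temp dir and compare every file's checksum."""
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(scratch)
        restored = Path(scratch) / src.name
        for original in src.rglob("*"):
            if original.is_file():
                copy = restored / original.relative_to(src)
                if not copy.is_file() or sha256(copy) != sha256(original):
                    # The backup you thought you had isn't there.
                    return False
    return True
```

Run something like `verify_restore` on a schedule, not just once: a backup job that silently starts failing is exactly the scenario Bort warns about, and the only time you want to discover it is before the incident.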

The Original OODAcasts:

Masha Sedova, Co-Founder of Elevate Security on Human Risk Management

Scythe CEO Bryson Bort on Enhancing Security with Realistic Adversary Emulation

Related OODAcast Thematic Posts

Cybersecurity Investment, Due Diligence, Innovation and Growth (Andy Lustig and JC Raby)

Leadership, Management, Decisionmaking and Intelligence (Paul Becker)

Nate Fick on Company Culture, the Cybersecurity Community, Endgame/Elastic and Emerging Cyber Threats (Part 2 of 2)

Nate Fick on His Early Career, Writing ‘One Bullet Away’, The Stoics and Dynamic Leadership (Part 1 of 2)


Tagged: OODAcast

About the Author

Daniel Pereira

Daniel Pereira is research director at OODA. He is a foresight strategist, creative technologist, and an information communication technology (ICT) and digital media researcher with 20+ years of experience directing public/private partnerships and strategic innovation initiatives.