
EP 160: How to Find Your Top 5 Cyber Risks


About this episode

June 18, 2024

You can find your top 5 cyber risks using a “top down” approach with the NIST Cybersecurity Framework. Along the way, you can shift your organization towards better practice of reasonable cybersecurity. Know how? Let’s find out with your hosts Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.

You can see our “zero through ten” scale scorecard here — https://b.link/scorekey

You can watch our interview prep video here — https://b.link/interview


Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities, and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at cr-map.com and klgates.com.

Jake Bernstein: So Kip, what are we going to talk about today in episode 160 of the Cyber Risk Management Podcast?

Kip Boyle: We're going to do something that I can't believe we've never done before. I don't think we've done this before. We're going to tell people how to discover their top five cyber risks.

Jake Bernstein: I mean, who can remember what we've done over 160 episodes? I know that I have had that experience when I've been looking for something and, oh wait, we did an episode on this. So, I guess that's what happens when you've been doing it as long as we have. So this is a super practical topic for a podcast episode. Whether we've done it or not, let's go ahead and move into it. Really, one could say that discovering someone's top five cyber risks is kind of how we met.

Kip Boyle: Yeah.

Jake Bernstein: So is this the same way that you've done it before? I mean, we've had many shared clients we've done this with, but I'm curious, has your thinking changed? And let's dive into this.

Kip Boyle: Well, so I think that's one of the reasons why I am confused about whether I've done this as a podcast episode with you, because we have talked about this a lot together, and I just don't think we've done it in front of the microphones together. You're right, we've done this dozens of times for paying customers. Some of our clients are shared, some are not. And I've actually been teaching people how to do this for years. I've got a course on Udemy. I talk about it in my book. I've got a course on LinkedIn Learning. And I've been teaching how to do this onsite, in person, live at SecureWorld for the last 12 months. But I just don't think we've-

Jake Bernstein: So it's top of mind.

Kip Boyle: Yeah. I just don't think we've shared it here.

Jake Bernstein: I don't think we have either. So, where do we start? Because discovering our top five cyber risks is obviously... Maybe we'll start by asking why do we care about our top five cyber risks?

Kip Boyle: Right. Yeah. Well, so-

Jake Bernstein: And then where do we start when we want to do that?

Kip Boyle: Yeah. So I want to take some inspiration today in this episode from the SecureWorld course that I've been teaching lately. That's what's most front of mind for me. It's called Implementing the NIST Cybersecurity Framework. We focus on version two because that's out now, but we've been doing this for a long time with version one, and then we did it with version 1.1 of the Framework.

And what I think is relevant here, to your question, why would you want to find your top five cyber risks, is cyber has become a material business risk. And that means that when you lose control of your systems due to a ransomware attack or something like that, you can't sell, you can't fulfill orders that you've taken. You don't even know who's ordered from you. You can't collect the money that people owe you, you can't pay the money that you owe other people. I mean, this can take you out of operation in a flash.

And so it turns out that cyber risk is hiding everywhere in an organization. It's not just something that happens in your tech stack, and it's not something that just happens way, way, way down at the ones and zeros level. So we want to look everywhere for it. And that's one of the reasons why I think the Cybersecurity Framework is a great tool to use because it advocates a top-down approach rather than a bottom-up approach.

Jake Bernstein: And I think bottom-up approaches, so-called grassroots efforts, they have their place in many different contexts, but I think in cybersecurity it's very difficult to successfully have a reasonable cybersecurity program, and I chose that word on purpose, with a bottom-up organic approach. And one of the reasons is evidence. Evidence of what you're doing. Being able to prove that you've done it. Because, Kip, as we know, it's not possible to be perfect at cybersecurity. In fact, I know someone has at some point mathematically proven almost like a natural law that thou shalt not be able to create bugless code. It's not possible. There will be bugs, which means there will be cyber risks, which means at a certain level maybe someone could be 99.9999% secure, but it's literally impossible to be 100% secure.

Kip Boyle: Right.

Jake Bernstein: So what does that mean? It means that when something goes wrong, we have to be able to prove that we've done what was reasonable under the circumstances, which for now remains the legal standard, it remains the only way that we really can express what we want organizations to do.

So given that, top five cyber risks really does seem like an important place to start. But it can be overwhelming. So what does it mean to conduct a top-down analysis of our top five cyber risks? And before we get into that, I actually do want to ask, because I'm not sure we've talked about this very much. What are some counter examples of bottom-up approach and what do you think that means?

Kip Boyle: Yeah. Well, bottom-up is sort of... I like the term you used a moment ago, grassroots. A bottom-up approach is kind of where those of us who have worked in cybersecurity as a profession for a while, that's kind of what we're used to. We're used to hardening systems. We're used to focusing on secure code development. We do network penetration testing, vulnerability assessment. All this stuff is happening way, way, way down at the bottom of all of our systems. This is the ones and zeros layer.

Jake Bernstein: Kind of the keyboard warrior level, right?

Kip Boyle: Yeah, that's right. And guess what? That is where a lot of the bad stuff happens, for sure. And so, it's understandable why we would be so focused there. And the world is full of frameworks and checklists and things that reinforce this bottom-up approach, like the Payment Card Industry Data Security Standard, PCI DSS, a very bottom-up approach. The Center for Internet Security Critical Security Controls, the top 18, used to be the top 20, used to be the SANS top 20. These are both examples, and I could go on, of a bottom-up approach. They're very control focused. We want controls at the technical level. So yeah, that's kind of where we've been. And there's absolutely still a place for that.

But let's talk about what a top-down approach is. Let's get super clear about that. Because I have found in teaching this that a lot of cybersecurity professionals are disoriented by this idea because it's so very different. But I think it's absolutely the way to go if you want to tackle cybersecurity as a material business risk. And, maybe more importantly, if you're having trouble getting buy-in from the senior decision makers at your organization, that trouble could be rooted in the fact that you talk ones and zeros all the time and that you're taking a bottom-up approach. If you can take a top-down approach, I think the opportunity here is not only to find your top five cyber risks, no matter where they are, but also to be able to increase the buy-in. When I hear people say, "Oh, I just don't have the budget I need," that is a perfect example of a lack of buy-in. So in a top-down approach, we defer dealing with controls until we know which areas need our attention the most.

Jake Bernstein: Yep.

Kip Boyle: All right?

Jake Bernstein: And I just want to point something out too. As you were talking, I was like, hmm, I wonder how many controls there are just in NIST SP 800-53 Revision Five alone.

Kip Boyle: Hundreds. Thousands.

Jake Bernstein: And I haven't counted yet, but it is a 400-and... what is it? Hold on, I just saw it. A 483-page PDF. It's almost a six-megabyte PDF. And if you've looked at NIST Special Publications, these aren't flashy, full-color productions. This is utilitarian raw text. So to have a six-megabyte PDF, almost 500 pages long, even just trying... Well, first of all, by its own terms, this document never suggests that you should be implementing the document. That's why it kind of drives me crazy, and I think it drives you crazy as well, when we see companies say, "Well, we're SP 800-53 compliant." It's like that-

Kip Boyle: Doesn't mean anything.

Jake Bernstein: That has no meaning. It's nonsense.

Kip Boyle: Right. It's so meaningless that it tells me you don't know what you're talking about.

Jake Bernstein: Well, that's true. I suppose it's not meaningless. It does signal that you don't know what you're talking about.

Kip Boyle: Yeah, that's the meaning.

Jake Bernstein: That is the meaning. But I think the point there is that there's a difference, a big difference, between cyber risk and controls. And I'm going to derail slightly. It's not really-

Kip Boyle: Because you never do it.

Jake Bernstein: Because I never do it. It's drifting the point, but in a conversation I was having yesterday, I was walking one of my younger colleagues through how to read and review a SOC 2 Type II report for a due diligence process.

Kip Boyle: Oh, good.

Jake Bernstein: I was trying to explain to him why just having a SOC 2 Type II does not indicate that one is secure. And what I eventually... Well, I eventually asked ChatGPT to explain it, and of course it did so perfectly, after I prompted it correctly. But the reality is that SOC 2 Type IIs don't measure your cyber risk. They don't pass judgment on how, quote unquote, "secure" an entity is. What they do is they validate controls. And this is a perfect illustration of the difference between measuring and evaluating cyber risk and measuring and evaluating controls.

Kip Boyle: Yeah.

Jake Bernstein: And so there's a huge problem in the industry where someone will trot out their impressive-looking SOC 2 Type II report or attestation. And just to be clear, I'm not saying these things have no value. I'm not saying that at all. What I'm saying is that you have to understand the limitations and the purpose of a SOC 2 Type II, and it really is the difference between understanding your cyber risks and understanding the purpose of controls.

So with that, I will put us back on track. I think you see that it's not really going off track because it's such a common misunderstanding about the way that we talk about cybersecurity program maturity, cyber risk. And since we're talking about bottom-up, top-down, I mean, a SOC 2 Type II is evaluating your bottom-up approach. That's it.

Kip Boyle: Right, right. It's saying, "Do you have controls in place and are they reliable?" But what a SOC 2 Type II doesn't really do is say, "Are these the right controls?"

Jake Bernstein: Exactly.

Kip Boyle: And it doesn't tell you whether the controls that are in scope for this report are the ones that are most material to the situation you're facing. So there are a lot of issues with it.

Jake Bernstein: There are a lot of issues. And the reason for that is that it's the auditee, the one getting the attestation, that defines the controls that are in scope. So again, all it's doing is this: you, the company being audited or reviewed, say, "We have these controls." Then an auditor, typically a CPA firm, will come in and say either yes, you have these controls, and there's maybe evidence of them, or no, you don't have these controls. But they're not saying, they're not even asking, are these the relevant controls? Are these the best controls? They're not even asking what your cyber risks are. So with that, Kip, what if we really need to understand our top five cyber risks? Where do we start?

Kip Boyle: Yeah, so let's get into that. All right, so the way that we at Cyber Risk Opportunities, the way that I advocate doing this top-down approach is by conducting what we call structured interviews with... again, another term that we use is internal influencers.

So let me unpack what I just said a little bit because these are some overloaded terms. First of all, what is an internal influencer? Well, that's not somebody with an Instagram account. These are people in your organization who are considered to be highly influential, who deeply understand your organization's purpose. And these are people that others go to when they have questions. So when a tough question comes up or a tough problem emerges and people don't know what the answer is, these are the people they go to.

These folks are typically found as middle managers. So they might have a manager title, a senior manager title. They might be a director. That's where most of these folks are. We also find them as senior-level individual contributors. So these are people who don't supervise other folks, or maybe they're a first-line supervisor, but maybe that's only happened because the organization wanted to pay the person more and that was the only way they could do that. But these people are just deeply, deeply expert in what your organization does.

Jake Bernstein: This is... What's the guy's name from the Phoenix Project? There's the character that... We've talked about this before. I'll have to figure that out. But there's a specific character who's famous in the Phoenix Project, which is a book about DevOps, essentially.

Kip Boyle: Yeah. The Phoenix Project is a great book. Highly recommended. And yes-

Jake Bernstein: But the guy whose name I can't think of at the moment-

Kip Boyle: I can't think of his name either. But that whole book is about the theory of constraints. And there is a person there that deeply, deeply understands how that organization does what it does. This is the person that if you went to and said, "How do we make money?" they could tell you. And they could either give you a thumbnail sketch of how you make money, or they could give you a five-hour dissertation on how you make money. And anything in between.

Jake Bernstein: Clearly, an internal expert.

Kip Boyle: Yeah, an internal expert. Now, are they experts in cyber risk? No. But they are experts in the organization and they understand deeply how things are done. Now, Jake, pop quiz. What's a single word that describes how we do things here?

Jake Bernstein: A single word that describes how we do things. So I have to scroll down into the script. Do I have an answer here?

Kip Boyle: Nope. This is a pop quiz. It's culture.

Jake Bernstein: Pop quiz. Oh, culture. Cyber culture.

Kip Boyle: Just culture, right? Culture is the shorthand word for how we do things here. That's really what culture is. It's just like our unique way of being. Our unique way of getting things done. And these folks are just deeply embedded in the culture of the organization. And that's why we want to talk to them, because they really-

Jake Bernstein: It was Brent. Brent is the name of the engineer.

Kip Boyle: Brent. Yes. Brent in the Phoenix Project. So think about Brent. We want to find anywhere from 15 to 20 Brents. Now, we're going to do two things when we interview Brent. First thing we're going to do is we're going to get some information that's going to help us know where the top five cyber risks are in this organization. But we're going to do something else at the same time, which is we're actually going to teach Brent what the definition of reasonable cybersecurity is. That way, when Brent leaves the encounter that he has with us, he's going to change the way he views decision making at his organization. He's going to add a filter, because he has many filters already, and he's going to add another filter about cyber risk. And because this person is deeply influential, people seek out their advice. This person is going to spread the new filter of what reasonable cybersecurity is like a virus. All right?

Jake Bernstein: It's very clever.

Kip Boyle: So we're actually going to release a virus into the organization-

Jake Bernstein: Releasing a virus. Got it. Got it. Okay. And why would we target the... I mean, we know who we're targeting, but let's just make it clear. We're not going to target junior level employees because they don't know enough, right?

Kip Boyle: Correct. They lack perspective.

Jake Bernstein: And curiously enough, the same is true oftentimes of the most senior level executives.

Kip Boyle: Right.

Jake Bernstein: Is they don't... Sorry, I wouldn't say they lack perspective, but they lack the right perspective.

Kip Boyle: Yeah. It's a different type of perspective.

Jake Bernstein: It's a different type of perspective. Junior levels just lack perspective in general, right?

Kip Boyle: Mm-hmm.

Jake Bernstein: That's not fair either. They have their own perspective, but it is not the one that we're seeking here.

Kip Boyle: Exactly.

Jake Bernstein: And the same is true of senior executives.

Kip Boyle: Right. And I've also found that senior executives tend to be cheerleaders. They tend to talk about how great the organization is. They tend to want to project only the best perspective of the organization.

Jake Bernstein: And I would say this is more true, the bigger the organization becomes, right?

Kip Boyle: Yes.

Jake Bernstein: A very small org, you might have a CEO who might as well be the internal expert on some things. Certainly for a company below 10 people, that's almost always the case. The CEO is the expert on everything.

Kip Boyle: Yes.

Jake Bernstein: But we're not talking about that. We're talking about hundreds or thousands of people. By that point, the high-level executives won't have the same perspective.

Kip Boyle: No, they don't understand how things get done at the level of detail that we need to engage with that organization. So that's why we're focused on that.

Jake Bernstein: This all makes sense.

Kip Boyle: Yep. Okay, good.

Jake Bernstein: But what are you going to ask them? Because you can't just say, "Do you have good cybersecurity?"

Kip Boyle: Correct. Right. You can't ask them that because they lack a vocabulary to explain that to you. So we're not going to do that. But what we are going to do-

Jake Bernstein: And some people, if you ask... You also have to be careful not to ask people, "Are you doing your job well enough or should you get fired?" in the subtext. So that's also a tricky thing to deal with.

Kip Boyle: That's right. You don't want to approach this like an audit. You absolutely don't want to do that. You don't want people to get the sense, when you're talking to them, that you're trying to catch them doing something wrong, because they will retreat into their shell and they will not tell you what's really happening. So you've got to carefully prepare them for this interview, and you have to talk very carefully about how you're doing an assessment. We find that that word works really well. Because it's not as threatening. It doesn't have the connotation that an audit has.

We also make sure that, before we interview people, some messaging has come to those folks through their line management. In other words, they've been given permission by their supervisor and by their CEO to speak with us, and to speak with us in a candid way, so that we can find opportunities for improvement. And specifically, we talk about the need to balance reasonable cybersecurity with productivity. And that's a big concern for people. They respect security, but not when it prevents them from doing their job. So yeah, context is important. Now, the questions-

Jake Bernstein: And that... Sorry, I feel the need to interject that what you just said about cybersecurity preventing people from doing their job is the original motivation behind shadow IT, right?

Kip Boyle: Oh yeah.

Jake Bernstein: People want to do their jobs, generally, and one of the big risks of being overly "secure," quote unquote, is that the system becomes impossible to actually use. Therefore, people still need to get their jobs done. And as Dr. Ian Malcolm in the famous-

Kip Boyle: Jurassic Park.

Jake Bernstein: ... Jurassic Park said, "Life will find a way."

Kip Boyle: That's right. Life will find a way.

Jake Bernstein: Your employees will find a way to get their work done. And if that means bypassing your security, unfortunately they will do it.

Kip Boyle: And that's an awful situation because then what you end up with as the security manager is a hellacious case of a false sense of security. You'll be thinking everything's great because you'll have no idea that everyone's using Gmail to get their work done. And that's awful. So we don't want to do that, right? And if there is any case of that happening, we want to detect that so that we can correct it.

Jake Bernstein: And just to be clear, there's nothing wrong with Gmail. We're just assuming that Gmail is a proxy for non-approved corporate email accounts.

Kip Boyle: Shadow IT. Shadow IT. And then these days we got to worry about shadow AI.

Jake Bernstein: Shadow AI as well. Yes.

Kip Boyle: Yeah, that's going on too.

Jake Bernstein: Which is in some ways even harder to regulate because shadow AI... Well, let's not get that distracted.

Kip Boyle: Well, we already did a two-part episode on that.

Jake Bernstein: We did. We did.

Kip Boyle: So yeah, we don't need to unpack that again. Anybody who hasn't listened to those episodes, it was just a few back. Just go back into your podcast listener and find them. Okay. So now, what do we ask these internal influencers? Well, we're going to ask them questions that are derived from the NIST Cybersecurity Framework, and we're going to find out how well the organization governs itself with respect to cyber risk, how well it identifies, protects, detects, responds to, and recovers from cybersecurity incidents. And each question in the questionnaire that we're going to use is going to start with the phrase, "How well does your organization blank?" And then people are going to give their response to us. The way we do this, they don't narrate a response, they give us a response numerically. They give us a number between zero and 10.
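For readers following along, here is a minimal sketch, in Python, of the "How well does your organization blank?" pattern laid out across the six Framework functions. These example questions are illustrative assumptions, not the actual Cyber Risk Opportunities questionnaire; the scoring scale is explained just below.

```python
# Illustrative only: hypothetical examples of the "How well does your
# organization ___?" question pattern, one per NIST CSF 2.0 function.
# Each answer is a number from 0 through 10 (see the score key below).
QUESTIONS = {
    "Govern":   "How well does your organization set and enforce its cyber risk policies?",
    "Identify": "How well does your organization maintain an inventory of its critical assets?",
    "Protect":  "How well does your organization control who can access its systems and data?",
    "Detect":   "How well does your organization notice suspicious activity on its networks?",
    "Respond":  "How well does your organization execute its incident response plan?",
    "Recover":  "How well does your organization restore operations after an incident?",
}
```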

And if this was a video podcast, at this point, I would put up a score key so that you could see what I'm about to describe to you. So what I'm going to do is describe it, but I'll also put a link in the show notes where you can actually pull it up off our website in case you're really interested.

But here's how the scoring system works. The first thing you have to understand is that most people think that you can never have too much money and that your significant other can never be too good looking, and your car can never be too fast. In life, there's almost no such thing as too much. But we know, because we just talked about it, that in security, it is possible to have too much security. It's possible to have too little security. So there's really three different states of security. You've got not enough, too much, and just the right amount.

Jake Bernstein: I never thought of it this way, but in life, there really are a lot of things that you can have too much of.

Kip Boyle: Well, but that's not what the popular perception is.

Jake Bernstein: I would agree. Okay. That's true. I think the popular perception is that there's no such thing as too much, but the wise perception becomes: no, no, there are many things you can have too much of. And what I love, and for some reason I've only thought about this now for the first time, really, is that the same is true of cybersecurity: it's also possible to have too much security.

Kip Boyle: Mm-hmm.

Jake Bernstein: Which means that there's... And it's obvious that it's easy to have too little cybersecurity. Which means there's a green zone, a habitable zone, where it's the mythic just right porridge, the life-giving zone where the earth orbits around the sun, and it's the zone where you're spending just the right amount of money on cybersecurity.

Kip Boyle: Exactly.

Jake Bernstein: And that's what we've overlaid that zero through 10 scale on. And I think it's... I'm having a little déjà vu with this episode, and I'm realizing it's not because we've recorded it before, but it's because we've both said it so many times.

Kip Boyle: Oh yeah. To clients.

Jake Bernstein: To clients in doing interviews. I mean, essentially every interview we talk about this. And I think one of the things I've always loved about this system is that it allows for some interesting analysis at the end.

Kip Boyle: Oh, absolutely. And we are not going to be able to talk about that in this episode.

Jake Bernstein: No, we won't.

Kip Boyle: Because we're just talking about finding the top risks.

Jake Bernstein: Just the top five. But scores zero through four represent various levels of insecurity. Why do we have five choices for insecurity? Because there really is a difference between doing nothing at all, which is a zero, and doing something, but either not understanding or not feeling that it's truly going to be enough.

Then five through eight, which is only four choices, represents the range from minimally acceptable to fully optimized. I think that's a good range. And then nine and 10 have always been my favorite; it's always exciting when someone says nine or 10. It's too much. It's wasteful of time, money, and morale. One could argue that a nine or 10 encourages shadow IT.

Kip Boyle: Definitely does.

Jake Bernstein: Yeah. So I think the scale is useful. It's been, gosh, time-tested now for a better part of a decade.
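A minimal sketch, in Python, of the zero-through-ten bands as Kip and Jake describe them here. The band labels are paraphrased; the exact anchor statement for each score is on the score key linked in the show notes.

```python
def interpret_score(score: float) -> str:
    """Paraphrased reading of the 0-10 score key described in this episode."""
    if not 0 <= score <= 10:
        raise ValueError("scores run from 0 through 10")
    if score <= 4:
        # Five choices for insecurity: 0 is doing nothing at all,
        # 1 through 4 is doing something but not enough.
        return "not enough security"
    if score <= 8:
        # 5 is minimally acceptable, 8 is fully optimized.
        return "reasonable range"
    # 9 and 10: too much; wasteful of time, money, and morale,
    # and arguably an invitation to shadow IT.
    return "too much security"
```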

Kip Boyle: Definitely. But that's how we do it. Because what we want is we want this semi-qualitative, semi-quantitative approach. So by having people give us a number instead of a narrative, we can accumulate those numbers into a dataset. And I know that you like this because I know that before you became an attorney, you worked as a scientist, right? You did research. And so you know the value-

Jake Bernstein: That's true. Well, I mean, I was a graduate researcher, but yes.

Kip Boyle: But that's okay. That's all right.

Jake Bernstein: Yes. That's okay. And so Kip, one question that always comes up when we talk about this is why isn't it enough to just say yes or no?

Kip Boyle: There's not enough granularity there. And the world is never that black and white. We actually want to know if an organization is doing something to deal with the cyber risk, but maybe they're just not doing enough. So for example, I think this is the quintessential example, passwords. So when you and I first started using computers, I don't even think passwords were a thing. But over time, we got passwords, and then, oh, then our passwords had to be a certain length, and oh, our passwords needed to have certain character sets included in them. And oh, well, passwords are no good alone anymore. Now we have to have two factors of authentication. So it changes over time, and we've got to be able to measure what we're doing now so we can figure out if it's sufficient or not.

Jake Bernstein: Yep. And let's just quickly hypothesize: what happens if the only choices are yes or no? Or even worse, yes, no, or maybe?

Kip Boyle: I just don't know how to get any traction with that.

Jake Bernstein: Well, so think of the results that you would get. Either you're going to get [inaudible]. If you get all nos, well, that's the same as getting all zeros, which I suppose is possible, but you really don't know where to start because everything is zero.

Kip Boyle: I'm looking for top five.

Jake Bernstein: Remember the goal of this episode is how do I find my top five? So if everything is a no, that doesn't really help. And then obviously, if everything's a yes, then it's going to look like you have no risks.

Kip Boyle: And that's not likely.

Jake Bernstein: And that's also incredibly unlikely. That would be the equivalent of someone scoring all eights under our zero to 10 scale. And that is very, very difficult to... I mean, I suppose you could lie to yourself and just give yourself all eights, but that's not helpful.

Kip Boyle: That's probably the only way you can do it.

Jake Bernstein: Okay, so what's an example? Because here we are talking about this. We've figured out how we're going to score it. We know what we're going to base the questions on. We know how they're going to be formulated. But what are some examples?

Kip Boyle: So what we found works is that you're going to ask a lot of questions, first of all, and you're going to ask them at a pretty rapid-fire pace, because you really don't want to have a narrative conversation with somebody. You want to get as many numbers from them as possible. It takes about an hour.

Jake Bernstein: How many is that?

Kip Boyle: About a hundred.

Jake Bernstein: About a hundred?

Kip Boyle: About a hundred questions.

Jake Bernstein: That's a fair number. And that's going pretty quick, because if you're trying to get a hundred questions done in an hour, you're not even spending a full minute on any single question, for the most part.

Kip Boyle: No. No, you're not. And we found that this is okay. If I can ask 30 people a hundred questions and get numbers back, well, that gives me 3,000 data points. That's pretty good.

Jake Bernstein: That is pretty good.

Kip Boyle: I can do some analysis with 3,000 data points, especially if those points are all over the place between zero and 10. Now, I'm going to leave that to the imagination of the listener because that's not what this episode is about. But I do want you to understand why it's important that we're collecting numbers between zero and 10, because we want to be able to do analysis. Like a polling organization would. So we're going to use those kinds of techniques in order to understand where things are.

But the questions, remember, they're going to start with, "How well does..." So I'll give you a real-life example here. I was starting an interview by saying, "How well has our organization established and implemented the processes to identify, assess, and manage supply chain cybersecurity risks?" Now, that's a huge question.

Jake Bernstein: In fact, that's a loaded question.

Kip Boyle: It's overloaded. It's overloaded. I've actually asked several things at the same time. So I encourage people to not overload questions, but I wanted to use this as an example so you could understand that the more overloaded it is, the harder it is for somebody to give you a score. Because it talks about identifying, assessing, and managing supply chain cybersecurity risks. Well, you might be great at identifying them, and you might suck at assessing and managing them, or any combination there.

Jake Bernstein: Or maybe you wrote them, but no one's implemented them.

Kip Boyle: Right. So there's a lot going on here. Ideally, you would break this into three different questions that would give you the granularity to know, well, we know what our risks are, but we're really bad at assessing them, and as a result, we don't manage them very well. Aha. That's genuine insight. You can do something with that. That's actionable. So that's why we're asking these questions, and that's why we want them to be atomic. We want them to be taken down to the lowest way that you can measure it.

Jake Bernstein: Possible answer.

Kip Boyle: Yeah.

Jake Bernstein: And Kip, I've always... If you think way back to when we first met, I immediately got and understood this methodology, because it's quite honestly quite similar to... Not that we're trying to depose people or question them or do an examination in court, but for example, if you had asked that initial question, it wouldn't be entirely unreasonable for someone to say, "Objection. Compound question. Overly vague." Any lawyers listening are like, "Well, those aren't real objections." And that's true. But the point being that you want to ask questions where the answer is not going to be ambiguous, and this question results in ambiguous answers. So we atomize them in order to eliminate the ambiguity in the answer.

Kip Boyle: Atomize them. I love that. So if I hit you with a question, like, "How well does your organization establish processes to identify supply chain cybersecurity risks?" So I'm just asking about did you establish a process to identify?

Jake Bernstein: Yep. So we've atomized it.

Kip Boyle: Yep. So the person is going to look at the scorecard, and the first statement that associates with a zero says, "Our organization rarely or never does this." That's pretty clear. Like, "In my experience, we've never done that. Or maybe I've seen it done once or twice in 15 years, but that's it." That's a zero.

Now maybe your organization is better than that, so the interviewee is like, "Well, we're not as bad as a zero." So they read the next statement on the score key. "Our organization sometimes does this, but unreliably, and rework is common." Now that's a score of three. Okay, so we do something, but things slip through the cracks. We're just not that great at it.

Okay. Now maybe they think that the organization is better. So they go on to the next statement. And they go on to the next statement. Until they find a statement that resonates with them as being a pretty reasonable reflection of what they actually do. And they can even give me midpoint scores. They could say, "Well, we're not a five, but I don't think we're as bad as a three. So, four."

Jake Bernstein: Funny how people seem to have this-

Kip Boyle: Once people do this, they get into it fast.

Jake Bernstein: Yeah. What's really funny, Kip, is how eager people are to sometimes go way overboard and they're like, "Can I assign a 3.75?"

Kip Boyle: Yeah.

Jake Bernstein: "No sir. No, you cannot." I think sometimes we allow half scores, but really we shouldn't even do that, arguably. But sometimes people are just adamant that, "Oh, it's not a three, but it's not a four. I really want to give it a three and a half." It's just funny. You give people a zero to 10 scale and they still want to expand the scale.

Kip Boyle: They still want to split hairs. It's funny, we've definitely seen that. Now, we also let people respond, "Unknown," and we also let people respond, "Not applicable," but I find that if you've done a good job preparing your question set and you've selected people thoughtfully, that just doesn't come up very often. And if it does, then that is a finding in and of itself. But in any event, so that's how you generate the dataset and that's how you gather the information that you need. But I still haven't told you how to find your top five cyber risks yet.

Jake Bernstein: And that's the whole point of this episode. So what do you do?

Kip Boyle: Okay. So what you do is you get all these answers and then you average the scores. You calculate the mean score for every question across all the people that you've interviewed. Then what you do is you have to compare that to something so that you can figure out what is the gap. And it's the size of the gap that is going to determine which cyber risks are top versus which ones are not top.

And in the NIST Cybersecurity Framework, I do this at the activity level or at the category level. At the second level of detail. I ask questions at the third level of detail, but I average them up to the second level of detail. And so, what I also do is I help the organization select target scores before we even go out and do the data gathering.

And let me tell you about some profiles of target scores that I start with that I teach people, like, "Here's some starting places." So if you wanted to get a minimum score of five across all six functions of the Cybersecurity Framework, that may be completely reasonable if you're not a bank or not a national security agency or not a hospital, because a five just says you're doing the least you can do and still not generate audit findings. Now, maybe in the very beginning, that's the best you can do. So maybe that's just where you start. But I would say that represents a minimum, a minimum target that you would aim for.

But a lot of people aim for something a little different, which is what I call the strong castle profile. This is where all the functions are at a five, which is the minimum, but they set the protect function at a seven or a six or an eight or something like that. And so, these are people who are kind of stuck in the past. They think, "Well, I know where my crown jewels are and I'm just going to put a ton of protection on them, and nothing bad's really going to happen." And if something bad happens, it's not going to be that big of a deal. So we'll just deal with it if and when it comes up.

I mean, this is what perimeter networks were designed for, is the idea that we could put a digital wall around our digital assets and we'd be okay. But come on, we all know that isn't the way it works anymore. We talk now about assume breach. We talk about zero trust networks. And those little aphorisms are telling us that the strong castle approach is bankrupt. So I really encourage people not to do that.

What I encourage them to do instead is to set, say, the respond function at a seven or an eight or a six or something like that, and then leave everything else at a five. And I call this the first responder profile. Does that make sense? If I really am going to assume breach, that means that I would need to respond to a breach at any moment. And so I should have a really great response capability, right? Like a fire department. I should be able to roll the fire trucks at a moment's notice if a fire breaks out. Despite the fact that I'm doing everything I reasonably can to prevent fires, I know that they could still happen, so I spend money on a great fire department.

Jake Bernstein: And there's an argument to be made that recover is just as important.

Kip Boyle: Yes.

Jake Bernstein: And there's different strategies you can take. For example, let's just take an example where a company uses those awful, fully virtual desktops, where everyone's computer in the company is really just a terminal that connects to a server. That's a situation where you don't care about anybody's individual virtual machine, and so you can just focus on recover. Oh, there's even a slight abnormality. Delete. And I just spin up a new one. That would be the ultimate form of recover. I'm not actually advocating this, I'm just giving an example.

Kip Boyle: But there are absolutely people who create tech stacks these days that can handle adversity exactly the way you just described it.

Jake Bernstein: It's true.

Kip Boyle: There's no time to unpack that.

Jake Bernstein: It's not an invalid strategy, but it is...

Kip Boyle: No, not at all. And so the other profile that I wanted to mention is what I call the big city profile, which is where you set your respond and your recover function target scores at a six or a seven, and you keep everything else at a five, because you want to be able to roll the fire department to put out the fire, but you also want a recovery capability that returns the organization to business as fast as possible. So I think that's the future, this big city concept, where you have no gates on your city. No modern city has a gate on it. So you have no idea who's coming into your city. You don't know what they're bringing with them. Your best hope is that you're going to respond and recover, whether it's an overturned truck of salmon, which is what we had in Seattle not too long ago. It shut down one of our arterials.
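A sketch of the target-score profiles Kip describes, expressed over the six Framework functions. The numbers are the examples he gives in the conversation (a baseline of five, with selected functions raised to around six, seven, or eight); the helper function and variable names are illustrative.

```python
CSF_FUNCTIONS = ["Govern", "Identify", "Protect", "Detect", "Respond", "Recover"]

def profile(overrides: dict[str, int] | None = None, baseline: int = 5) -> dict[str, int]:
    """Every function starts at the baseline target; selected functions get a higher one."""
    targets = {function: baseline for function in CSF_FUNCTIONS}
    targets.update(overrides or {})
    return targets

minimum_profile = profile()                              # everything at a five
strong_castle   = profile({"Protect": 7})                # heavy protection; discouraged above
first_responder = profile({"Respond": 7})                # assume breach, invest in response
big_city        = profile({"Respond": 7, "Recover": 7})  # respond and recover above baseline
```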

Jake Bernstein: Which by the way, Kip, you're not saying, "And go ahead and put detect and protect at zero."

Kip Boyle: No.

Jake Bernstein: We're not saying put them at zero. We're just saying that you have to prioritize.

Kip Boyle: Yeah, you have to prioritize. You have unlimited cyber risks coming at you. You have limited resources. That's the whole point of identifying your top five. And your top five could be anywhere. It could be because you don't have great contractual firewalls. Well, PCI DSS isn't going to identify that. It could be that you have a really bad employee handbook with respect to setting your acceptable use. Again, PCI DSS doesn't talk about that.

So there's all kinds of things that could be the source of cyber risk. You've got to be able to identify them. And so, now you're just doing simple arithmetic. You've got your average scores. You compare them to your target scores. You get your gaps. And then you sort the list by the size of the gap.

Jake Bernstein: Size of the gap.

Kip Boyle: And there you go.
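And a sketch of the arithmetic Kip just walked through: compare the averaged interview scores against the target scores, sort by the size of the gap, and take the top five. Function and variable names are illustrative; the scores would come from averaging interview answers up to the category or function level, as described above.

```python
def top_five_cyber_risks(
    average_scores: dict[str, float],  # mean interview score per CSF category or function
    target_scores: dict[str, float],   # target score per CSF category or function
) -> list[tuple[str, float]]:
    """Rank areas by how far the averaged score falls short of its target."""
    gaps = {
        area: target_scores[area] - score
        for area, score in average_scores.items()
        if area in target_scores
    }
    # Biggest shortfall first; the first five entries are your top five cyber risks.
    ranked = sorted(gaps.items(), key=lambda item: item[1], reverse=True)
    return ranked[:5]
```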

Jake Bernstein: There you go. So we have time for one more question here, Kip. How can you trust what the interviews told you? Because unless I missed it, you didn't collect any hard evidence, did you?

Kip Boyle: No, not at that stage, you don't. You don't collect any evidence. You just listen to people. Now, how do you trust them? Well, here's the thing. I already told you. They're internal experts. They understand the unique culture of the organization. They understand how that organization makes money. If you can't trust them, I don't know who you can trust. And not only that-

Jake Bernstein: You've got dozens of them.

Kip Boyle: And you have dozens of them. And not only that, but when you go to senior decision makers and you tell them, "Hey, I just figured out what your top five cyber risks are."
And they say, "Oh, that can't possibly be true." And you say, "Well, but I asked the people on your team that you told me were the experts on your organization. Are you saying they don't know what they're talking about?"

Jake Bernstein: Yeah. That's a way to put someone between a rock and a hard place.

Kip Boyle: Well, it's true. I mean, senior decision makers rely on the intelligence and the loyalty of their team, and if their team says this is what's going on, then it's really hard to refute that. And so, remember I told you that if you're struggling to get buy-in as a cybersecurity decision maker, look at what you're able to do. You're able to go to senior decision makers and say, "Your people, that you love and have hand-selected to lead this organization, have told us that these are the top five cyber risks. What are we going to do about them?"

Jake Bernstein: So you can either accept that those are your risks or fire everyone because you think they don't know what they're doing. And clearly, one choice is probably better than the other.
So yeah, I think we could do another three episodes, Kip, on our process, but I think it's time to wrap this one up.

Kip Boyle: Yeah, let's do that. That wraps up this episode of the Cyber Risk Management Podcast. What we did today was we looked at how to discover your top five cyber risks while, at the same time, shifting your dominant culture towards more reasonable cybersecurity. And we'll talk about how you mitigate this stuff, and other things: how do you make sure you're getting great business value from the money you're spending? We'll do that next time. See you then.

Jake Bernstein: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at cr-map.com. Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cyber-security consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.