EPISODE 130
How To Assess Cyber Risk

EP 130: How To Assess Cyber Risk

Our bi-weekly Inflection Point bulletin will help you keep up with the fast-paced evolution of cyber risk management.

Sign Up Now!

About this episode

April 25, 2023

What’s the definitive method for assessing cyber risk? Does it exist? How do you do it? Let’s find out with your hosts Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.

If you want to know more, Kip has a course on LinkedIn Learning you can check out:

“IT and Cybersecurity Risk Management Essential Training” — https://www.linkedin.com/learning/it-and-cybersecurity-risk-management-essential-training/

Kip also has a Udemy course that describes our semi-quantitative approach:

“Implementing NIST Cybersecurity Framework” — https://www.udemy.com/course/nist-cybersecurity-framework/


Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities, and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at cr-map.com and klgates.com.

Jake Bernstein: So Kip, what are we going to talk about today in episode 130 of the Cyber Risk Management Podcast?

Kip Boyle: Oh, well that's really coy because you know exactly what we're going to talk about. You actually requested this topic, and I think it's good. It might seem basic to the audience, but it really isn't, it's actually the source of a lot of healthy, let's say, creative tension in the community which... And the topic is going to be defining and assessing risk. Oh my gosh.

Jake Bernstein: Oh man. And you know what's funny? I think if anyone does think it's basic, well, just wait for the next... Nobody really knows exactly how long, but we'll continue. So the first thing is, I don't know, maybe it's because we talk about this together off the podcast quite a bit, but have we specifically talked about this on the podcast before? The genesis of this episode is a relatively recent episode where we had a guest, Karen Worstell, and we really talked a lot about risk and assessment and management. And I really think that this one was inspired by both that and something else. And that is that we are starting to see the phrase "risk assessment," or "assess your risk," showing up in legislation, rules, and regulations as a requirement. And I have to say, Kip, quite honestly, none of these statutes or rules go into any detail. No.

Kip Boyle: inaudible.

Jake Bernstein: So let's just start with a seemingly basic question. Kip, what is risk?

Kip Boyle: Oh man. Okay. Well, yeah, we are the Cyber Risk Management Podcast, so it's reasonable per se that we would be able and willing to talk about this. Okay, it's not simple because risk as a term is really overloaded, right?

Jake Bernstein: Oh, it is.

Kip Boyle: A lot. So many of the terms that we need to use to get work done are completely overloaded. I mean, I could talk about risk and we could go to the racetrack. I could talk about risk and we could go to a construction site. I could talk about risk and we could watch a ship depart for a long overseas voyage, right? I mean there's just so much packed into that word risk.

Jake Bernstein: I made this joke later in the script, but we can ignore it then. So what you're saying is we have to define our terms, right, Kip? Is that what you're saying?

Kip Boyle: We have to define our terms. Yeah. Well that's why I like hanging out with you.

Jake Bernstein: I'm so excited.

Kip Boyle: Yeah, that's what you like to do, you like to define terms also. And when I define terms, I like to go to NIST. I like the National Institute of Standards and Technology because that's what they do is they standardize. And so it's a good place to go. In particular for this episode, we're going to look at NIST's Special Publication 800-30 release one, which is called the Guide for Conducting Risk Assessments. And it's a really great document for a number of different reasons and we'll unpack why during the course of the episode here, but-

Jake Bernstein: Point of brief order, it's Revision 1, it's not release one.

Kip Boyle: Oh my God.

Jake Bernstein: Technically it's release two, Kip.

Kip Boyle: So do I sustain your objection? Is that how that works? Or do I overrule it?

Jake Bernstein: No, yeah. I think you sustain it in this case.

Kip Boyle: I sustain it. Okay.

Jake Bernstein: Yeah.

Kip Boyle: You sustain it. Okay. So what's risk? All right, that's your question. And now I'm going to give you the 800-30 answer. Risk is a measure of the extent to which an entity is threatened by a potential circumstance or event. And it is typically a function of, one, the adverse impacts that would arise if the circumstance or event occurs, and two, the likelihood of occurrence. So you've got a little dynamic interplay there between those two ideas. And in our mode of operating, we really talk about information security risks or IT security risks. And those arise from the loss of confidentiality, integrity or availability of information or of information systems. So what do you think about that?

Jake Bernstein: I think that's a really, really helpful start. And for an extended discussion of confidentiality, integrity, and availability, and the risks and the impacts of those, please go back and listen to our most recent Verizon DBIR episode, where the DBIR authors hopefully have started to focus in on that concept, and we do talk about that in our episode. So that's where you should go. And then second, Kip, I was thinking about this, and I promise that this will be clarified later on, but I want to start with what was actually an interesting bit of confusion on my part: I remembered from my CISSP studies that there are all kinds of little formulae that people use as shorthand for what you just said. A couple or three of them are risk equals threat times vulnerability, or risk equals threat times vulnerability times impact, and then sometimes people switch impact for cost.

I mean, they're basically synonyms. That's, I suppose, more or less-ish what you said, but it's not quite the same. And we can talk about why that is.

Kip Boyle: It's a derivation, right?

Jake Bernstein: It is, it turns out, yeah, and we'll get to that. It does turn out to be a derivation. I think the NIST one is the better, more comprehensive, more technical definition. And I think that what makes this so difficult is that the formulas I just gave, what do you do with that? Okay. So yes, risk, I can sit there and impressively state to an audience that risk equals threat times vulnerability times impact, but so what?
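Jake's "so what?" is easy to see in code. Here is a minimal sketch of that CISSP-style shorthand; the 1-to-5 ordinal scales and the multiplication rule are illustrative assumptions, not anything prescribed by NIST SP 800-30. The point is that even the simple formula forces you to pick scales and units before it means anything.

```python
# A minimal sketch of the "risk = threat x vulnerability x impact" shorthand.
# The 1-5 ordinal scales are an assumption for illustration only.

def risk_score(threat: int, vulnerability: int, impact: int) -> int:
    """Combine three ordinal factors (each 1-5) into a single 1-125 score."""
    for name, value in [("threat", threat),
                        ("vulnerability", vulnerability),
                        ("impact", impact)]:
        if not 1 <= value <= 5:
            raise ValueError(f"{name} must be on the 1-5 scale, got {value}")
    return threat * vulnerability * impact

# Example: a likely threat (4) against a moderately exploitable weakness (3)
# with severe business impact (5).
print(risk_score(4, 3, 5))  # 60 out of a possible 125
```

Even this toy version shows the problem the hosts raise next: the number 60 is meaningless until someone defines what the scales and units represent.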

Kip Boyle: How do you instantiate those variables?

Jake Bernstein: Yes, exactly. Well first, what do those variables even mean?

Kip Boyle: Right. What are the units?

Jake Bernstein: And then, Kip, I have very helpfully left what I think is my favorite bit of scripting ever in this script, and I'm going to set it up for you: you have to be able to measure this stuff if you're going to use those formulae as they're presented.

Kip Boyle: Yeah. Yes. I appreciate this little digital soapbox that you've prepared for me, and do let me stand on it for a moment. So I've spent all my career in, first it was called computer security, and then it was called network security, going all the way back to the first stuff that I did in this career back in the 1990s. And I have studiously tried to discover and use a risk assessment methodology that was reliable, that I could actually depend on, and I never really found one. I went to classes. When I was on active duty, I went to sponsored training by the Department of Defense on how to do risk assessments, and none of it was ever really definitive. So I just have to say that there's no easy, straightforward answer here. In fact, if you've been listening to this podcast for any length of time, you probably know that I got so frustrated by this that I invented my own way of doing it, which I really like, and we'll talk more about that later on.

But if you love math and you like advanced statistics and you're good at explaining those kinds of concepts, like Monte Carlo simulations and probabilities and that sort of thing, then by all means embrace that, because that's probably the best choice. Unfortunately, what I find is that very few people actually want that. Certainly nobody with money to spend in my world has ever asked me to bring advanced mathematics into risk measurement. I know some organizations do it, and God bless them, but it's really difficult and it's very resource intensive. And as for the red, yellow, green stuff, I learned very early that spending a lot of money on red, yellow, green is a bad idea, because I can do a quick back-of-the-envelope qualitative analysis and get pretty much what I'd get if I did it over the course of a month, involved a lot of people, and spent a lot of money. So there you go.
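For listeners curious what the quantitative camp Kip mentions looks like in miniature, here is a toy Monte Carlo sketch. Every number in it (the event probability, the loss range) is a made-up assumption for illustration, not real data, and real quantitative methods like FAIR are far richer than this.

```python
# A toy Monte Carlo sketch of quantitative cyber risk: simulate annual loss
# from an event with an assumed probability and an assumed loss range.
# All inputs are illustrative assumptions.

import random

def simulate_annual_loss(p_event: float, loss_low: float, loss_high: float,
                         trials: int = 100_000, seed: int = 42) -> float:
    """Return the mean simulated annual loss across many simulated years."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    total = 0.0
    for _ in range(trials):
        if rng.random() < p_event:           # did the event happen this year?
            total += rng.uniform(loss_low, loss_high)  # draw a loss amount
    return total / trials

# Assume a 10% annual chance of a breach costing $50k-$500k:
print(round(simulate_annual_loss(0.10, 50_000, 500_000)))
```

This also illustrates Kip's "inputs are everything" point that follows: the output is only as credible as the assumed probability and loss range fed into it.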

Jake Bernstein: All right. So we're going to get to our risk assessment methodologies, but it sounds like we've preemptively shot down the quantitative approach and the qualitative approach.

Kip Boyle: Well, again, either one can work. Just please remember, it depends on what you want.

Jake Bernstein: It does.

Kip Boyle: If you're okay with math, and if your senior decision makers are okay with math, like if you work for an engineering firm or something like that, then going to something like FAIR, which is very math intensive and statistics intensive, makes sense. Doug Hubbard has several wonderful books that talk about how to measure cyber risk. And again, very math intensive. If you can embrace it and make it work for you, again, God bless you, but it's just not something that I have seen a lot of, that's all.

Jake Bernstein: Yeah. And it's not going to be fast. It's not going to be cheap.

Kip Boyle: No. And the quality of the inputs is everything.

Jake Bernstein: So garbage in garbage out is always the rule-

Kip Boyle: If you're not careful.

Jake Bernstein: ... if you're not careful.

Kip Boyle: Yeah.

Jake Bernstein: Okay. So I think that we have discussed that as briefly as we can. So let's go ahead and move forward here.

Kip Boyle: Sure.

Jake Bernstein: Let's look at the four components. And I realize I'm taking your spot, but that's all right, because that's how things go. One other NIST publication that I want to bring up for people, so we already mentioned NIST Special Publication 800-30, which is currently in Revision 1.

Kip Boyle: It is a good one by the way.

Jake Bernstein: It's a good one.

Kip Boyle: It's a good reference.

Jake Bernstein: It is, very good reference. But there's another one that's NIST SP 800-39, which is about the risk management process. And it's got a lot of detail in there. We're not going to go into that.

Kip Boyle: Can I just say really quickly that risk management and risk assessment are not the same? Those are not synonyms, please don't get confused.

Jake Bernstein: It's not, and I'm about to say that; you're going to find out. This is where sometimes I wish, Kip, that we had really done our video podcast, not because I want people to see me in the morning when we're recording, but because every so often a visual aid would be helpful. So I'd like... Dear listener, imagine a triangle: at the top point you have assess; at the bottom right point, you have respond; and then the other point is monitor. Now in the center, add a circle or whatever you want, and label it frame. Now all three of the vertices, the external points, connect to the center, the frame.

And what we're trying to do here is describe the risk management process diagrams that you can find in 800-30 and 800-39. But what I want you to do is think about risk management, and I say that intentionally, not assessment: risk management as a process, and a continuous one at that, full of these wonderful internal feedback loops that continually make adjustments as facts and circumstances evolve. So that's kind of where I want you to start. And Kip, you already said some of this, and it's super important; look, it's right there for you.

Kip Boyle: That is, again: risk management and risk assessment are not the same, but you have to be able to assess risk if you want to manage it somehow. But we're not going to talk about risk management. We're going to continue to focus on risk assessment. I like the idea of maybe coming back and doing another episode later on risk management. And I actually think there's a lot more useful guidance out there on risk management than there is, in general, on risk assessment. But inaudible, hey, let's tackle the hard thing first, why not?

Jake Bernstein: Yeah, let's do it. I think it's great. And that's interesting, right? Risk assessment. Why do we care, right? Why do we care? Not a rhetorical question.

Kip Boyle: No, it's a really important practical question, which is why I spent so many years going to all kinds of trainings, reading all kinds of books, getting laughed at, and I'm not kidding. I would try different risk assessment ideas and I would bring them to senior decision makers, and sometimes... I mean, they were never really mean to me, but they'd smirk, because they're just like, "Dude, what are you doing? This doesn't make any sense to us."

Jake Bernstein: Oh, the enthusiasm of young Kip.

Kip Boyle: So it's an important question and you have to figure out some way to assess risk or you just don't know how to invest your limited resources against your unlimited risks that are coming at you. This is a huge pillar of how I work with senior decision makers and frame up the issue. And there's a lot at stake. Oh my gosh, there's so much at stake. Either you're just not getting business value for the money you're spending, or you are going to have this false sense of security, but you're going to be completely open for ultimate failure in the worst case.

Jake Bernstein: It would be fair to say then that a poor risk assessment methodology is at best going to lead to just major inefficiencies, but at worst, complete failure. And as I mentioned at the start, from the legal standpoint, there is a growing body of law that actually requires and uses the phrase, risk assessment. And as I also mentioned, none of the laws go into great detail about how you assess risk. And that's the point of this episode. We know that the risk management process is about framing the issues and assessing, monitoring and responding over time. The rest of this is that assess bubble. So let's do what we can here in the remainder of the episode and try to figure out what does it mean to assess risk.

Kip Boyle: I almost feel like you're getting me back for some slight, I don't know why. I don't know why I feel that way. Why should I feel annoyed just because you keep asking me simple questions for which there is no simple answer?

Jake Bernstein: Well, to be fair, it's a really hard question.

Kip Boyle: It is a really hard question, but I like... I'm going to go back to 800-30. And the reason why is because in the back of that document is a really helpful appendix. Do they call it an appendix? I don't know what they call it. Anyway, in the back of that document, there's actually a list of risk assessment techniques that they approve of. And there's all kinds of stuff in there, the things that we often hear about as well as a lot of things we don't often hear about. And I'll just give you one quick example. A gap analysis is considered to be a legitimate risk assessment technique according to Special Publication 800-30. So if anybody tells you that a gap analysis is not a risk assessment, no, maybe for them it's not. But it can be, according to NIST, which I think is very powerful and useful.

But what does 800-30 actually say about risk assessments? What it says is that they're not simply one-time activities that provide permanent and definitive information for decision makers. You can't just do it once and expect that the results will guide and inform responses to information security risks forever; it's too dynamic. Cyber is a dynamic risk. We've talked about that so many times. So as a result, again, straight out of Special Publication 800-30, organizations need to employ risk assessments on an ongoing basis throughout the system development life cycle, if we're talking about systems that we are creating, or the system acquisition life cycle, if we're actually buying systems to use, which, with cloud computing, is more and more common; people are buying stuff rather than making it themselves. And you have to do this across all of the tiers in the risk management hierarchy. So that's what Special Publication 800-30 says. I'm totally on board with that, of course. And the takeaway: assessing risk is ongoing, not a one-time thing.

Jake Bernstein: And the other thing that just occurs to me, Kip, is that one of the things that... Obviously we've said why you should do it, but I think you just said one of the most important factors here, or one of the most important concepts, which is what is the actual purpose of it? And it is to provide information for decision makers so that they can be guided and make informed decisions about anything in a business. That I think is, we cannot lose sight of that as being one of the most important things.

Kip Boyle: And senior decision makers talk about risk all the time.

Jake Bernstein: All the time.

Kip Boyle: So if you're a small business... My small business is going to turn eight years old in another two or three months, and for the first several years of its existence, the biggest risk we faced was product-market fit. How can I figure out how to serve people in a way that they will say, "Yes, please, we will sign up for that"? That was the number one risk to our future and our ability to succeed and be a viable organization. And every other risk that was facing me paled in comparison, because if I didn't get the product-market fit risk sorted out, then none of the other stuff would've mattered. And so if you go to senior decision makers, realize that they're dealing with way more risks than what you're talking about.

Jake Bernstein: It's true. And that's really great because... Continuing on with 800-30, it tells us that organizations conduct risk assessments to determine risks that are common to that organization's core missions or business functions, its mission or business processes, its mission or business segments, common infrastructure and support services, or information systems. You can use risk assessments for many purposes. They can be used to design and develop InfoSec programs and systems. You can use them to create definitions for connectivity requirements, which, let's be honest, is one of the most common things in today's world. You can use them to figure out how authorization is determined for use or access of information systems. I mean, all of these are real day-to-day decisions that have to be made. And generally speaking, it's best if you make those decisions using a technique other than flipping a coin or "because I feel like it."

Kip Boyle: Or guessing.

Jake Bernstein: Or guessing.

Kip Boyle: Or listening to what vendors tell you is most important, which is horribly biased if they're trying to sell you a product. Or just watching newsfeeds and listening to sensationalized news stories. If that's how you attune yourself to what's risky in the world, then that's also incredibly biased and not very useful. So where you get your risk assessment information is super, super important. And I also want to say that one of the things I try to encourage people in the information security community to do is to focus on these core processes. This is another reason why I love 800-30, because senior decision makers are very focused on this. They've got to get their sales function working and keep it working.

They have to fulfill the orders that they receive, and they have to collect the money that people owe them, and they have to pay the people who provide services. So they have to pay their vendors. And when that stuff falls apart, it can destroy the business. So I always tell InfoSec people to get out there and understand how those processes work, and then figure out where are the information assets in there and get busy securing them, because that's what's going to make senior decision makers get really interested in your work is when you focus on those core business processes and functions, so... Anyway, just reinforcing what you said here.

Jake Bernstein: No, it's very important, and I think so too. So as we continue to move through this and define terms, we already talked about what risk is, right? It was the measure of the extent to which an entity is threatened by a potential circumstance, taking into account the impact and the likelihood of occurrence. So risk assessment, then, is the process of identifying, estimating, and prioritizing, in this case, information security risks. And we already hinted at this. You actually said the phrase risk assessment methodology. And we need one. If we're going to assess risk, it's definitional. And what is a methodology? Well, it's going to consist of a risk assessment process, of which gap analysis, the example you gave, is one.

Something that I think does not get well represented or discussed enough is an explicit risk model, which really means defining key terms and assessable risk factors, right? I mean, just because we want to talk about threats and vulnerabilities and impacts and all these different things. I mean, trust me, well, you don't have to trust me, because you've done it yourself for your entire career. I've done it myself as well, with different clients who haven't gone through the process completely. And let me tell you, as you no doubt know, it is painful if we don't have clear definitions for these types of things. And then you need these-

Kip Boyle: inaudible talk about the problem of overloaded terms already.

Jake Bernstein: Yes. And then you need to have the assessment approach. And that's where we get into quantitative versus qualitative and/or semi-quantitative. And remember, too, that the approach has to describe the range of values that we can use and how different combinations of risk factors are identified and analyzed, so that those factors can be functionally combined to actually evaluate risk. I cannot tell you the confusion that I have seen on people's faces as you take what seem to be these relatively simple formulas and tables and then you try to apply them to any given situation, and oh my Lord, it just becomes so much more painful.

Kip Boyle: I'm getting nauseous just listening to you.

Jake Bernstein: I know. I know. The last thing we need as part of our risk assessment methodology is an analysis approach. And there are a lot of different ways to think about this. It can be threat oriented. You can instead focus on assets and impacts, or on vulnerabilities. Those are just some different ways to think about it.

Kip Boyle: Or patches. I mean, I see people focusing on all kinds of interesting things to put at the center of their risk assessment work. One thing that I think is important to say is that I see most people, and I've done this a lot in the past, I see most people in our line of work trying to go from the bottom up, trying to say something like, "Our biggest problem is we don't patch systems fast enough." Well, is it? I mean obviously you're sitting right next to it. And so you must have some perspective there that there's a problem. But is that really the biggest cyber risk that we're facing?

Jake Bernstein: It probably isn't, let's be honest.

Kip Boyle: No, it probably isn't. Probably the biggest cyber risk we're facing is that we don't have any indemnification language in our contracts around third party risk. I mean, maybe that's the biggest inaudible.

Jake Bernstein: Or it might be that we don't know how many different cloud services our people are using with unofficial accounts. That's probably a bigger cyber risk than our patching.

Kip Boyle: Right. And that's why I think doing risk management from the top down is something we haven't done enough of. And I really enjoy promoting that as a risk assessment approach.
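Pulling together the methodology pieces Jake listed earlier (a process, an explicit risk model, an assessment approach, and an analysis approach), here is a sketch of that structure as code. The field names and example values are our own illustration of the SP 800-30 structure, not text from the publication.

```python
# A sketch of the four components of a risk assessment methodology.
# Field names and example values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class RiskAssessmentMethodology:
    process: str              # e.g. a gap analysis, as discussed above
    risk_model: dict          # explicit definitions of key terms and factors
    assessment_approach: str  # "quantitative", "qualitative", or "semi-quantitative"
    analysis_approach: str    # e.g. "threat-oriented", "asset/impact-oriented"

methodology = RiskAssessmentMethodology(
    process="gap analysis",
    risk_model={
        "threat": "circumstance or event with potential to adversely impact the org",
        "vulnerability": "weakness that a threat can exploit",
        "likelihood": "weighted probability a threat exploits a vulnerability",
        "impact": "magnitude of harm from a successful exploit",
    },
    assessment_approach="semi-quantitative",
    analysis_approach="threat-oriented",
)
print(methodology.assessment_approach)  # semi-quantitative
```

Writing the model down explicitly, even this crudely, is what keeps everyone working from the same definitions rather than from overloaded terms.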

Jake Bernstein: So Kip, is there a limit to the number of risk assessment methodologies that an org can use?

Kip Boyle: I don't think so. I don't think there's really a limit, but I think you have to be careful. I don't think it's a good idea to have a risk assessment method of the week. I mean, I've made that mistake as I cycled through different tries, like, "I'll try this, I'll try this, I'll try this." And the search for something that works can confuse senior decision makers rather than provide clarity.

Jake Bernstein: So there are two words here that I want to home in on that I think are great. SP 800-30 talks about the key being to increase the reproducibility and repeatability of risk assessments. And if you don't have that, you can't really get the data you need. And that's the problem with doing a risk assessment of the week, right?

Kip Boyle: Right.

Jake Bernstein: I think, again, words are important, so let's quickly define them. Reproducibility is the ability of different experts to produce the same results from the same data. In other words, you did the risk assessment and came up with this result. Can you give the methodology to your friend Rob and have Rob do the exact same thing? Does he get to the same spot? That's really important.

Kip Boyle: Yeah, but now you're blinding me with science, Jake.

Jake Bernstein: inaudible science.

Kip Boyle: That's what we're talking about here, right?

Jake Bernstein: Sort of. But then the other one is repeatability, and that's the ability to repeat the assessment in the future in a consistent manner. And I think about the-

Kip Boyle: The science?

Jake Bernstein: It is. Yes, it is science. But I also think about the technique that we have used with our clients, and it is designed to be both reproducible and repeatable. Otherwise you don't get to see trends, you don't get to see anything over time. And that's really, really important.

Kip Boyle: It's super important because you go back to what we said early on, which is that risk assessment is not a one-time thing.
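Reproducibility and repeatability are easier to see with a deterministic example: if the method is written down as explicit lookup tables rather than gut feel, two analysts, or the same analyst a year later, get the same answer from the same data. The scales below are illustrative assumptions, not values from SP 800-30.

```python
# A sketch of a written-down, deterministic scoring method. Because nothing
# depends on who runs it, results are reproducible and repeatable.
# The 1-3 scales are illustrative assumptions.

LIKELIHOOD_SCALE = {"low": 1, "moderate": 2, "high": 3}
IMPACT_SCALE = {"low": 1, "moderate": 2, "high": 3}

def assess(findings):
    """Score each (likelihood, impact) finding the same way, every time."""
    return [LIKELIHOOD_SCALE[lik] * IMPACT_SCALE[imp] for lik, imp in findings]

data = [("high", "high"), ("low", "moderate")]
# Two analysts applying the same written method to the same data agree:
assert assess(data) == assess(data)
print(assess(data))  # [9, 2]
```

Repeating the same deterministic assessment over time is also what makes the trend lines the hosts mention meaningful.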

Jake Bernstein: Exactly. So I told you this, well, we both knew this was complicated, but hopefully our listeners are now agreeing with us.

Kip Boyle: Oh, well, I'm going to watch our podcast listening statistics and figure out where people are actually tuning out on this particular episode.

Jake Bernstein: Yeah, I don't know. I have a feeling this could be a popular one just because it's such a core issue for our audience.

Kip Boyle: Well, I hope we're doing a good job of unpacking this. Dear listeners, you can tell us if we totally got it wrong. You know how to find us.

Jake Bernstein: It is hard, and I suspect that I've bitten off more than we can chew for one episode. So this will probably become a two-parter, maybe not a two-parter back to back, but I'm sure we'll revisit this. But before we summarily give up, let's talk about risk models. A risk model defines the risk factors to be assessed and the relationships among those factors. And look, SP 800-30 is your friend here. There are all kinds of characteristics used in risk models as inputs to determining levels of risk. In fact, as Kip said, they do have tables of risk assessment methodologies. They also have tables full of quantitative and semi-quantitative and even qualitative ways of looking at threats and vulnerabilities and all that stuff. It's a really, really helpful document.

Kip Boyle: Think about your audience, don't think about what you like.

Jake Bernstein: Yes.

Kip Boyle: What you like isn't first, what the audience is going to connect with, that's first.

Jake Bernstein: Yep. So here, by the way, is where we get to those terms that I remember from my CISSP studying. So typical risk factors according to 800-30 include these terms: threat, vulnerability, impact, likelihood, and predisposing condition, which is just a way of saying it's kind of a vulnerability. And here's what I love; Kip talked about great words, great language. One can continue to "decompose" these factors into sub-factors. As an example, threats can be decomposed into threat sources and threat events. And let me tell you, based on personal experience working with a client, you really actually do need to do this, because what's going to happen is you're going to go along, and at a high level, everything is going to seem great. You're going to feel really good about your awesome risk assessment methodology and how you're managing risk. And then someone's going to say, "Okay, how do we apply it to this situation?" And suddenly there's a likelihood that it's all going to come crashing down, and you're going to realize that, as cool as everything looked, it does not survive contact with reality. And these-

Kip Boyle: You wouldn't know this firsthand, right? This is the research you've done.

Jake Bernstein: No, this is... Honestly, this is the type of thing I think you only learn from firsthand pain, right?

Kip Boyle: I've been there.

Jake Bernstein: Yeah. This just means that you've got to define these terms and get them as straight as you can before you get too deep into the weeds, because if you don't, it'll create a gigantic mess.

Kip Boyle: Yeah. And be consistent. And if necessary, I don't think it's patronizing to your audience to actually hand out a sheet of paper with terms on it. I really don't think it's patronizing because-

Jake Bernstein: Not at all.

Kip Boyle: ... this is hard. This is hard and everybody needs to be on the same page. Okay, look, I get the idea about a second part to this episode, but I actually think we can get through everything that's left and not have to create a part one, part two. Let's try to run through these typical factors. So what are the terms that everybody has to be on the same page with? Well, one of the common ones is threat. And what is a threat? Well, it's any circumstance or event with the potential to adversely impact the organization, disrupting their operations, destroying their assets, whatever it is. And this is a complex concept, and it can be further broken down. I don't think we should do that in this episode. I think we should let people break it down or decompose it in whatever way makes sense for them. But there are threats, there are vulnerabilities and predisposing conditions. I don't tend to use that turn of phrase very much, but-

Jake Bernstein: That you can blame on 800-30.

Kip Boyle: Yeah. But these are weaknesses in an information system, or in procedures, or in some internal controls that you're relying on, or in the way that technology has been implemented, or we talk about zero-day vulnerabilities and that sort of thing. I think we talk about vulnerabilities an awful lot, and so that's probably a term that is unlikely to be misunderstood compared to all the other ones. But importantly, vulnerabilities are exploited by threats; that, I think, is something we need to hang onto.

Jake Bernstein: And don't forget the... I think it's actually super important to remind folks that a vulnerability sitting around doesn't do anything, the question is, can it be exploited by a threat source?
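One way to picture the decomposition Jake described earlier, threat broken into threat sources and threat events, together with his point here that a vulnerability only matters when a source can actually exploit it, is a small sketch like this. All the names, fields, and example values are hypothetical illustrations.

```python
# A sketch of decomposing "threat" into a threat source and a threat event,
# with the event tied to the vulnerability it would exploit.
# All names here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class ThreatSource:
    name: str  # who or what initiates the event

@dataclass
class ThreatEvent:
    description: str
    source: ThreatSource
    exploits: str  # the vulnerability this event would exploit

event = ThreatEvent(
    description="phishing leading to credential theft",
    source=ThreatSource("organized cybercrime group"),
    exploits="no multi-factor authentication on webmail",
)
print(f"{event.source.name}: {event.description} (via: {event.exploits})")
```

Modeling it this way makes the pairing explicit: a vulnerability with no plausible source and event attached to it is exactly the "theoretical vulnerability" argument Kip describes next.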

Kip Boyle: Yeah. Boy, have I had a lot of arguments about people who told me that was a theoretical vulnerability, and until I could demonstrate it, they wouldn't take it seriously, even though I knew that it was an issue because I knew there were cyber attackers out there who were actively exploiting it, but I had to bring that evidence in in order to get somebody to pay attention to it. And I think that's one of the reasons why we have bug bounty programs and why people want to see us. When we do network penetration tests, they want us to show evidence that we actually accessed a sensitive system or whatever, and sometimes they want us to actually show them how we did it. It's not enough to write about it. So anyway, that's kind of interesting.

A couple other key terms. Likelihood of occurrence, very tough to estimate, refers to a weighted risk factor based on the probability that a given threat will exploit a given vulnerability, and you can decompose likelihood into several different sub-parts. Likelihood is one of the things I think I've fought the most with other people about. How likely is this to happen? How many times a year? How many times a week? It's difficult. I mean, you're talking about a crystal ball here. And I know that if you use advanced statistics, you can do a good job of this, I just don't know if your typical senior decision makers will understand what you're saying. And then impact. Impact is maybe the simplest idea, but again, it's guesswork. It's the magnitude of harm that you can expect to experience as the result of a successfully exploited vulnerability.

Jake Bernstein: Yep. And like all of these different things, you can use a quantitative, a qualitative, or a semi-quantitative approach. For example, for impact, a qualitative approach could be, "that's going to hurt real bad" versus "it won't be so painful." And then a full quantitative approach is, well, that impact is going to be $700 versus $700,000.

Kip Boyle: Yeah.
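[Editor's note: the three approaches Jake contrasts can be sketched side by side. This is an illustrative example only; the `to_semi_quantitative` function, its scale, and the dollar figures are assumptions, not anything defined in the episode or in NIST 800-30.]

```python
# Three ways to express the same impact estimate. The conversion
# function and its worst-case anchor are illustrative assumptions.
qualitative = "that's going to hurt real bad"  # qualitative: words only
semi_quantitative = 7                          # semi-quantitative: 0-10 scale
quantitative = 700_000                         # quantitative: dollars

def to_semi_quantitative(dollars: float, worst_case: float = 1_000_000) -> int:
    """Project a dollar impact onto a 0-10 scale, capped at 10."""
    return min(10, round(10 * dollars / worst_case))

print(to_semi_quantitative(700_000))  # 7
print(to_semi_quantitative(700))      # 0
```

The point is that the semi-quantitative score preserves ordering and rough magnitude without demanding precise dollar estimates from every stakeholder.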

Jake Bernstein: Okay. And look, there are other aspects of risk to consider, right? In case it isn't clear by now, there is a lot of uncertainty. That's another key term in 800-30, and it runs through all of these different terms and concepts, so it has to be considered.

Kip Boyle: But some people actually think uncertainty is a synonym for risk.

Jake Bernstein: Well, in some ways it is.

Kip Boyle: Yep.

Jake Bernstein: And you can see why, right? Because by the time you break risk down into all of its component parts, and then you think of all the different uncertainty in each of those component parts, it just exponentially increases the uncertainty. One thing you might want to do when considering risk, particularly at a big organization, is to aggregate the risk, rolling that stuff up to higher and higher levels to get a more relevant understanding for the highest levels of decision makers.

Kip Boyle: We didn't talk about frequency, did we talk about frequency?

Jake Bernstein: No, but I think-

Kip Boyle: How about likelihood?

Jake Bernstein: Frequency and likelihood, I think that... You know how we said likelihood can be decomposed into several sub parts, frequency would be a sub part of likelihood.

Kip Boyle: Okay. Now, the reason why this is coming up real quick is because if a risk is small in the way that you've assessed it, but you're only looking at one materialization of it, but it materializes a lot, in other words, it has a high frequency but a low cost when it does happen, well then it starts to add up, so you can't ignore that.

Jake Bernstein: And of course we have a cinematic example of that in the classic film, Office Space, where the threat actors, in this case the main characters, skimmed fractions of a cent off of every single transaction that went through their company. Each skim had an individually microscopic impact. On the other hand, the frequency was thousands of times per minute, and it really did add up.

Kip Boyle: It added up so fast, way beyond what they expected.

Jake Bernstein: What they expected precisely, so it's... Whenever I can toss in an Office Space reference, Kip, I appreciate it. So that was good.

Kip Boyle: That'd be great.
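[Editor's note: the frequency point above can be sketched with a quick annualized-loss calculation. The numbers below are made up for illustration; only the principle, that expected loss is frequency times per-event cost, comes from the discussion.]

```python
# Annualized loss expectancy (ALE): a simple way to show that
# frequent, low-cost events can outweigh rare, expensive ones.
def annualized_loss(frequency_per_year: float, cost_per_event: float) -> float:
    """Expected yearly loss = how often it happens x what it costs each time."""
    return frequency_per_year * cost_per_event

# A rare but expensive incident (once per decade, $50,000 each time)...
rare = annualized_loss(frequency_per_year=0.1, cost_per_event=50_000)
# ...versus a tiny cost incurred constantly (the Office Space scenario).
frequent = annualized_loss(frequency_per_year=500_000, cost_per_event=0.005)

print(f"Rare/expensive: ${rare:,.2f} per year")    # $5,000.00
print(f"Frequent/cheap: ${frequent:,.2f} per year")  # $2,500.00
```

With different illustrative inputs the frequent, cheap event can easily dominate, which is exactly why frequency can't be ignored when the per-event cost looks negligible.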

Jake Bernstein: That'd be great. Let's wrap up this episode then with a discussion of assessment approaches. And I know you said we could get through it, and maybe that's true, but I'll be honest, I already built in the idea that we probably need to go to another episode, and I think we will. But let's talk about the assessment approaches. So we have some pretty strong opinions here. The approach to assessing risk is going to determine so much about the overall process. Do you want to take that fully quantitative approach that Kip has mentioned over and over again as requiring a lot of advanced math, a ton of time, a ton of effort, a lot of money? Or do you want to do a qualitative approach, green, yellow, red? I don't think either of us would say that it is an utterly useless approach, devoid of any value at all. It's not. That wouldn't be fair. Or do you want to take a semi-quantitative approach, where basically you smash the two together and see what you get?

Kip Boyle: Find the middle ground, yeah.

Jake Bernstein: What do you think, Kip? Let's wrap it up talking about this.

Kip Boyle: Yeah. I'm going to repeat something that I said, which is, it's about the audience. Think about your senior decision makers. Think about the people who own the risk that you are trying to assess, and think about how they'd like to see you talk about it. Don't say, "Well, I want to use FAIR, the quantitative risk management method, or some other risk assessment method, because I want to put that on my resume. I want to get good at it, so I'm going to use that." That is like the tail wagging the dog, and I don't think you're going to be successful. So use some empathy and try to figure out what your senior decision makers are going to want to see. I made a mistake one time. I was working with a customer who was the head of an engineering firm and had actually come up through the engineering firm as an engineer. I thought for sure this person was going to want to do some probabilities and Monte Carlo simulations and so forth, and they said, "No, I don't have time for that." [inaudible]

Jake Bernstein: At the risk of going another 10 minutes, which if we do is fine, I have a question for you here, Kip. And I think it's really a request for a story, because there's nothing I love more than those aha moments, when something that you brought to me years ago, and have been using and developing for many more years than that, suddenly smacks right into this NIST Special Publication. And that is the semi-quantitative approach that we have used with our clients, affectionately known as our zero to 10 measurement scale. So maybe just tell our audience, how did you get there? What was the story, what's the origin story of the zero to 10, how-we-measure-risk scale?

Kip Boyle: Well, I appreciate you asking, and I'll give you a thumbnail type response. I just got fed up with red, yellow, green and probabilities and all these different statistical approaches, and I just sat down one day and I just said, "What is it really like in the real world? Can I learn anything from just seeing things happening day by day in my work?" And what I realized was there's really three states of security. There's no security or very little security. There's acceptable levels of security. And then every now and then, there's too much security. And I realized that that is what most people experience. And so if I wanted to talk to people about how much security we needed, that that would be a good basis for grounding the conversation on. But then I said to myself, okay, but that's still pretty squishy. How can people tell me whether the amount of security they have is too little, the right amount, or too much?

So like, my porridge is too cold, my porridge is too hot, my porridge is just right. How do I capture that? And so I thought, well, let me see if I can figure out how to put a little pain scale next to it. Anybody who's ever gone to the doctor in pain has probably seen the little chart they show you, where you have zero to 10, and zero is no pain with a little happy face, and 10 is "I can't breathe" with a really super unhappy looking emoji face staring at you, and there are all kinds of choices in between. I latched onto that and I thought, well, that's super practical. But then I realized, okay, that is not the continuum I'm trying to describe. I have a different continuum. And so I borrowed the one to 10 measurement, and then I threw in a zero, because I realized that if anybody ever said, "Kip, I have no idea what you're talking about, we don't do any security for that thing," then I needed to be able to capture absolute zero.

I needed to be able to capture the other extreme, which is, we have so much security, I don't even use that system anymore. And then the other thing I realized was that there was more granularity on not enough security than there was on too much security, so I kind of skewed the numbers a little bit: on the low end, zero to four, there are five possible places where you could say, "We don't have enough," but there are only two numbers on too much. So anyway, that's kind of how it all came around.
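[Editor's note: Kip's scale, as he describes it, can be encoded as a simple mapping. The band boundaries below follow what he says about the low end (five values, zero to four) and the high end (two values), but the exact cutover from "acceptable" to "too much" is an illustrative assumption.]

```python
def label_score(score: int) -> str:
    """Map a 0-10 interview response to the three states Kip describes.
    The 5-8 'acceptable' band is an assumed boundary, not from the episode."""
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    if score <= 4:                    # five values (0-4): skewed toward 'not enough'
        return "not enough security"
    if score <= 8:                    # the acceptable middle
        return "acceptable security"
    return "too much security"        # only two values (9-10) on the high end

print(label_score(0))   # not enough security
print(label_score(10))  # too much security
```

Zero captures "we don't do any security for that thing at all," and the top of the scale captures "so much security I don't even use that system anymore."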

Jake Bernstein: And what do you think is the benefit of having this truly semi-quantitative approach, as opposed to either red, yellow, green, or, even worse, Kip, we've both seen it, yes or no?

Kip Boyle: Yeah. Well, what I learned is that if you're going to do top-down risk assessment, it's really helpful to talk with people about that. And I've done top-down risk assessments where I did have interviews with people, but they were free flowing, unstructured interviews. And when I aggregated all my interview notes, I had a hard time really understanding what was most important. I could kind of pull out themes, but I wasn't always able to actually point to anything in particular. So I found that lacking. And so I realized that if I could ask people questions and get them to give me numbers in return, then I could use polling, I could use the kind of approach that organizations have used for decades.

If they want to ask people how engaged they are on the job, they pull together a sample audience and then they go and they ask questions. And so I realized that that would actually be workable, that I could actually engage with people, find out what they know, do this on a top-down basis. But then when I was done talking with them, I would have maybe 1500 or 2000 data points, and then I could do a little statistics, very, very lightweight amount of statistics on that data set, and then I could actually learn something useful.

Jake Bernstein: So Kip, and this is not in the script, and I'm just going to toss you some slow softballs here, but if our listeners like what they're hearing, and by the way, folks, this episode is sponsored by Cyber Risk Opportunities and Kip Boyle, author of Fire Doesn't Innovate.

Kip Boyle: Yeah, [inaudible].

Jake Bernstein: I'm joking, but also not. So obviously yes, your book, we've mentioned it before, the book, Fire Doesn't Innovate, your book does talk about this process. What if someone is like, "Ah, I just want to watch a video." What do you got?

Kip Boyle: Oh, yeah. So we just put a Udemy course up. It's called Implementing the NIST Cybersecurity Framework, and I'll put the URL in the show notes. You can typically buy that course for $10 to $20 when it's on sale, get a coupon code. It's very inexpensive. And if you watch that course, you'll see how I use the zero through 10 scale in order to conduct a gap analysis on the NIST Cybersecurity Framework compared to your organization. That is a risk assessment. And then what we do is we take that, we prioritize the results, we figure out a mitigation plan, we prioritize those, and then we make a risk management implementation plan. So yeah, you get the whole thing if you go get that NIST course.

Jake Bernstein: All right. Well, I think that's a wrap.

Kip Boyle: Almost. One more thing I want to say.

Jake Bernstein: Oh, okay.

Kip Boyle: I recently released a course on LinkedIn Learning, so if you have access to that, and if you just want to just go a little bit deeper in the concepts that we covered today, including 800-30, as well as some of the ISO standards and so forth, and a lot of very practical examples of how to do risk assessment and risk management in the field, then you're going to want to take this course, it's called IT and Cybersecurity Risk Management Essential Training. It's in the LinkedIn Learning library. And that has a completely different pricing model, I'm not going to get into it, but I'll put the URL to that course in the show notes too. There you go.

Jake Bernstein: Okay.

Kip Boyle: That's what I wanted to say.

Jake Bernstein: All right. Well, we all know that I can't wrap up an episode. That's always you.

Kip Boyle: All right, that wraps up this episode of the Cyber Risk Management Podcast. Today we talked about an incredibly complex but crucial concept, and that's assessing risk and all of its related issues, and we hope we did a good job for you and we hope we'll see you next time.

Jake Bernstein: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at cr-map.com. Thanks for tuning in. See you next time.

Headshot of Kip BoyleYOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cyber-security consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.