
EP 88: How to Really Make Sure that Cybersecurity is Everyone’s Job (Part 1)
Episode Transcript
Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help you thrive as a cyber risk manager. On today's episode, your virtual Chief Information Security Officer is Kip Boyle and your virtual cybersecurity council is Jake Bernstein. Visit them at cyberriskopportunities.com and focallaw.com.
Kip: Jake, hi. What are we going to talk about today?
Jake: Hey, Kip. So today we're going to dig into something that we mentioned during part two of our Verizon Data Breach Investigations Report series, which has gone semi viral, I'm proud to say.
Kip: That's great.
Jake: It is. So we often talk about cybersecurity culture and maybe we don't talk about it as much as we need to on the show here, but we're going to remedy that today. And-
Kip: We certainly talk about it with our customers enough.
Jake: We do. We absolutely do. And it's a really, really important part of the puzzle. And if you recall, the Verizon DBIR kind of gave a hint. It was one page. And it mentioned using behavioral science to build a cybersecurity culture. And we thought that was really enticing. And so here's a full episode about it. And in fact, I'm going to warn people now, this actually has already become a two-parter. So this is part one of building cybersecurity culture. We think it's that valuable and that important.
Kip: It is. It's also something that I have been careful to speak about in... Whether we're doing podcast episodes or whether I'm writing a white paper or an article or giving an interview, I'm really careful about talking about culture because in my experience, conversations about culture and culture change don't resonate with a lot of people up front. And so I don't want to turn people off by showing up and talking about culture right off the bat. A lot of people think, and I certainly have an opinion about this, that cybersecurity is all about technology. And sure, there is an awful lot about cybersecurity that is technology. And in fact, it's our use of technology that kind of got us into this situation and this problem space. But I really think that the way out is a combination of culture and technology. But anyway, so there you go.
There's my own personal bias for why I haven't talked about it too much on the podcast. But anyway, let's just dive right in. So what we found, Jake and I, is that when we saw that reference in the DBIR, we went ahead and pulled two academic papers behind what they were saying in the report. And what we want to do is bring you the most valuable aspects of those papers and use this episode as a setup for the next episode that we're going to have, where we're really going to talk about the practicality of it all. So there is going to be real world practicality here. So stick with us. So again, two papers. The first one is from 2019, and it's from Huang and Pearlson, and it's a cybersecurity culture model that they describe in their paper.
And then of course, there's a second paper which we want to talk about, and this was the white paper about Verizon Media and what they did to shift their cybersecurity culture. And they issued their own paper about it. And they based their work on Huang and Pearlson's paper. And in fact, I can see, as I look at the paper, one of the authors of the 2019 paper was actually an active participant in the Verizon Media work. So there's a really good thread here. So if you want to, again, if you want to know what we're basing this on, open up your 2021 Verizon DBIR report, turn to page 53 and kind of that's where we're taking off from. So let's take a look at this model here, right? So the name of the paper is, For What Technology Can’t Fix: Building a Model of Organizational Cybersecurity Culture. Jake, where should we start?
Jake: So I like to start whenever I'm reading an academic paper with the abstract. And we are going to limit ourselves in terms of how much of the paper we're going to try to read you verbatim. However, the abstract is one where it's important to get at least most of it because it's very, very... I mean, it's already very, very short. So organizational cybersecurity requires more than just the latest technology. Kip and I agree. That's not part of the abstract. To secure an organization, all members of the organization must act to reduce risk. Leaders have a special responsibility to understand, shape and align the beliefs, values, and attitudes of the entire organization with overall security goals. Managers need practical solutions for dealing with the human side of cybersecurity. The model presented in this paper describes organizational cybersecurity culture, the factors that contribute to its creation and how it can be measured. And the abstract does go on a little bit talking about the case study, but that's really for the next episode. So, Kip.
Kip: I mean, it's hard for me to disagree with anything. I think I could say that I literally agree with every word in that abstract and it's something that I have struggled to do over and over again throughout my career. And this is hard and I don't think anybody can debate that in terms of suggesting that it's not difficult. It is very difficult to shift culture no matter what the reason. I mean, you could shift culture... I mean, Microsoft shifted their culture to catch up to the internet change, right? So in the early 2000s, they pivoted, right?
And then they did a massive culture shift in order to build security into their products, which I think they've been pretty successful at. IBM had to make a massive shift multiple times. So there's all kinds of reasons why an organization would want to shift their culture. But this one is really, really... This is a really important shift and this is the one that I've been focused on. So it's really exciting to be able to see something practical, even though we're reading an academic paper. But let's see if it can help. I think it will.
Jake: So I think I'm going to take my lawyer perspective here and say, let's define some terms because I don't know about you Kip, but it's very easy to talk about things like, you have to shift your culture. You need to have better cyber security culture. You need to create a culture of safety.
Kip: Easy to say.
Jake: What does that... It's easy to say, but what does it even mean?
Kip: Yes.
Jake: And so we start with what is a, quote, culture of cybersecurity? And one of the ways that it can be described, by the way... I think it's Huang or Hwang. I'm not... I don't know if there's an R in there. [crosstalk]
Kip: Okay. Okay. But listen, just so you know, whenever I come across a name, I don't know, I go to the internet. And I actually took a moment to look up how to say that name. So-
Jake: Well, there you go.
Kip: It's Huang. That was what I learned when I went and looked it up. So you're right.
Jake: I stand corrected.
Kip: There's no R in there, because guess what? It's not English. I think it's Chinese. But anyway.
Jake: We apologize for that diversion. So a culture of cybersecurity underlies the practices, policies, and, quote, unwritten rules that employees use when they carry out their daily activities. And as we'll get to... the case study involves Liberty Mutual. It's right there. And so the Liberty Mutual CISO explained why he wanted to invest in a culture of cybersecurity. And he said, "It only takes one mistake from an employee clicking a wrong link or email to erase all the good work done by our professionals. Since a hacker can potentially go wherever they want once they are inside our systems, they can potentially compromise our entire investment."
Kip: Oh yes, absolutely true. And that's something that we've talked about on this podcast, we talk about it with our clients and customers all the time, it's in my book. And so I just can't agree more with the quote that you just read. And I think what's difficult, right? About this topic and why I really appreciate these papers is whenever you say something like everyone's responsible for security, that's a nice aphorism. But the reality is, unless you do the work to operationalize that statement, when everybody's responsible for something, nobody's responsible for it. And that's where I think things have gone wrong and that's what I love. I love the way that this paper really puts definitions into our hands, right? In a structure so that we can operationalize it.
It's fantastic. And again, they say this in the paper, it's extraordinarily difficult to identify, build, and quantify cybersecurity culture. And so they actually look at other types of culture within an organization and they talk about the safety culture. And you know what? I saw this when I was a kid and I would sort of look out the window of my parents' car as we drove around, the way that the construction sites looked over the years as I was growing up, they changed. They went from this sort of wild west, freewheeling, show up, drive and park wherever you want, wear whatever you want kind of culture. And over time, it became way more structured. And the workers started to wear hard hats and they started to show up in steel-toed boots, and they wore gloves and they wore high-visibility safety vests, and that sort of thing.
And I would always reflect on that when I would think about culture change. And I'm like, well, if they could do that, why can't we do this from a cyber security perspective? So I think the safety culture shift is a really great model. And so what they say in the paper is that the goal is every employee must act in ways that keep the organization cyber secure. And that's what you see on a construction site, right? One accident can undo everything.
Jake: It can, perhaps quite literally.
Kip: Mm-hmm (affirmative). And you see the board on the construction sites, X days without an accident, right?
Jake: Yes.
Kip: And you watch that number go up and up and up. And when you know that a construction project's been going on for months, and that board says one day without an accident, you know somebody screwed up.
Jake: Yeah. Yeah. So that's a really... I mean, it's such a strong kind of place to begin our understanding of this. And I think to start, let's define culture, and then the types of culture we're going to look at. So the paper looked at three different concepts; organizational culture, national culture, and information security culture. And when they dug into the... So obviously we're interested in organizational culture and we'll talk about what InfoSec culture looks like. But within an organizational culture, you've got three components. And I think this is true. You can really understand it. The belief systems that form the basis for collective action, the values representing what people think is important, and then artifacts and creations, which are kind of the art, technology, the visible and audible behavior patterns, et cetera. Even myths, heroes, language, rituals. And we have those things in organizations, right? We have those things, even as part of cybersecurity. I mean, there are rituals, there are... There's specific language. You might not think of it that way, but it's true.
Kip: It absolutely is. And what's interesting to me as I've studied organization behavior, I've done it academically, right? I went and took a graduate level certificate in executive leadership, and we talked about org behavior. And even before that when I did my master's degree, in management, we talked about org behavior. And what's so interesting to me about org behavior is that it's often unintentional. The belief systems, the values, the artifacts, the creations are often not intentional, but they just sort of happen spontaneously. And often it's based on what the people who lead the organizations, what they believe, what they value and what they create is what actually turns into the dominant culture. And so here's a case where we want to be intentional about creating a culture. So we've got to go about this a little bit differently. So I love that we're naming and claiming, and that we're going to flip the script and we're going to be intentional. So let's talk about InfoSec culture, right? So it's a subculture of the overall organization.
And what's really interesting about InfoSec, and this is something we talk about all the time in my career space is what's the difference between information security and cybersecurity, right? Are they synonymous? Are they different? Is it a Venn diagram where there's overlap? And what does that look like? And for culture purposes, Huang and Pearlson say, "Well, we're not really after an InfoSec culture. That's a little different. We are after a cybersecurity culture." And this is what they say, "Information security culture emphasizes behaviors that comply with information security policy, but a cyber culture includes not only compliance with policy, but also personal involvement in organizational cyber safety." And so they actually define organizational cybersecurity culture in this way, the beliefs, values and attitudes that drive employee behaviors to protect and defend the organization from cyber attacks. So they're actually saying that cybersecurity culture is a superset of information security culture. That's what I got.
Jake: That's what I got too. And I think what you just said a moment ago about information security culture emphasizing behaviors that really are just about compliance, I really like that because we often talk about sometimes not always in the most kind way about checkbox security and compliance almost as a dirty word. And it's obviously not, but at the same time, we'll say things like, that's not real cyber security. And I think this is actually really getting at that concept better than we've really been able to define it before.
Kip: It's really putting structure and vocabulary. And we've got this model that we're going to unpack here for how it all happens. But I mean, just because you're compliant with a regulation or a law that doesn't mean you're secure. And I think that's what this is getting to. I love it. I absolutely love it.
Jake: So here's the actual cybersecurity culture model, and this is how it begins. The ultimate goal for managers is to drive cybersecurity behaviors. So behaviors, that's the first component.
Kip: You're regulating behaviors.
Jake: You are. Yes. That is achieved in part by creating an organizational cybersecurity culture, which as we just said, is the beliefs, values and attitudes. So that's the next component of the model; beliefs, values and attitudes. The culture in turn is influenced by both external factors outside the control of managers. So that's actually the fourth component of the model, which is external influences. And internal organizational mechanisms managers use. And that is the third of the components of this model. So Kip, let's start by talking about behaviors.
Kip: Right. So we're going to unpack this model. We're going to do it in the most interesting way we can. But before we do, I just want to tell you that you really need to go get this paper. You really need to go get this paper from 2019 and read it. If you care at all about the manager's responsibility for being intentional about culture, you really do need to read this paper. I think this is a foundation for the future. So behaviors. So it's really employee behavior, isn't it? That either creates more cyber risk or reduces cyber risk. And there's really two-
Jake: I'm not sure what else it is. I mean, what else can do it?
Kip: Well, there... I mean, you could say that flaws in technology creates risk, right? I think you could make that argument. That a bug, the existence of a bug in a publicly facing system creates risk. But I think you could also argue that, well, that bug doesn't cause any problems until a person who behaves badly shows up.
Jake: And you could also make the case that the existence of that bug getting into production software is a component of employee behavior.
Kip: That's right. It was either a mistake, it was an error, or it was malicious. Who knows how that bug got in there, right? I mean, maybe it was just inattentiveness or the desire to cut corners, or maybe just using a software development kit that had a flaw in it that you didn't know because you didn't do the due diligence. I mean, yes. I think that you could argue that the behavior of the developers is responsible for how that bug actually got in there. And it's the behavior of the cyber attackers that's responsible for exploiting it.
Jake: And I mean, not to beat this horse, but not just the developers, but the entire DevSecOps military-industrial complex, for lack of a better term, right? I mean, that's-
Kip: Because we all move as a herd, right? I mean, we all sort of look at each other to figure out what should we do? How should we do things? And we take cues, right? That's why we have things called best practices. Best practices are just behaviors. And then those tell us how we should behave. And those best practices aren't always really the best behavior. Maybe they were at one point, but we know that the outside world changes all the time. And so we've got to keep up with that. So our behaviors have to change when the external world changes around us. So sometimes we have to change our behaviors, but-
Jake: And dear listeners, this is why we prepared this and immediately realized that this was going to be a two-parter.
Kip: Right. There's a lot to unpack here. And there's a lot to comment on, right. And this is the part about making it real is we're not going to just tell you, here's the theory, we're actually going to explain to you... And by the way, we see this in practice and here's how we see it. And I talk about this all the time with my customers and with my team about culture and behaviors. And I love the two types of behavior that this paper talks about. It talks about in-role cybersecurity behaviors and extra-role cybersecurity behaviors, right? So if you're behaving well in support of your cybersecurity program in your role, well, then you're saying, well, I'm a customer service agent and I don't develop software, so I can't help with that part of our cybersecurity program, but I do talk with customers and there are things I can do in my role.
There are ways I can behave that will support the management of cyber risk in the organization, right? Or if you're a software developer, you could say the same thing. Every role has behaviors associated with it that are either going to help the program or are going to hurt the program. So those are in role behaviors. And then you've got these extra role behaviors, right? And so these are things that an employee does that aren't strictly part of their job description. And some people might say, well, go the extra mile or whatever. But specifically there were two things that the paper talks about. One is helping. In other words, how can you provide some assistance to somebody who doesn't know what to do cybersecurity wise? Doesn't know how to make a passphrase, doesn't know how to use a password manager, doesn't know how to report a phishing email they just got.
And so it's outside of your role, strictly speaking, to help them or to voice, right? To actually just speak up and say, "Hey, I really like the way that you talked to that customer on the phone about the fact that you needed to reset their password, but you could only do it if they provided you with some additional information to verify their identity." And so it's probably not written in your role to say those things, but it makes a big difference when you do.
Jake: Well, and if you think about it, this is probably extremely common. In fact, I would venture just a wild guess that in the real world, these extra role cybersecurity behaviors are almost just as important as the in role cybersecurity behaviors, if not more so. Because people-
Kip: Got to have both.
Jake: I mean, you got to have both obviously, right? But if you think about how common this probably is, I think having someone who helps others with questions, is willing to voice concerns, that person has an outsized impact on an organization and cybersecurity culture.
Kip: These are the influencers, right? So when I come in and I start working in the organization, if I get the opportunity to do it, because it depends on what the assignment is, but I want to know who the influencers are because I want to talk to them and I want to do something to help sensitize them to this extra role cyber security behavior. And I've never had the words like we do now because of this paper, I've never had the words to really explain why I do this. But I've always felt that if I could get the influencers in the organization, the people who do behave in an extra role way, if I could get them to be more literate on cyber risk management, then just by the fact that they help and voice is going to make a big difference, a big positive difference.
Jake: It is. And it's so funny because I felt the exact same way when I was reading this, is we've been trying to talk about this, shifting the culture for years and years, but finding this paper gives us the ability to name concepts. And that's really what allows us, I think, to understand it and then repeat it. So this is-
Kip: And to be intentional, right? To be intentional.
Jake: And to be intentional.
Kip: Mm-hmm (affirmative).
Jake: Exactly. So next up is the beliefs, values and attitudes. And this is a big part. In fact, it comprises basically nine concepts and these nine concepts are split kind of into leadership, group and individual. And going to what Kip just said, these beliefs, values and attitudes are things everyone kind of knows, but very few can articulate. And I think that's... Even that is kind of a bit of an oxymoron. Do you really know that which you cannot articulate? I don't know. That's-
Kip: I think you can, right? Because that's intuition. That's intuition, right? It's professional judgment. Kip, why should we do it that way? I'm not sure how to tell you, but I can just assure you that based on my experience, this is the way we need to do it. And I see this all the time and you can see it in all kinds of disciplines. It's not just cybersecurity. People with a lot of experience just know what to do, but they can't always explain why.
Jake: That's right. And it's funny because that type of guidance is probably more and less effective on different types of people. Lawyers don't really... That's not necessarily satisfying. Engineers probably are not satisfied in some ways by that, right? And so you have to kind of remember that that's a potential problem.
Kip: Well, just an aside here, right? The way that we go and work with companies when they want to shift their culture is we interview people, right? We find out who the influencers are and we interview them. And because there's a lot of squishiness, we actually have a score key, right? Where we talk to people and we say here's a question. Now, when you give us the answer, just give us a number between zero and 10. And by the way, here's the score key so you know what each one of these things means. And that's one way that we've found where we can go in and we can talk to people about something very squishy but turn it into something a little bit more quantifiable. And I don't want to totally unpack this here. But just to say that, man, I've seen this a lot, it's hard to navigate, and I don't think there are any ideal ways to do it, but there are some.
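The zero-to-10 score-key approach Kip describes can be sketched in a few lines of code. This is a hypothetical illustration only, not Kip's actual tooling; the question names, interviewee names, and scores are all invented for the example.

```python
# Hypothetical sketch of aggregating 0-10 score-key interview answers.
# All names and numbers below are illustrative, not real interview data.
from statistics import mean

# Each influencer answers the same questions on a 0-10 scale,
# using a shared score key so the numbers mean the same thing.
responses = {
    "alice": {"leadership_priority": 8, "policy_awareness": 5, "threat_awareness": 6},
    "bob":   {"leadership_priority": 4, "policy_awareness": 7, "threat_awareness": 5},
    "carol": {"leadership_priority": 6, "policy_awareness": 6, "threat_awareness": 9},
}

def question_averages(responses):
    """Average each question's score across all interviewees."""
    questions = next(iter(responses.values())).keys()
    return {q: round(mean(r[q] for r in responses.values()), 1) for q in questions}

print(question_averages(responses))
# {'leadership_priority': 6.0, 'policy_awareness': 6.0, 'threat_awareness': 6.7}
```

Averaging per question is the simplest way to turn squishy interview answers into numbers you can track over time; a real engagement might also weight by role or look at the spread of answers, not just the mean.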
Jake: Agreed. So let's start with the leadership level. And so the leadership at an organization plays a significant role in creating and propagating the organization's culture. That's pretty clear. Probably the most important. And what I really like about the paper here is it says there are three ways to assess the quality of cybersecurity culture among leadership. The first one is fairly intuitive; top management's priorities. Quite simply, when top managers believe that cybersecurity is important, they will make cybersecurity a priority for the organization. And you see that in strategic discussions, decisions that they make about allocation of resources, et cetera.
Kip: Behavior.
Jake: Behavior. Correct. Exactly.
Kip: They demonstrate their belief with action.
Jake: Exactly. The second part is top management's participation. And I think this is also really interesting. It could be in the form of communicating cybersecurity policies and attitudes or in actions that specifically secure the organization, like funding and attending the training, creating games and participating in other cybersecurity activities. This is really... Another way to say this, I think, is putting your money where your mouth is, right? Which is show up and show interest.
Kip: Well, that's Microsoft, right? When they decided they were going to get serious about product security in the early 2000s, this is what they did. This is exactly what they did. And then people didn't believe them at first. But when the money started showing up and the senior decision makers started showing up in the rooms where they'd never been before, people knew it was real.
Jake: People knew it was real. Exactly. And then the last of the three kind of components here of leadership is the top management's knowledge, which refers specifically to the cybersecurity-related knowledge, skills and competencies that leaders have. And this is really fascinating to me because it's intuitive, right? Leaders who know and understand their cybersecurity vulnerabilities are simply more likely to have values, beliefs and attitudes around building a more cyber resilient organization. I think that that is something that is far... I mean, our best clients get that, but I think way too few out there do.
Kip: Absolutely. Absolutely. And it's kind of our mission to help more people get this. So, I mean, I just... Guys, I'm so motivated-
Jake: Isn't that the purpose of our podcast? To help people become better cyber risk managers.
Kip: It is. That's exactly why we do this. It's very energizing. It's very motivating. But we got to finish this episode and we got to do it without making it last too long. So again, we just talked about leadership. There's three components of this model of what is successful cybersecurity culture. And so leadership is at the top, right? And this follows the org chart. So from leadership at the very top, now you go down to the group level and eventually we'll talk to you about the individual level. But at the group level, right? We've got collections of people who collaborate, create and communicate. Well, so you've got beliefs at the organizational level and shared values. Now, you're going to get them at these individual groups. And there's three things going on here.
There's norms and beliefs that are unique to each one of these groups that do different things like customer service as a group. Maybe you've got... In the insurance world, you've got claims adjustment, right? Is another group. And so you've got all these groups, you've got support groups and they all have their own norms and beliefs. You need to know that and respect it. And if you're going to give them in-role behaviors, then you've got to unpack this, right? So that's one component of groups. Another component of groups is perception. So this is the way teams within the organization work with each other, right? So you've got the customer service group and the claims adjuster group. Well, they have to work with each other, but how are they going to do that so that they reinforce each other in terms of cyber secure behaviors?
So we have to pay attention to that. And then also inter-department collaboration. Well, actually, I'm sorry, I got that a little messed up. So teamwork is the way people work together within a group, but then you've got how the groups work with each other. All right. So anyway, the point is that you've got these subgroups with these subcultures and they're all part of a larger culture. And you just have to recognize this. Because as you get intentional about cybersecurity, you're going to find that there are some levers that you can pull and those levers are going to influence these things. And that's why we're naming them now.
Jake: Exactly. And just to make it super clear, I mean, I think the development and existence of what was originally DevOps and then DevSecOps, that that is-
Kip: That's a great example.
Jake: ... that is a great example of kind of that inter-department collaboration and the need for different... [crosstalk]
Kip: Because you've got people operating IT, you got people developing new software, you got people pushing new software into operations. And how do you orchestrate all that? That's a great example.
Jake: Exactly. That's really... And that was considered kind of a defining moment in moving cybersecurity management forward, the whole DevOps and DevSecOps thing. So it's nice when you can kind of see a clear example of some of this theory. Real quick, the individual. So obviously groups are composed of individuals. That's just how it is. For the individual, we're talking about including the individual's understanding of cyber threats, awareness of organizational cybersecurity policies and then knowledge of personal capabilities to impact security. So the three constructs are employee self-efficacy, cybersecurity policy awareness, and then general cyber threat awareness. And these three things work together. Self-efficacy is what does the person know about how he or she can personally execute actions to increase cybersecurity? Policy awareness is do I know what behaviors the company wants me to do?
And then general cyber threat awareness is... That's pretty clear, right? It's just the individual's knowledge and understanding of threats overall. You usually get that both from internal training and just from living in the world. So Kip, we've covered two parts of the cybersecurity culture model, behaviors and then beliefs, values and attitudes. Now, we're going to hit what I think is probably going to be one of your most favorite parts and that is organizational mechanisms. In other words, how organizations actually get things done. And what's the alternate phrase here, Kip, that I think you'll like?
Kip: Well, I said that before, right? So these are the levers, right? What levers can management pull or push in order to be intentional about cybersecurity culture? That's where we're at now in the model is we're talking about these levers. And there's quite a few actually. And so we want to unpack those. Because remember what we're trying to do is we're using this episode to build up to the next episode, right? So in the next episode, we're going to talk about these levers and some of the foundational ideas. And so we want you to have some exposure to them before we get into the next episode.
Jake: All right.
Kip: So there are six components that we want to highlight. And let's just go through them real quickly. And then in the next episode, we're going to apply them. And probably from here on, out in future episodes, we'll just continue to refer to them because it's so useful. So cybersecurity culture leadership is one of those levers and it refers to somebody who is the leader, right? The chief information security officer. It doesn't have to be the CSO who is the culture leader. It could be somebody else on that person's team or it could be somebody in maybe human resources. But whoever it is, you need a leader.
Jake: Next one is performance evaluations. This is quite simply the inclusion of measures of cybersecurity compliance and behaviors in the employees' formal evaluation processes.
Kip: Yes. Yes.
Jake: It's not something that is necessarily the most fun, but it is a really important lever.
Kip: Right. And this is where you pin people's behavior to a consequence. And this is what makes it real for people. Because what I've found is that in organizations, the one person who matters the most to an employee is their supervisor. Who writes the performance evaluation? The supervisor. So if there are going to be cybersecurity measures in the performance evaluation, that is going to make employees sit up and pay attention. So that's a super important lever. And then related to that, the third lever in this model is rewards and punishments. Another way of saying this is carrot and stick, right? We want to use carrots as much as possible. But sometimes you're not putting out the right carrot, or people eat the carrot and then they still don't act the way that you need them to, right?
You're going to regulate behavior. And you have to find ways to tell people when they're doing the right thing. And unfortunately, sometimes you have to tell people when they're doing the wrong thing. Now, this is really difficult ground, because do you want to fire somebody just because they click one time on a phishing link as part of a test? I don't think so. I think that's pretty severe for most organizations. But if you get somebody who just clicks with wild abandon and never responds to your pleas of, "Hey, don't do that," well, now what you're going to do, in my opinion, is put them into what most organizations call a progressive disciplinary system, where it starts maybe with a verbal warning, then you move on to written warnings, and then eventually you show them the door because they're just not interested in following the rules.
Jake: And I think it's interesting too. I mean, there's the person who kind of refuses to learn, but then there's also the person whose, I guess I'll call it, anti-social behavior is discovered because of these tests. And the paper points out that there was one company where an employee was fired for repeatedly and purposely failing phishing exercises. After several warnings, he was let go, and I think that's justified.
Kip: I think it's justified. But I also know, based on my personal experience, that there's a whole culture of what managers will tolerate from individual contributors. And that's led to some really interesting discussions about how much bad behavior we can put up with from a person who is otherwise extremely skilled at what they do. And there's a whole book called The No Asshole Rule, where people are like, we're sick of putting up with a-holes simply because they're skilled at something. And so this is something organizations have to figure out what to do about if they're going to get good at this. So that's number three. What's number four?
Jake: That's number three. The fourth one is organizational learning. It's pretty self-explanatory. It refers to the ways the organization builds and retains cybersecurity knowledge.
Kip: Mm-hmm (affirmative). So do you hire a training organization? Do you bring in consultants? Do you buy everybody a subscription to a magazine, right? And there's a whole discipline of organizational learning, typically located in the human resources department of a bigger org, and they can help you with this, right? So you're not on your own in all these things. Number five is cybersecurity training, which is, I think, related to organizational learning, but now we're talking about individual learning. And there's lots of vendors out there who will help make your training programs. So that's the fifth one. What's the sixth one?
Jake: And the sixth one is communications channels. It refers to coherent, well-designed messages about cybersecurity communicated using multiple methods and networks. And this one, I think, needs a little bit more explanation.
Kip: A little bit.
Jake: So for example, some orgs create cybersecurity-based, marketing-style campaigns to influence behaviors and keep the issues front and center. Another example would be to include a short communication at the beginning of every company meeting just to share a cybersecurity message. That can be handy. It almost becomes a ritual, right? We talked about rituals at the beginning. We're having a company meeting. We start the company meeting with a 30-second soundbite about cybersecurity. That helps to keep things fresh and in memory.
Kip: I'll never forget my introduction to this item in here, number six. The fact that you have to communicate over multiple channels and you have to communicate over and over and over again. Because I was guilty of something that I see people do all the time. I want the organization to know something like, "Hey, we just changed our password policy. Now it's 10 characters instead of six." And in the past I'd be like, "I'm just going to write an email about that. All right. Send all. Okay, done. Policy implemented." And, no. Not even close, Kip.
You need an entire marketing program. And so what did I do when I learned that? I went over to the marketing department and I said, "Help. I don't know anything about marketing. All I know is I'm trying to communicate this policy change and I am flailing and not doing a good job." And they were wonderful. They said, "Well, not a problem. Here's our template for how we do marketing campaigns. And hey, we've got people who can make videos and help you. Do you want to get this into our employee newsletter?" And it was just like a cool, tall glass of water, and I was lost in the desert. Thank you.
Jake: And Kip, it just occurred to me as we're having this conversation that if I'm a litigator and I'm questioning company leadership that I've sued on behalf of a class action, I'm going to ask questions like, "Well, how do you communicate the importance of cybersecurity?" And if someone says, "Well, we just send an email," I think very rapidly that begins-
Kip: That's not reasonable.
Jake: It's not reasonable. That's exactly it. So we're almost done here with today's episode. So last, though certainly not least, is external influence. And-
Kip: We have to talk about the fact that not everything that goes on in the world happens inside your four walls. You've got to pay attention to what's going-
Jake: You do. And it's quite simple, right? The attitudes, beliefs, and values that an individual or organization has are shaped by external influencers. And the model really defines three of them. One is societal cybersecurity culture. That refers to just the broad culture of the society in which an organization resides. Some countries have laissez-faire attitudes. Others, less so, right? Second one is-
Kip: That influences.
Jake: It does.
Kip: Influences our culture.
Jake: Second one is super important, right? External rules and regulations. That's the laws, guidelines and regulations imposed by government and other industry organizations. I think that one is fairly straightforward.
Kip: We talk a lot about those.
Jake: We do. And then the other one is also extremely important, peer institutions. And that refers to the pressure felt by managers in an organization from actions that their peer organizations have taken. In other words, it's the feeling of kind of an organizational keeping up with the Joneses. You really do need... For some reason we feel this need to run as a pack. There's safety in numbers, right? And I think-
Kip: It's just human.
Jake: It is. It's human nature.
Kip: It really is. And a tangible example of this, right, is the Center for Internet Security's Top 20 Critical Security Controls. That is something that your peers, if you're a very large organization, have probably embraced. And you look around and you're like, "Well, how do we know how to allocate our resources? Everyone's doing the Top 20 from the Center for Internet Security. Great. We'll do that too." And it's reflexive. I mean, I've worked with midsize companies that said, "Gosh, we're trying to do the Top 20, but it's killing us." And I go, "Well, that's because you're trying to take an enterprise-class framework and adopt it for your mid-market company." And that's like a 14-year-old boy wearing his grown father's suit.
And you can cuff it all day long and it just is not going to work. So be careful about this, right? Because people do pay attention to what peer institutions do, but it doesn't always work. Anyway, but it's a factor in play. So, all right, everybody, guess what? We made it. We got through all the content, we talked about the model and as Jake said, we're going to, in the next episode, we're going to talk about a case study, the Verizon white paper. And I want you to go get it. I want you to look at both of these white papers. And how do you do that? Remember, you're going to go back to the DBIR from 2021, and you are going to turn to page... What page is that?
Jake: 53.
Kip: 53. Page 53, that's where you're going to start. And-
Jake: And specifically, footnotes 66 and 68 are the links to these papers.
Kip: There you go. So we've just served it up to you on a silver platter. You've got two weeks to go do your homework and then come back to class in the next episode. And then we're really going to start getting practical with you. So, is that it, Jake, for this one?
Jake: That's it.
Kip: Okay everybody. Thanks for sticking with us. That wraps up the episode. We did part one of this idea of using behavioral science to be intentional about cybersecurity culture. We thank the DBIR for bringing this to our attention, and we're going to unpack what Verizon did with all this information. So we'll see you next time.
Jake: See you next time.
Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. Remember that cyber risk management is a team sport. So include your senior decision makers, legal department, HR, and IT for full effectiveness. So if you want to manage cyber as the dynamic business risk it has become, we can help. Find out more by visiting us at cyberriskopportunities.com and focallaw.com. Thanks for tuning in. See you next time.
YOUR HOST:
Kip Boyle
Cyber Risk Opportunities
Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).
YOUR CO-HOST:
Jake Bernstein
K&L Gates LLP
Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.