EPISODE 145
Why Do Employees Keep Ignoring Workplace Cybersecurity Rules?

About this episode

November 21, 2023

Why do employees keep ignoring workplace cybersecurity rules? And what should cyber risk managers do about it? Let's find out with your hosts Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.

Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities, and Jake Bernstein, Partner at the law firm of K&L Gates. Visit them at cr-map.com and klgates.com.

Jake Bernstein: So Kip, what are we going to talk about today in episode 145 of the Cyber Risk Management Podcast?

Kip Boyle: Well, we're going to talk about something that I think is pretty cool, which is why is it that employees keep ignoring these cybersecurity rules that we set up for them in the workplace? And then what can you do about it? I just think this is a fascinating subject.

Jake Bernstein: Okay. So obviously, it's not cool for employees to ignore workplace cybersecurity rules. I think the reasons why they might are certainly interesting and then what to do about it definitely for our audience is cool. So now, part of me feels like we've talked about this subject a lot and maybe we haven't, but what are we really going to focus on this time?

Kip Boyle: Yeah. Well, we have talked about this before from different angles. For example, we were inspired by a prior edition of the Verizon Data Breach Investigations Report to do a two-part podcast episode series on how to make sure cybersecurity is everyone's job. This time what I want to do is I want to look at it through a different lens. I want to go inside the heads of the employees and try to figure out what's going on inside and whether we can learn anything from that. And I think we can. And then, okay, what can we do? What can we do with that? So just a slightly different take on it.

Jake Bernstein: So when you say that, and I admit I did look ahead in the script, but it still makes me think of the way that detectives and criminologists get inside the heads of criminals in order to figure out why they do what they do and what they might do next.

Kip Boyle: Certainly criminals, but we try to get in the heads of people all the time to try to...

Jake Bernstein: Everybody.

Kip Boyle: ... understand behavior. Right?

Jake Bernstein: Yep. We do.

Kip Boyle: I mean, that's what the whole surveillance capitalism movement is about. Why do these people do what they do and why won't they buy our stuff?

Jake Bernstein: So what have you got for us? What are we talking about? Because I know it's something special.

Kip Boyle: So it starts with a study by Gartner. So Gartner, in the IT world, is a very well-known organization. Many very large information technology shops have subscriptions to what's called Gartner research. And if you've ever heard of the Magic Quadrant, it's that Gartner that we're talking about. They also have this other thing with the, what do they call it? It's the technology cycle. A Hype Cycle, that's what it is.

Jake Bernstein: Oh, is that a Gartner thing? Huh.

Kip Boyle: Yep. That's also a Gartner thing. So that's who we're talking about. So it's a reputable firm, it's very large, and they specialize in doing research. And so I recently came across this report that they had done. So in May and June of 2022, they went out and polled 1,310 employees. And what they were trying to find out is how often are employees bypassing their employer's security policies and why. They were just trying to study this. And so this is what they came back with. 69% of the employees that participated in this study admitted that they had in fact bypassed their organization's security policies in the past 12 months. 69%.

Jake Bernstein: Wow. So I mean, even the self-reporting bias can't help them with numbers like that. That's crazy.

Kip Boyle: Yeah, it is crazy, right? I'm used to hearing a low number on something like this and then maybe grossing it up a little bit because I know people self-report bias to make themselves look better. I don't know what these people did. I don't know how Gartner got them to tell the truth. But that's a big number. And Gartner went further and they said, "Would you be willing to violate your employer's security policies in the future if it would help you or your team accomplish a business objective?" And of those 1,310 employees, 74% said yes.

Jake Bernstein: See, now that surprises me less. And I mean, the fact that it surprises me less is itself telling. But I think it goes to show that maybe there's a lot more companies that score an eight, or sorry, a nine or 10 on our methodology of measuring cybersecurity where nine and 10 is almost seen as too much, too much cybersecurity that gets in the way of people doing their work. And I mean, to me, that's what those people are really saying. It's not necessarily we don't care, although I think we're going to find out it could be partially. Well, let's wait and see. But nonetheless, I wish that wasn't true, but I'm not surprised at the number.

Kip Boyle: Okay. Well, I was kind of surprised that the numbers on a self-reported basis were so high.

Jake Bernstein: So just to be clear, I am surprised by the almost 70% who've admitted to doing it. The hypothetical question of if it helped you accomplish a business objective, would you bypass, that one doesn't surprise me as much, just because I think for many people, particularly for non-security operations people, getting the business done is the critical thing. And it's also hypothetical, right?

Kip Boyle: Right. Right.

Jake Bernstein: So anyway...

Kip Boyle: It is.

Jake Bernstein: ... it's interesting.

Kip Boyle: But even on a hypothetical point, I'm thinking, "Well, they could have lied and said, oh no, I would never do that." Right?

Jake Bernstein: They could have, but they didn't.

Kip Boyle: But they didn't. And that's what I found so interesting about this and why I felt like we needed to explore this on the podcast.
Okay. So that's why I kind of got an itch to open this topic again. Now, so then it's like, "Well, why? Why are people so willing and maybe even enthusiastic about this idea of working around security policies?" Well, I found a possible answer, and this came from two American criminologists. Criminologist.

Jake Bernstein: Criminologist.

Kip Boyle: Criminologists.

Jake Bernstein: Criminologists.

Kip Boyle: People who study criminology. Okay, those guys. Gresham Sykes and David Matza. And this was all the way back in the 1950s. So quite some time ago. Now, they came up with a concept, it's called neutralization. They were studying so-called juvenile offenders, and they were trying to figure out how is it that they could feel okay about breaking the law and not feel guilty? And so that's what they were studying. And I don't know, you can decide, the listeners can decide, but I think that this could explain what's going on with the people who responded to the survey. Now, there are many neutralization techniques, and I want to go through each one of them, but I just want to ask you if you've heard of this before.

Jake Bernstein: So with the very important caveat that I have neither taken criminology classes nor really taken psychology classes, the answer is no. But I have heard of, we're going to talk about some of these rationalizations, and I have heard of some of these. So...

Kip Boyle: Oh, okay.

Jake Bernstein: ... just maybe not in the context of this neutralization concept. But I do think it's really important to understand, because most people will feel guilt about misbehavior, and most people generally won't misbehave, at least in part because of the guilt. Now, I am absolutely armchair psychologizing here, which is also not a word.

Kip Boyle: Are you practicing psychology without a license, sir?

Jake Bernstein: Fortunately, no, because there's no individual patient anywhere nearby, nor should there ever be. Although I will tell you, they don't call us counselors at law for a reason. Yeah, they don't. Hold on. Let me get this right. Yes. They don't call us counselors at law for no reason, Kip.

Kip Boyle: Okay. So if anybody listening to us is wondering what the heck's with these two guys being so tongue-tied today, well, we are not recording at our normal time. So everything's mixed up.

Jake Bernstein: I'm going to say that's why. That's why it's all messed up. Yeah.

Kip Boyle: Yeah. Yeah, Friday afternoon...

Jake Bernstein: Okay. So...

Kip Boyle: ... will do that to you. Okay, so here's the thing. So there are six of what I would call typical neutralization rationalizations. In other words, what employees are doing is they have to come up with some kind of an excuse, at least to themselves, for why they're breaking the rules. And these six neutralization techniques all have one thing in common, which is what you were saying: they would normally feel some guilt for violating security guidelines, but these neutralization techniques allow them to step past that. And so the reason they're not burdened with a lot of guilt is that these are at least six different ways that they can rationalize why they should not feel guilty. And then they can still have this rule-abiding self-image as they drift in and out of compliance with various security policies and procedures. So that's what's going on here. Should we look at each one?

Jake Bernstein: Let's look at each one. And by the way, I think I know the reason some of this sounds familiar to me is both philosophy, which I was an undergrad philosophy major, but also parenting books. Definitely the parenting books.

Kip Boyle: Oh, yeah. Yeah, absolutely.

Jake Bernstein: So that's why it's slightly familiar, but let's dig in.

Kip Boyle: Okay. So the first one is called denial of injury. And this is when an employee convinces themselves that no harm is going to come from them ignoring a security policy. So it's okay for them to break the rule, and because it was okay, because there's no harm, they don't deserve to be punished.

Jake Bernstein: So this is probably the most common reason that people speed.

Kip Boyle: Oh, okay.

Jake Bernstein: Or don't stop fully at a stop sign. The simple reality, whether you agree with it ethically, philosophically, morally or not, is that most of the time a little bit of speeding or a little bit of drifting through a stop sign is not going to cause anybody any injury. Imagine if every single time you went even a mile per hour over the speed limit, a fairy, like Tinkerbell, lost their wings and expired, and you knew for a fact that this was happening because in our hypothetical world fairies are real. I would venture to guess that most people wouldn't speed.

Kip Boyle: Well, and imagine if those fairies screamed in your ear.

Jake Bernstein: Oh, yeah.

Kip Boyle: A blood-curdling scream.

Jake Bernstein: Yeah, that would clearly have to happen. Bottom line is that people wouldn't do it as often. But no, I mean, in all seriousness, this neutralization rationalization is absolutely the most common one for minor crimes.

Kip Boyle: And I would think that attorneys that specialized in defending people charged with crimes would probably be familiar with this list, don't you think?

Jake Bernstein: They would. Now, the thing is, I said I never took a criminology class, which is true, but I was required to take criminal law. And that was criminal law in law school, so it has been almost 20 years.

Kip Boyle: Okay.

Jake Bernstein: But criminal law is not criminal procedure. So it's not like the Mirandizing rights and all the constitutional law. It's much more of the theory of criminal law and criminal behavior. And if you've ever heard the term mens rea before...

Kip Boyle: I have heard that.

Jake Bernstein: ... it literally means the guilty mind, or the guilty conscience is kind of the idea. And it's one of the things that's necessary for most significant crimes. We talk about negligence, we talk about willful misconduct, we talk about strict liability in the law. And one of the reasons that you have different levels of actor culpability in the law is that some of these behaviors are so common that if we didn't have strict liability, nobody would ever follow the law.

So speeding is a good example. If you speed, you're guilty, whether you meant to do it or not. It doesn't matter whether you had a mens rea to defeat the law. It only matters that you did it, period. It's a strict liability type of offense. Now, it turns out that for constitutional and historical reasons, we can't put you in jail for that kind of thing. At most, I mean, it's a moving violation. I'm not even sure if it's a misdemeanor. But it's definitely not a felony.

Kip Boyle: I think an infraction.

Jake Bernstein: It is an infraction. Bottom line though is it's not a felony. You're not a felon if you speed. But it's felons who get sentenced to long prison terms. I want to say that the maximum prison sentence for a misdemeanor is a year. And all of this is well established. It's almost too bad for this episode that I don't have more of a criminal law background. But I want people to understand that this is not something that Kip just pulled out of the 1950s that otherwise has no relevance. Our entire legal society and culture is based on these concepts, and it has been for hundreds of years. So the fact that we're now talking about why people violate security rules, it is absolutely connected. And so you didn't know you were going to get that, but you know what? There's a part of you that probably did know, which is why you did it. Yeah. You're guilty, you have a...

Kip Boyle: Yeah, I'm going to have to take the Fifth on that. I'm going to take the Fifth on that, man. No, I'm going to invoke my constitutional right there.

Jake Bernstein: So that's another good example. On a strict liability thing, it doesn't matter if you invoke your Fifth Amendment right against self-incrimination because you did it...

Kip Boyle: And it doesn't matter why.

Jake Bernstein: ... and it doesn't matter why.

Kip Boyle: Okay. All right. Now, that was denial of injury. Okay? We're going to go to the next one. An appeal to higher loyalties. So this is how somebody thinks when they place the demands of their work, the project that they're on, or the demands of their manager or supervisor above compliance with the security guideline. So they know that ignoring the policy is wrong, but they make loyalty to someone or something an imperative that overrides the loyalty to the policy.

Jake Bernstein: Now, this one's interesting because it's almost but not quite the same as the "I was just following orders" excuse that often gets brought up in military criminal cases before The Hague.

Kip Boyle: Right. Crimes against humanity and that sort.

Jake Bernstein: Crimes against humanity type of stuff where I was just following orders. That's really more the next one, which we'll talk about in a moment.

Kip Boyle: Well, why don't you take the next one?

Jake Bernstein: I will. But I do want to just quickly hit on this idea of appeal to higher loyalty. So this one is really, I mean, obviously, it's a rationalization to do something that you shouldn't do. I'm not sure this one has as much to say from the criminal standpoint, because I think the next one, which is denial of responsibility, that's when employees refuse to take personal responsibility for their actions, rationalizing that the situation is beyond their control. Now here, we're saying that they might claim that they weren't aware of a specific security policy.

Kip Boyle: Right.

Jake Bernstein: And everybody knows the old saying, ignorance of the law is no excuse. Or they weren't given the proper training to implement it. Or someone told them to do it. Denial of responsibility could also be, "Well, I didn't do it. Someone else told me to do it and I just followed orders." So that is a non-functional defense before the International Criminal Court, or whatever it's fully called. Again, not my area.

Kip Boyle: But I can see you find this interesting. So that's good.

Jake Bernstein: I do find this, it's super interesting. And it's not only interesting, but I think you can already see that we're not there yet. But if you understand why people are likely to ignore cybersecurity rules, then you can, for us we would call it a compensating control. You can go and do some compensating controls.

Kip Boyle: Right. You can mitigate.

Jake Bernstein: The easy one here is denial of injury. If you tell people how breaking the rules causes injury, they probably won't do it as often.

Kip Boyle: Now you're skipping to the end already, and we still have three more of these neutralization strategies.

Jake Bernstein: Well, the denial of injury one is kind of the easiest one. But no, denial of responsibility, obviously I didn't know, I didn't do it...

Kip Boyle: They made me do it.

Jake Bernstein: ... I was told to, et cetera, et cetera. So okay, what's the next one? This one's new to me.

Kip Boyle: Yeah, this one's a little strange. This one's called metaphor of the ledger. I just love all these little turns of phrase. I would love to know how this one came about, but that's what it's called. Now, here's how it goes. It's a technique in which an employee mentally tallies all the positive things that they do. Like, "Oh, I worked overtime. I met or exceeded my quota." And they compare that to their occasional negative behavior. And so if their tally of positive things outnumbers the negative, then they tell themselves, "Oh, I should be able to break a security rule every now and then. It's okay because I'm such a good person in all these other ways."

Jake Bernstein: Okay, this reminds me of the TV show, The Good Place. This absolutely reminds me of the TV show, The Good Place, because...

Kip Boyle: I haven't seen it.

Jake Bernstein: ... in The Good Place.

Kip Boyle: [inaudible].

Jake Bernstein: In The Good Place, and without really spoiling anything, I can just say that the angels and the demons keep track of how much of everything that a person does throughout their entire life in a running ledger. And if your total is above a certain point, then you get to go to the good place. If it's not, then you have to go to the bad place. And it's actually a comedy. It's a great show.

Kip Boyle: Okay. How long was this on the air? I missed it.

Jake Bernstein: Four seasons.

Kip Boyle: Okay.

Jake Bernstein: It's probably, I mean, it's been off the air for quite a few years now. Anyway, the metaphor of the ledger, that makes a lot of sense. It's a little utilitarian in some ways, ends justify the means type of deal.

Kip Boyle: Yeah.

Jake Bernstein: But the only other thing I'll say on this one is it also reminds me of this concept called the I-deserve box. Are you in your I-deserve box? And this is something from the, it's the Arbinger Institute's book called The Anatomy of Peace. Never in a million years thought that I would talk about this on a cyber risk podcast, but here we are. It is something similar where you basically convince yourself that you deserve to do whatever you want to do even though it's wrong or might hurt someone else. It's very similar to the metaphor of the ledger. Not exactly, but pretty close.

Kip Boyle: Okay.

Jake Bernstein: All right, the next one. I'll take this one.

Kip Boyle: Okay.

Jake Bernstein: Defense of necessity is when employees convince themselves they were forced to behave a certain way in a given situation so it isn't their fault. For example, they justify downloading unauthorized software from the internet because they need it to meet a tight deadline. This one I think is different from denial of responsibility where... And I think between the two, I would put my boss made me do it under denial of responsibility. It could maybe also be defense of necessity.

Kip Boyle: Depends on how you rationalize it. Yeah.

Jake Bernstein: It depends how you rationalize it, but clearly related.

Kip Boyle: I think some people download unauthorized software from the internet because they don't have the money, but they still need to get something done.

Jake Bernstein: Yeah. I mean...

Kip Boyle: That's another way.

Jake Bernstein: Yes, exactly. I think this one's almost even more utilitarian, right? "Well, I steered the trolley into that single person so that I didn't kill those four people," things like that. That's the classic ethical problem, at least if you're a philosophy ethics nerd like I was or still am apparently.

Kip Boyle: Oh, okay. So will Kant make an appearance today?

Jake Bernstein: No. No. There will be no Immanuel Kant. You just invoked him. But we...

Kip Boyle: Get him for this model.

Jake Bernstein: Yeah. Although Kip, now that you've mentioned it, the categorical imperative, which is essentially the Golden Rule, is probably not... It's always useful. Always useful.

Kip Boyle: Okay.

Jake Bernstein: Okay. So...

Kip Boyle: Here comes number six. Okay? This is the sixth and last one. And then after we talk about this one, we'll talk about given that this happens, what do you do about it as a cyber risk manager? So the last one is condemnation of the condemners. So this is where you criticize the people who implement and enforce security policies and then you use that criticism as a justification for ignoring the rules. The example that I want to give is when employees think that the security team is really unreasonable or just out of touch with the needs of the business. And so as a result, the employees are just like, "They're so out of touch that all their policies are invalid, and I'm justified to ignore any and all of them as I see fit."

Jake Bernstein: That's pretty interesting. I think there's a lot of this that goes on.

Kip Boyle: Especially when people think that the controls are overboard. Right?

Jake Bernstein: Yes.

Kip Boyle: When they would tell us, "Well, it's a nine or a 10, I can't get my work done." But it's really interesting to see how people's perceptions of it being overly secure will definitely vary based on their own personal risk appetite, their own personal risk tolerance, and other personality characteristics that we can't control and sometimes just don't even understand.

Jake Bernstein: Okay. So obviously, that's very psychological, but it's also moral, ethical, legal. It has all kinds of... I mean, human behavior is fascinating. Right? Just is.

Kip Boyle: It is fascinating and it's non-deterministic. And the people who are typically attracted to working in cybersecurity, IT security, cyber risk management, this whole problem space, we're typically interested in the ones and zeros, right? We like the computers and building...

Jake Bernstein: Oh, yeah. I see what you're saying.

Kip Boyle: ... things and writing code and all that stuff. And one of the reasons why, and I think this is certainly true for me, and I've talked with a lot of other people, is that computers are deterministic, right? Once you learn how to use a piece of software, for example, when you press a button, it pretty much does what you expect all the time.

Jake Bernstein: Yeah.

Kip Boyle: Right?

Jake Bernstein: Well, I mean, that's about the strength.

Kip Boyle: What you put and what you get out is the same. It's predictable.

Jake Bernstein: Yeah. Well, it's ideal.

Kip Boyle: But people are non-deterministic.

Jake Bernstein: Software can't make a mistake because it only does what you tell it to do.

Kip Boyle: Yes.

Jake Bernstein: The second that software...

Kip Boyle: Unfailingly.

Jake Bernstein: Yes. The second... Now, of course, you may be thinking, "Of course, software can make a mistake." And I would say, "No, no, that's a bug that you have yet to find and fix." With generative AI, there becomes the possibility of non-deterministic computing. But let's not go there this episode.

So let me check on this. Let me check on this. You would think that the threat of sanctions would cause employees to think twice before violating a company's information security rules. Just like you would think that the threat of a speeding ticket would make everyone follow the speed limit. But if I'm understanding all this correctly, sanctions for wrongdoing are precisely what neutralization techniques are so effective at dismissing. If someone convinces themselves that violating a policy isn't wrong in their circumstance, why would they fear being apprehended or punished? So if sanctions don't work, what does and what can management do about all this?

Kip Boyle: Well, that's a great and very insightful question. I'm so glad you consulted the script...

Jake Bernstein: Read it from the script. Yes.

Kip Boyle: ... before you said anything. So because this idea of neutralization in this context has been around for so long, it turns out it's been studied quite extensively in university research. So I found a study that was done at the university, well, not the university, but at one of the universities in Finland. I can't actually say the name of the university and I don't want to butcher it, so I'm just going to say...

Jake Bernstein: Probably a good idea.

Kip Boyle: I'm just going to say there's a really prestigious university in Finland and they did this study because they were trying to figure out, well, how can we counteract neutralization so that people will be more likely to follow the rules in whatever circumstance they happen to be in? So this is what they did. So they gave security training to 87 employees of a large multinational company. And this was an experiment. So of the 87 employees, 21 received standard security training, just the same thing anybody would get. But the other 66 of those 87 also received training that specifically addressed neutralization techniques that the employees might engage in.

So as an example, there was a trainer who described how people commonly use a defense of necessity to justify choosing weak passwords because they believed that strong passwords are too onerous to use and the system didn't stop them from choosing a weak password. And then the trainer discussed why this isn't necessarily true and then showed them practical ways to choose passwords that were both strong and usable. And so that was an effective way to cut through the defense of necessity. What do you think about that?

Jake Bernstein: Interesting. I think that naming things gives us power over them. This is a very philosophical episode of the Cyber Risk Management Podcast, everybody. But I think simply pointing out even just once that example of that particular neutralization technique, I'm going to guess had a good effect. And since I don't want to steal your thunder, why don't you tell us the punchline here?

Kip Boyle: Yes, I will get to that. I thought I might provide another example. So another thing that they tried in this study, and I'll put the link to the study in the show notes. So if you'd like to go read it, it's in English and you shouldn't have any problems understanding it, assuming you haven't stopped listening already. So the trainer explained how people use the denial of injury technique to rationalize that, well, no harm is done using a weak password. But then the trainer showed them why that's not true, and they demonstrated how easily hackers can guess or brute force weak passwords and the damage that can be done when they're successful.

So what I love about this study is they took this criminology work from the 1950s on juvenile delinquents, probably kids that were throwing bricks through windows and spraying graffiti, and now we're applying it in the cybersecurity realm in the modern day. And I just think this is really cool. Okay, here's the punchline. So the employees in the group that received this special anti-neutralization training self-reported a substantially higher intention to comply with security policies in the future and a lower agreement with the neutralization techniques compared to those in the control group, and the differences that they measured right after the training held up three weeks later. So fascinating.

Jake Bernstein: It is fascinating. I would love to know... I mean, one of the difficulties in measuring cybersecurity training effectiveness overall is that you can't measure when something doesn't happen, and the same is true here. But it would be fascinating to know, if this was put into regular practice, whether over time people actually stopped ignoring the rules. I don't know if that's been done. It would be difficult and expensive and time-consuming to do an actual scientific study on that. But I think certainly logically, even based off these results, you would expect that this type of training would help. But it almost seems like one of those things where you don't need to have this whole big fancy training. You could also just send messages instead.

Kip Boyle: You can.

Jake Bernstein: What do you think about that?

Kip Boyle: Yeah. So it turns out that you don't have to bring people into a classroom and do all of this, right? They did it for the purposes of the study. But it turns out that you can get almost all the benefits of bringing them into a classroom by just sending them messages. And that could be email if they're reading them, or it could just be little things that are said at a team meeting or something like that. And as long as you're directly undermining specific neutralization techniques, you can actually gain a lot of the benefits.

So just here's a sample message. You could actually send out a message that says, "Even though people believe that sharing passwords can be justified under certain circumstances without any real consequences, adherence to this policy is important. Sharing of passwords should not be justified for any reason." And this is really fascinating to me because it's not very specific. It doesn't talk about a specific harm. It just says that there are real-world consequences. It doesn't call out anybody in particular. It doesn't even really recite what the actual policy is.

Jake Bernstein: Yeah. No, that's interesting. And I think its effectiveness is going to depend on the personalities of the people involved. Right?

Kip Boyle: Sure.

Jake Bernstein: There's always people who are the know it all, like me. There's always people who, frankly, you always have sociopaths, which just means someone who doesn't follow...

Kip Boyle: Feel guilt.

Jake Bernstein: ... the social rule. They don't feel guilt. Right?

Kip Boyle: Right. Yeah. Well, and then you have people who are boundary testers. I mean, these are people who are just compelled to test boundaries constantly. It's just their personality. I have one of these children. Well, maybe more than one. And then there's rule followers who they're more likely to respond to this kind of training. What I would like to know...

Jake Bernstein: I'd argue they don't need it because they don't break the rules in the first place.

Kip Boyle: Well, that's true, but I would submit that a rule follower might fall into some of these traps every now and then, because remember, the idea of these neutralization techniques is to maintain a self-image that you are rule-abiding.

Jake Bernstein: That's true. That's a very good point. That's very, very good.

Kip Boyle: I think it's probably less likely, less frequent. That would be my hypothesis. But what I would like to know is how many of these companies that we purchase cybersecurity awareness training from include this in their materials? And if they don't, why not?

Jake Bernstein: I don't have an answer for that.

Kip Boyle: I don't either.

Jake Bernstein: I don't have an answer for either question. I don't know if it's included, but...

Kip Boyle: I strongly suspect it's not. Because I've watched a lot of it over my career, and I strongly suspect that this sort of thing is not included, which is awful. Because if I was going to make a cybersecurity awareness company, I would like to think that I would go into the research and I would try to pull out proven techniques that work because it would make my product better. It'd make my product perform better.

Jake Bernstein: And I mean, it also depends on the market. Do people buy cybersecurity training services to just check the box? Or are they really trying to get their people trained on cybersecurity?

Kip Boyle: And here's the gap: between people who just want to check boxes for compliance and people who feel the need to really manage risk and check boxes for compliance. And certainly, I can't control that. But for those of us who'd like to do both, I wish more of this was in the trainings that we purchase anyway.

Jake Bernstein: So okay. So there are really two takeaways from this episode. One is that organizations need to understand that neutralization comes naturally to most people and that no amount of threatening and insisting will change that, because the whole point of neutralization is that it's immune to threats...

Kip Boyle: That's right.

Jake Bernstein: ... and insistence, right? That's kind of...

Kip Boyle: It's Teflon.

Jake Bernstein: It's Teflon, yes. And the second thing is that management can help employees to recognize and reject rationalizations by helping them to see the cybersecurity policies for the essential role that they play. And I would suggest that you do what that study did and use specific examples with the specific names of neutralization techniques because, like I said, naming things gives you power over them, and that's what the game is here.

Kip Boyle: Yep. Well, I think we've done a very good job of exploring this subject. And look at us, we haven't even hit 40 minutes yet, so can we wrap this up?

Jake Bernstein: I think we can.

Kip Boyle: All right.

Jake Bernstein: I will say before you do, and I'm going to try to keep it under 40 minutes here, that I think this episode just goes to show how multidisciplinary cybersecurity can be.

Kip Boyle: Yeah.

Jake Bernstein: Most of the time you just don't think about criminology, psychology, ethics, and morality as having this role to play in building a successful cyber risk management program. But I think we've done, I mean, perhaps we're both biased, but you've convinced me and I think hopefully we've convinced our viewers that in fact, our viewers, geez, our listeners, that that is in fact the case, that these things do matter. And this I think was a really good example of that.

Kip Boyle: Yeah. And it's practical, right?

Jake Bernstein: It is.

Kip Boyle: Once you understand it, there's usually practical things that you can do to turn this into a tool in your toolbox. And that's what I hope people who listen to the podcast get is a lot of practicality about some of this obscure stuff.

Jake Bernstein: Yep.

Kip Boyle: All right, that wraps up this episode of the Cyber Risk Management Podcast, and today we explored why employees keep ignoring workplace cybersecurity rules and what you can do about it. Thanks for being here. We'll see you next time.

Jake Bernstein: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at cr-map.com. Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is a 20-year information security expert and the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.