
EP 123: How to Really Reduce the Risk of People Falling for Phishing
About this episode
January 17, 2023
What can we learn from a recently released research report called “Phishing in Organizations: Findings from a Large-Scale and Long-Term Study”? Let’s find out with our guest, Jason Rebholz, the CISO of Corvus Insurance. Your hosts are Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.
Jason Rebholz prior guest appearance–https://cr-map.com/podcast/114/
“Some Workgroups Deserve More Protection Against Malware”–https://cr-map.com/podcast/108/
“How to Really Make Sure that Cybersecurity is Everyone’s Job” (pt 1 & 2)
https://cr-map.com/podcast/88/
https://cr-map.com/podcast/89/
Episode Transcript
Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at cr-map.com and klgates.com.
Jake Bernstein: So Kip, what are we going to talk about today on episode 123 of the Cyber Risk Management Podcast?
Kip Boyle: Hi, Jake. Today we're going to see if we can find a way to really reduce the risk of people falling for a phishing attack, which is a big, big deal. And this is such a big deal that-
Jake Bernstein: Oh, ban email. Let's just ban email. That'll do it.
Kip Boyle: We may end up there at some point in the future. Email has become quite a problem, but because this is such a big issue, we have a guest and it's a returning guest. We don't have a lot of returning guests, but Jason Rebholz is back and he's the CISO of Corvus Insurance. So Jason, we're happy to have you back on the podcast. And the last time you were here was on episode 114, which we published on September 13th, 2022. And I remember that we talked about how small and medium sized businesses can benefit from cyber insurance even if they don't buy a policy. That was kind of the upshot of the episode. And anyway, really great to have you back, Jason.
Jason Rebholz: Yeah, I appreciate you having me. And definitely that episode still stands true. Everyone should have cyber insurance, especially SMBs.
Kip Boyle: Absolutely agree. Yeah, I really appreciate what you did there, the value that you brought. And then for anybody who hasn't listened to that episode, Jason, would you mind just a short thumbnail sketch of your role at Corvus Insurance?
Jason Rebholz: Yeah, so I'm the CISO at Corvus Insurance, and we are a cyber insurance InsurTech, which allows us to bring technology into underwriting. And so I have a fairly unique role for a CISO where I spend part of my time on internal security, but also spend time supporting our policyholders and our underwriters in understanding security and supporting our product team in building out just a next-level scanning utility to understand risk.
Kip Boyle: Boy, those external scanners need all the help they can get. I'm so glad you're focused on them because I have to-
Jason Rebholz: There's a lot of opportunity there.
Jake Bernstein: Well, and I think too that it depends on how you use it, right? It's a tool. It's not a, what do we often say, Kip? There's no easy button.
Kip Boyle: It's not an easy button.
Jake Bernstein: Those scanners are not easy buttons.
Kip Boyle: But everybody wants easy buttons.
Jake Bernstein: Everybody always wants an easy button, which you just can't always get. But I think here that having an InsurTech, I think it makes a lot of sense because really what you're doing is you're trying to manage risk, your own risk, the underwriters. And I think it's interesting. It doesn't surprise me that this is a thing that's out there in the marketplace. It seems pretty smart. And I assume that it's a component, but it's not the entirety of the underwriting process.
Jason Rebholz: Exactly. And I think that's true for any external scanner as well. You're not going to build a house with just a hammer. You're also not going to assess your entire security program with just an external scanning tool. So for us, it gives us inputs to understand risk and it's really meant to predict the likelihood of an incident. And it draws in really interesting things and data points for us. We can tell that, for example, for policyholders that do not have a secure email gateway, they're twice as likely to have a business email compromise. And so that's the type of insight that we can bring into underwriting because we have the data available.
Kip Boyle: That's fantastic.
Jake Bernstein: I mean, isn't that the essence of underwriting is having the data and being able to say things like that?
Jason Rebholz: Exactly.
Jake Bernstein: Yeah. That's amazing.
Kip Boyle: Jason, is that a published statistic that you just shared with us about not having a secure email gateway, you're twice as likely? That's great.
Jason Rebholz: Yep. It is. Yeah, we have a risk insights report that we release and there's tons and tons of little nuggets in there.
Kip Boyle: Oh, that's fantastic. Excellent. Okay, so let's get on with the episode here. So what we're going to do, members of our audience, I just want to set the stage here, is there was a report that was released in the last year. It was a research report and it seemed to me that it was very, very well done, had a good methodology and I thought it was very trustworthy. It's called "Phishing in Organizations: Findings from a Large-Scale and Long-Term Study." And this came out from the Department of Computer Science at ETH Zurich, which is a public research university in Switzerland. So we can trust it because-
Jake Bernstein: Just to be clear, it's not related to Ethereum. This is not a blockchain thing. It's not a .eth, right? This is actually probably one of the older institutions in the world.
Kip Boyle: Oh yeah. Oh yeah. They've been open for well over 100 years from what I understand. But let me just summarize what they did, what was their research? And then what I would like to do is just kind of unpack this paper for everybody and see what we can learn from actual research. And as far as I know, this is one of the biggest and most rigorous research projects on the topic. And so that's why I think it's really worthy. So the experiment ran for 15 months, and they had more than 14,000 study participants. They focused on a very large organization, and the 14,000 participants were actually employees of the company that they studied.
And these 14,000 people received different simulated phishing emails in their normal working context. The other thing they did is they deployed a reporting button so that the company's email client would allow any participant to report a suspicious email just by clicking a button. And they measured the click rates for phishing emails. They measured any dangerous actions, like if somebody fell for the phish and submitted their credentials, and they also measured the rate at which suspicious emails were reported. And before we start getting into it, I want to read the tagline of this research report. They said, "The results of this landmark experiment call into question the expanding corporate spend on user education campaigns that combine simulated phishing attacks with training videos and mandatory quizzes." So they're kind of saying the emperor has no clothes, right? Anybody?
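To make those three measurements concrete, here is a minimal sketch, in Python, of how click, dangerous-action, and reporting rates might be computed from simulation event logs. This is illustrative only, not code from the paper, and the event field names are hypothetical.

# Minimal sketch of the three metrics the study tracked.
# Event fields are hypothetical, not taken from the paper.
def phishing_metrics(events, participants):
    # events: list of dicts like {"user": "u1", "action": "clicked"},
    # where action is one of: clicked, submitted_credentials,
    # enabled_macro, reported.
    clicked = {e["user"] for e in events if e["action"] == "clicked"}
    dangerous = {e["user"] for e in events
                 if e["action"] in ("submitted_credentials", "enabled_macro")}
    reported = {e["user"] for e in events if e["action"] == "reported"}
    return {
        "click_rate": len(clicked) / participants,
        "dangerous_action_rate": len(dangerous) / participants,
        "reporting_rate": len(reported) / participants,
    }

sample = [
    {"user": "u1", "action": "clicked"},
    {"user": "u1", "action": "submitted_credentials"},
    {"user": "u2", "action": "reported"},
    {"user": "u3", "action": "clicked"},
]
print(phishing_metrics(sample, participants=4))
# {'click_rate': 0.5, 'dangerous_action_rate': 0.25, 'reporting_rate': 0.25}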
Jake Bernstein: Why am I not surprised, I guess, like you were? Maybe I am a little surprised. And I think it really seems to be an issue that... It's the do something mentality, right, Kip? We've talked about that before.
Kip Boyle: Which got us here.
Jake Bernstein: It did, it did. And I think it's good to study these things rigorously. So I suppose what we should do is really just dig in.
Kip Boyle: Well, I want to hear what Jason thinks about this. Jason, are you surprised by their tagline?
Jason Rebholz: I wouldn't say I'm fully surprised. I think the challenge that we have in security is that we just take a lot of things at face value because somebody told us it's what we're supposed to do. And I love that somebody has taken a step back and challenged the status quo on what have become these de facto recommendations for phishing awareness education. And so, I just like it from that component. There are certainly nuggets in here that I think are super surprising. Others are just like, okay, great, tell me something I didn't know. But overall, this is the type of mentality that we have to have in security: we need to constantly challenge ourselves and say, is what we think actually accurate? Because we're going to get surprised from time to time.
Kip Boyle: And I agree with you. There are going to be some surprises in here and there are going to be some non-surprises in here. So the first thing I want to look at is the report's observations, which I think are really insightful. And I was hoping, Jake, that you could talk about the part of the report that covered who is most susceptible to falling for a phishing attack.
Jake Bernstein: So the question here is, do age and computer skill correlate with phishing susceptibility? And unsurprisingly, they do. For example, older, 50 plus, and younger, 18 to 19, darn kids, employees are more at risk, and people with lower computer skills are more at risk. And that, I think, is intuitive. The age thing is not necessarily intuitive, but the lower computer skills, I really do think, if you don't know how to check the from email address, it makes it a lot harder to spot a phish, particularly modern phishes, which are generally far less crude than they used to be. Nor does it surprise me that people in the 20 to 29 year old age range were the least susceptible. I wouldn't be surprised if the 30 to 49 range is pretty close as well. Another kind of, I think, intuitive finding is that gender plays no role here really. It doesn't correlate, I should say, with phishing susceptibility. But one thing that I think is a good question is what if I just keep hammering at you? What if, in other words-
Kip Boyle: In the name of testing.
Jake Bernstein: In the name of testing, yeah, how many times does it take before I get you? And what the report found was that many users will eventually fall for a phishing trick if continuously exposed. And that reminds me of what's being called MFA fatigue, right? It's not the exact same thing, but it is somewhat similar. And I think that is-
Kip Boyle: It's a human factors thing.
Jake Bernstein: It's a human factors thing, yeah. Unlike computers, which will do the same thing given the same input and the same controls basically every time unless you mess with it.
Kip Boyle: Yeah. People are non-deterministic.
Jake Bernstein: Humans will get bored. They will get tired, they will get frustrated, they just might have a bad day. Computers don't have bad days, all things considered.
Jason Rebholz: Tell that to Facebook when they went down.
Jake Bernstein: On a deep, technical level, computers are incapable of having bad days. Whether or not bad things happen to computers is totally separate. Now, I think what's interesting too is that the type of computer use is actually more predictive of phishing vulnerability than the amount of computer use. And this makes a lot of sense, right? The most vulnerable employees are those who are using computers daily for repetitive tasks with specialized software only, rather than people like us who are probably using them for everything, every day.
Kip Boyle: Yeah, many and various tasks.
Jake Bernstein: Or obviously for people who don't need to use computers in their day-to-day job at all.
Kip Boyle: You know who I think about when you talk about the profile of the person most likely to fall for this? I think about the people who fall for business email compromises. I think about accounts payable people, because they fit this profile. I think about salespeople because they fit this profile. Who else? I think about, gosh, somebody working in ERP.
Jake Bernstein: Customer service representatives are probably the quintessential example of this. You go, you sit down at a computer and you use it for a very limited purpose the whole day and it's constant, right?
Kip Boyle: And we did a podcast episode on this actually. We said some people deserve more protection because the nature of their job compels them to open email and email attachments that they don't expect to receive. So I think that's the nature, the very nature of their job, also, I think makes them susceptible. Jason-
Jason Rebholz: That's one of the areas that this study didn't go too far into and I hope that another study does, right? Because we talk about role-specific training. You have these high-risk groups where a phishing email is only a small standard deviation away from their day-to-day, compared to somebody who doesn't deal with HR or finance and gets an HR or finance email, where it's like, great, why would I even pay attention to this? And so I think you really have to hone in on that. I would love to see a follow-up study on the real effectiveness of that role-specific training.
Kip Boyle: Yeah. And I don't even think, is there role-specific training out there, generally available? I don't know. Have you seen any, Jason?
Jason Rebholz: No. Everything I've seen has been customized. A lot of the solutions that are out there today are just one-size-fits-all. And if you want role-specific content, you can potentially build it through a platform. But given the level of effort to do that, why are you paying for the platform in the first place?
Kip Boyle: Episode 108, by the way, if you want to go listen to the episode where we talked about how some work groups deserve more protection against phishing and malware than others. That would be episode 108, I'll put that in the show notes, because we have show notes now. Jake badgered me into show notes.
Jake Bernstein: I badgered him until we did it.
Kip Boyle: So we have them now.
Jake Bernstein: And I think what we're really saying here is that the type of computer use is more predictive of phishing vulnerability than the amount of computer use. So it doesn't automatically follow that someone who's on the computer all day every day is more vulnerable to phishing than someone who's not at all. In fact, it may well be the case that someone who only occasionally uses it for a specific purpose is quite vulnerable.
Kip Boyle: Yeah, well, I think that's interesting because now we can really better focus on a profile of a person who would match susceptibility based on data, not based on human bias, about who we think is susceptible. Now we actually have some really great data that actually tells us who is susceptible. That's one of the things that I really like about this because I don't want to profile people based on just suspicions or my own inherent bias, whether it's conscious or unconscious. I think that gets us in trouble. I think it undermines our work if we fall into that trap. Are we ready to go onto the next section?
Jake Bernstein: Let's do it.
Kip Boyle: Yeah. Because there's this other part of the report that I think is worth looking at, before we get to the recommendations, which is how does an organization's vulnerability to phishing evolve over time? And Jason, I was hoping you could share what you think is important from the report on that?
Jason Rebholz: Yeah. And so, Jake kind of alluded to this before. What happens over time? What the study found was that a third of employees were vulnerable over that 15-month period. Specifically, 32% of users clicked on at least one link over that duration. Now, you go even deeper on that and ask, what's the repeat susceptibility? About 30% of participants clicked on a link in two or more phishing emails. And what I really like about what they did was, it's not just about clicking or opening that email. It's, is there a dangerous action tied to that? That could be either giving up your credentials or enabling a macro. And about 24% of participants performed a dangerous action over that 15-month period, and that happened on two or more phishes.
So, I always like to say that security is an endless game of survival, and for me, this is the stat that supports that, because it's not just a one-and-done scenario here. Attackers aren't going to send just one campaign. They're going to relentlessly attack your users. And Jake, when you talked about MFA fatigue, that was the first thing I was thinking of as well. How do we constantly stay in front of these users, because it's just a matter of time before somebody fails?
Kip Boyle: And the economics are in the favor of the attacker, right? Because the costs, the marginal cost of sending the next phish to try to get me, is zero for all practical purposes. So they can take a million shots on goal and I only have to screw up once and the puck goes whizzing by, knocking a tooth out of my mouth on the way, I suppose.
Jason Rebholz: And for me, this is one of the main reasons why I would never advocate for a phishing-awareness-only security program if you're trying to secure email. We have to operate under the assumption that a human is going to make a mistake at some point in time. So our job as security practitioners is to provide a backstop for the human, so when that eventual miss happens, we've got something there that can give them a little bit of air cover and protect them from something truly bad happening.
Kip Boyle: Right? Yes. Thank you, Jason. I love that you said that because that's been my observation as well. And what I tell my customers is, nobody has ever achieved a 0% phishing rate. Nobody that I know of has ever been able to do that. It's good to go from 15, 30, 20%, whatever you started at. It's good to get it down. But I think 3% is about as low as I've ever seen it go, and about what anybody can reasonably expect to get to. So you still need a backstop. Absolutely need a backstop. I would like to share what I think is the right backstop for this, which is application control, which used to be called application whitelisting, where you have a computer, especially for the people who are most susceptible, that can only run a handful of programs on a list, and everything else by default is denied.
I really like this approach because the people that we said were susceptible, the ones using vertical applications, don't need to run everything. They don't need to be the administrator on their computer. They need to run a very crisply defined shortlist of applications and that's it. And if we can get this right, and this is new, this is emergent, I'm not saying it's right for everybody, but I really think this is where we need to go: turn these general purpose computers into appliances that are tailored for the role wherever we can. We can't do it with every role out there, but I really think that's where we should be heading. I don't know. Jake, Jason, I mean, am I smoking dope on this one?
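As a rough illustration of the default-deny idea Kip describes, here is a minimal sketch in Python. It is illustrative only: real deployments enforce this at the operating-system level (for example, Windows AppLocker or WDAC), and the allowlist entries below are placeholders, not real binaries.

# Minimal sketch of default-deny application control (allowlisting).
# Production systems enforce this in the OS, not in user-space code;
# the hashes below are placeholders for hypothetical approved apps.
import hashlib
from pathlib import Path

# Everything NOT on this list is denied by default.
ALLOWED_SHA256 = {
    "3f79bb7b435b05321651daefd374cd21placeholder1",  # erp_client.exe (hypothetical)
    "9c56cc51b374c3ba189210d5b6d4bf57placeholder2",  # approved_browser.exe (hypothetical)
}

def sha256_of(path: Path) -> str:
    # Hash the binary in chunks so large files don't exhaust memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def may_execute(path: Path) -> bool:
    # Default deny: only binaries whose hash is on the allowlist may run.
    return sha256_of(path) in ALLOWED_SHA256

The design choice that matters is the default: instead of blocking known-bad programs, the machine refuses everything that isn't explicitly approved, which is what turns a general purpose computer into something closer to an appliance.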
Jake Bernstein: No one's ever gotten a 0% phish rate. Obviously, you can never rely on, well, you should never rely on any one defense. It's all about defense in depth and having multiple, what do they call them? The castle walls. You've got the moat and then you've got the bailey and then you've got the keep. You can't just rely on one thing, right? And obviously, phishing training by itself, it's proven, is just not going to work.
Jason Rebholz: I did a survey on LinkedIn a while back where I asked the question of, if your security technology quadrupled in price next year, how would you prioritize your spending? And the thing that shocked me the most from that is the number of people that said I would focus on phishing awareness training. And there were some users that were saying board-level awareness. And that really struck me, because we place such a heavy emphasis on the training aspect, but we tend to forget, just as Kip said, you're never going to achieve 0% on these types of tests. And so, you can split it however you want, but you really have to have that balance between the two.
Jake Bernstein: I think one of the things that's so interesting about that survey is the question itself: if the cost of your cybersecurity program quadrupled, what would you prioritize? And a lot of people said phishing training. But why wouldn't we prioritize what happens after someone clicks? Because we know that you cannot prevent everybody from clicking. For training alone to be truly effective, hypothetically, you would have to be able to stop 100% of phishes, and that's just not feasible.
Kip Boyle: Exactly.
Jake Bernstein: I know we've used the phrase kill chain or attack chain and things like that on this podcast and security folks are familiar with that, but it really is the illustration of the importance of defense in depth. If you focus too much on phishing prevention via training, it's the old saying, the defender has to be right all the time, the attacker only has to be right once. Now, the report also found that warnings on top of suspicious emails can be effective, but that more detailed warnings are not necessarily more effective than simple ones. That's an interesting finding. I suppose that's true.
Kip Boyle: Well, they've proven it.
Jake Bernstein: Yeah. Well, I mean, I suppose, I guess I'm saying I suppose that makes sense, that if you go into great detail about why a phish was suspicious or something, people, unless you're immersed in this world, it's probably going to just be Greek to you, right? It's all Greek and that's just not going to be effective.
Jason Rebholz: Well, and it's attention span too. No one's going to read a full thesis on why this particular thing is bad. You can even just think of the banners for external emails coming in. They're fairly benign, just kind of simple. But when you ask users, they're like, yeah, great, that makes a lot of sense. I can't imagine having a full paragraph in there laying out the theory behind why external emails can be dangerous. Simplicity wins here.
Jake Bernstein: You just gave me an interesting idea, or a concern really, about those types of external email banners, which is that we're highly adaptive creatures, and all that stuff eventually fades into the background. It is the essence of the MFA fatigue attack style, but it's almost worse. And the problem would be that someone would click a phish even though it had the banner, and then someone would be like, well, didn't you see the banner? And the truthful answer is, yeah, I saw it. I just didn't care, because three quarters of my email has that banner and it doesn't mean anything. And that's a real challenge for some of this technology and the training that we do.
Jason Rebholz: Yeah, I mean it's just like dirty dishes piling up in the sink, right? Somehow nobody manages to notice this pile of dishes that needs to get cleaned or put in the dishwasher; you just become immune to it. But I think the future of phishing security or awareness is going to be just-in-time information. How do you deliver something that is going to stand out to a user, doesn't get lost in the shuffle, and is there just in time for them to take an action? Because the attention spans of humans are just decreasing over time. And because of just this media-rich world that we live in-
Jake Bernstein: It's a separate problem entirely.
Kip Boyle: Right. Exactly. Yeah. Did we talk about the fact that voluntary embedded training actually makes things worse?
Jake Bernstein: No, but that's interesting.
Kip Boyle: I think this is one of the most useful outcomes of the research: if somebody clicks on a test phish and they're immediately redirected to a dedicated webpage that says, hey, you just fell for a simulated attack and so we want you to read this information or maybe take this little training before we're going to allow you to continue to work, that actually makes click and dangerous-action rates go higher, which is so counterintuitive to what we thought was going to happen. And I was really surprised by this. But Jason, were you surprised by that?
Jason Rebholz: This is the finding that surprised me the most in the entire report. This is like, has my entire life been a lie type of finding? I'm struggling to really understand why. Is it because it makes it seem like a game? Does it take away some of the risk? Is the training itself ineffective? I would really love to dig deeper into this to understand the why behind it, because it feels like embedded training should work. So one, does this really happen? And two, do we need to tweak the type of training, or do we just need to scrap it altogether? There are a lot of things that are left unanswered for me.
Kip Boyle: Yeah. And the research doesn't say exactly how we should change it. But my recollection is they said one reason why this happens is people do start to treat it as a game, because they want to see all the training. In other words, it's a game for them to collect all the training pages that get shown after you click on a phish. And then also people are saying, well, these are fake phish, so it doesn't matter if I click on them. They perceive zero risk in clicking on a fake phish. They want to see the training page because that's kind of the game. But then they end up clicking on a real phish because they think it's a training phish. That was my takeaway.
Jake Bernstein: That's a bad design of the embedded training in that concept.
Kip Boyle: But who knew? Right? It's an unintended consequence.
Jake Bernstein: It is, it is.
Jason Rebholz: I would also love to meet the users who actually want to see all the training because that's just mind boggling to me.
Kip Boyle: Well, I want to see the 1% of the people that click on everything you send to them no matter what. Because there are people like that. Even without complicating the matter, there's just people out there that will just click on everything, all the time, no matter what. No matter how subtle or not subtle the thing is, they will click on it.
Jake Bernstein: Clicking is fun.
Kip Boyle: Clicking is fun. Let's click some more. Okay. So the last thing that they said here, which I thought was really good in terms of phishing warnings and training was that crowdsourcing phish detection was effective and feasible. And this goes back to that button that they put on the email client that a person could click if they found that they were looking at a phish in their inbox. And so, that's good because I think people intuitively expect that that would help, and it turns out the research says it does.
Jake Bernstein: That's good. That's good.
Kip Boyle: Yeah.
Jason Rebholz: Well, and one thing that's interesting for me is I think there's multiple forms of crowdsourcing here. I love that button. It's something that we try to encourage all of our users to use and they do. The caveat is sometimes it doesn't work and so you have to have backups, whether it's forwarding the email to an email distro for the security team, something along those lines. But I also see just a lot of traction in Slack or Teams where you've got users saying, hey, did anyone else receive this weird email? So when you start seeing that, it's really interesting to see just the conversations that can happen. I also think back to the Uber breach where the attacker posted it in the Slack channel. Everyone thought it was a joke, but then it's just like, oh, this is actually real. And I'm sure that got the security team moving really, really quick when that was happening.
Kip Boyle: Yeah, I don't remember the report saying that the only form of crowdsourcing that was effective and feasible was that button, but that's what they tested. So yeah, that's a great point, Jason. Are we ready to get onto the recommendations? Because they did make some very specific recommendations.
Jake Bernstein: Yeah. Let's do that and I'll start here. So the first one is kind of obvious. Adopt phishing prevention tools, including things like simple embedded warnings. I've experienced them. They're not bad, and they are effective. But most importantly, adopt the ones that have been extensively studied and where the available literature overwhelmingly supports their effectiveness. In other words, if you don't need to guess about what works and what doesn't, that's for the best, right? Use what works. And then another one, which is what we were just talking about: embedded phish training as commonly deployed in the industry today is not effective and can in fact have negative side effects, such as actually making people more susceptible to phishing. So don't necessarily do the thing where you immediately redirect people to a webpage. Don't make it a game. Don't make it a low-stakes game that might tickle someone's sense of exploration and curiosity. That's probably the worst thing you can do in this type of situation.
Kip Boyle: Yep. Yeah, definitely. And I think that's going to be a problem, because don't the major commercial phishing training platforms include this, and in fact, kind of trumpet this as a brand new awesome feature that you can't live without? I think there's this train wreck that's about to happen here.
Jason Rebholz: I imagine a big battle is going to happen here, because yeah, that's such a key feature. And it's a differentiator for a lot of the upper-tier ones, where it's just like, hey, we have this and this is what everyone thought was industry best practice. But for me, those recommendations, Jake, that you just read, they scream: question how we're doing things, and experiment. There's nothing stopping any of us from going and trying this out in our own organizations. I'm going to go back and test some things in our quarterly phishing test now and see what we can do to tweak this and get to the results that we want quicker.
Kip Boyle: And when you say that, Jason, are you saying for Corvus on an internal basis so that your folks don't get phished? And then are you also saying for insurance? Because you are very interested in what actually reduces the risk of a claim, right?
Jason Rebholz: Absolutely. Yeah. This is all across the board, because at the end of the day, phishing is one of the most common attacks that we see. In terms of frequency, business email compromise is actually far ahead of ransomware. Ransomware gets all the news because it's a high-severity incident, meaning there are big costs associated with it, but on a frequency basis, it's phishing attacks. And so, for us, how do we help our policyholders and ourselves figure out how to reduce that and just help everybody out along the way?
Kip Boyle: Right, right, right. Absolutely. Well, so there are three more recommendations that the research report makes. Why don't you take the next two, Jason?
Jason Rebholz: Yeah, so the first one we were talking about quite a bit here recently: crowdsourced phishing detection is effective and practical within a single organization. So, the phishing button for reporting, especially when you can tie that into auto-remediation. When you reach a threshold of users reporting the same email, get it out of everyone's mailbox. You're spreading some of that security expertise and forward-thinking scouting to the rest of the organization. So those are the things. You've got to get that in place.
Kip Boyle: And that implies that you have the capability to quickly identify a phish that's in an inbox that hasn't been looked at yet and potentially in hundreds or thousands of inboxes, right?
Jason Rebholz: Exactly. Yeah. And so part of this is, how do you have this review process? Do you have a team that's reviewing it or are you outsourcing that? There's no wrong answer there. But have somebody looking at this and be in a position to pull the email back. Because you know what? At the end of the day, you can always put that email back into a user's mailbox, right? But the time that it's sitting in there, just waiting for somebody to click on it, that's your danger zone. So try to reduce that as much as possible.
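As a rough sketch of the threshold-based auto-remediation Jason describes, the logic might look like the Python below. This is illustrative only: quarantine_everywhere is a hypothetical stand-in for whatever search-and-quarantine capability your mail platform exposes, and the threshold value is arbitrary.

# Minimal sketch of threshold-based auto-remediation for reported phish.
# quarantine_everywhere() is a hypothetical stand-in for a real mail
# platform API (e.g., a search-and-quarantine across all mailboxes).
from collections import defaultdict

REPORT_THRESHOLD = 3           # distinct reporters before we act
reports = defaultdict(set)     # campaign fingerprint -> reporting users

def fingerprint(msg: dict) -> tuple:
    # Group copies of the same campaign; real systems would also hash
    # message bodies, URLs, and attachments.
    return (msg["sender"], msg["subject"])

def quarantine_everywhere(key: tuple) -> None:
    # Placeholder: a real implementation calls the mail API and alerts
    # the security team; a benign email can always be restored later.
    print(f"Quarantining campaign {key} from all mailboxes for review")

def handle_report(msg: dict, reporting_user: str) -> None:
    key = fingerprint(msg)
    reports[key].add(reporting_user)
    if len(reports[key]) == REPORT_THRESHOLD:
        quarantine_everywhere(key)

# Example: three users report the same suspicious email.
phish = {"sender": "ceo@evil.example", "subject": "Urgent wire transfer"}
for user in ("alice", "bob", "carol"):
    handle_report(phish, user)

Note the asymmetry Jason points out: quarantining a benign email is a recoverable mistake, while a live phish sitting in hundreds of inboxes is not, so it's reasonable to act on a fairly low threshold and review afterward.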
Kip Boyle: Got it. Okay. What else?
Jason Rebholz: And then the next recommendation was to be prepared to help repeat clickers build better habits. How can you work with them to fix that bad habit? This is really about taking the time to identify who the repeat offenders are and working with them to replace the bad habit: don't click on the links. I think this is where it could be more training, or it could be more one-on-one time. I know a lot of people will try to loop in managers for this. I'm personally not a fan of the scare tactics, but when you have those repeat offenders, you have to take the time to work with them, because they're a weak link, and that could take down the whole organization.
Kip Boyle: Yeah, absolutely. What I advise my customers to do is to enter people who repeatedly click on phishing links into what most organizations call a progressive disciplinary system, where it starts as a verbal warning, then you get a written warning, and it escalates from there. So the more times you violate a particular policy or SOP or something like that, the more serious the consequences get. And eventually you can get fired, right, if you are warned several times and you just don't correct your behavior. So that's what I encourage our customers to do: just treat it like any other violation. You showed up to work drunk or you were insubordinate to your supervisor. Really, I think if you treat it like any violation that would justify going into some kind of a disciplinary process, then it normalizes that this is important. It's not just some weird squirrely IT thing that doesn't really matter. I don't know. That's what I think.
Jason Rebholz: Yeah, I like that. The only thing I would add is you really have to give that support, right? Because maybe the training that you have for the masses isn't resonating with that individual. So I would look introspectively and say, what is it about this training that is not resonating with the user, and how can I support them? Give them that opportunity, and if they continue down that path, then yes, see that disciplinary process all the way through. Because at some point, if it's not clicking, there's a liability for the company.
Kip Boyle: Oh, absolutely. Yeah. Good stuff. Okay, so there's one final recommendation in the report I want to cover before we wrap the episode up, and I like this one. It says, remember that phishing testing is more of a catalyst for culture shift than a goal unto itself. I really appreciate that because it reinforces what we've already talked about. Jason, you talked about this very eloquently when you said phishing training has to have a backstop, because we can't rely on it as a single line of defense. And I think this fifth recommendation really touches on that. And I want to call out that Jake and I actually talked about this extensively.
How do you create a culture of cybersecurity, of reasonable cybersecurity? We did that back in episodes 88 and 89. It was actually a two-parter called How to Really Make Sure That Cybersecurity is Everyone's Job. We released those episodes in September of 2021 and yes, I will put links to those episodes in the show notes. Jake and I really did that as a result of what we read in the 2021 Verizon Data Breach Investigations Report. And anyway, I was really happy to do those episodes and I think they make a great companion to what this fifth recommendation is all about. So, all right, does anybody have a comment about this fifth recommendation as we bring the episode to a close?
Jake Bernstein: I think it's really important to remember that, as we often say, everyone is a foot soldier in the cyber wars. Kip, that's your phrase. And a culture shift is ultimately more valuable than almost any trinket or doodad or blinky light box you can find.
Kip Boyle: Yeah, definitely. Jason, any thoughts on this one?
Jason Rebholz: Yeah, I would just try to further define culture here, right? I think for our sake in security, that culture is the habits of that user and that whole community. And so how can you foster better security habits that everybody embraces? And that's how you can really start shifting an entire organization's security program.
Kip Boyle: Yeah. Well, and that's what we talk about in those episodes, 88 and 89. We talk about in-role expectations and we talk about extra-role expectations. So if you're an accounts payable person, what's unique to you in that role? If you're a salesperson, what's unique to you? Those are the in-role expectations. And then the extra-role behaviors are, hey, no matter what your role is, if you hear somebody struggling with a password manager, help them. Just say, I can see you're struggling with that. Can I help you? I have it figured out, I'm happy to share with you. Those are the extra-role ideas. And anyway, so please, please go listen to those episodes, everybody. I think it's going to really help give you a vocabulary for what Jason is saying, which is, what exactly do we mean when we say we want a culture shift?
How do we observe that? What should we be asking people to do? And again, I think that those episodes really do a good job of unpacking that. We actually base it on another piece of research that the DBIR pointed out to us. Okay, well, I think that's more than enough for everybody on this topic today. Really appreciate you being here, Jason. And if listeners want to interact with you, find out more about the work that you're doing, how would you like them to do that? Where would they go to find out more?
Jason Rebholz: Yeah, thanks for having me. And if you want to learn more, follow me on LinkedIn and you can check out my YouTube channel at Teach Me Cyber.
Kip Boyle: Excellent. Yeah, I've been watching the things that you've been publishing there and really enjoying that. So yeah, everybody, if you want to get smarter on cyber or if you know somebody who wants to get smarter on cyber, you definitely need to go to Jason's YouTube channel and follow him there. Any last words, Jake?
Jake Bernstein: Not today.
Kip Boyle: Oh my goodness. Case closed. The prosecution rests. All those tropes. And that wraps up this episode of the Cyber Risk Management Podcast. And today, we tried to see if we can find a way to really reduce the risk of people falling for phishing attacks. We did that with our guest, Jason Rebholz, and he's the CISO of Corvus Insurance. Thanks so much everybody for listening, and we'll see you next time.
Jake Bernstein: See you next time.
Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at cr-map.com. Thanks for tuning in. See you next time.
YOUR HOST:
Kip Boyle
Cyber Risk Opportunities
Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).
YOUR CO-HOST:
Jake Bernstein
K&L Gates LLP
Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.