EPISODE 126
Due Diligence as a Risk Management Approach


About this episode

February 28, 2023

Can you “demonstrate due diligence to a defensible standard of care” as your risk management approach? This would replace “red/yellow/green” approaches or advanced statistics. Let’s find out with our guest, Karen Worstell, who is a “Senior Cybersecurity Strategist” and a “CxO Security Advisor” with VMware. Your hosts are Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.

If you want to learn more about DoCRA (the Duty of Care Risk Analysis standard), check out our previous episode — https://cr-map.com/59

“Risk-Based Security is the Emperor’s New Clothes”
https://taosecurity.blogspot.com/2006/06/risk-based-security-is-emperors-new.html


Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at cr-map.com and klgates.com.

Jake: So Kip, what are we going to talk about today on episode 126 of the Cyber Risk Management Podcast?

Kip: Hey, Jake. Today we're going to explore this idea, which is actually pretty... It's been around for a while, but it's also controversial. I don't think it's been resolved yet. But the idea is about demonstrating due diligence to a defensible standard of care as a risk management approach. And so this is going to be really interesting, because we've talked about due care before, but now we're going to talk about it in a little bit of a different way, and we're going to do that with our guest Karen Worstell. And Karen is a senior cybersecurity strategist, and she's also a CxO security advisor, and she works at VMware, that little software company.

Karen: inaudible

Kip: And I should tell you right now that Karen and I used to work together at Stanford Research Institute in Menlo Park, California, but that was back in the day. But my gosh, that was so much fun. It was one of the best jobs I think I ever had, and I learned so much. It was so formative to how I do my work these days. So without any further ado, Karen, hi. Welcome to the podcast. So glad that you're our guest.

Karen: I am so thrilled to be here. I've looked forward to this for some time, so I'm glad we're finally getting to have this conversation.

Kip: Yeah. It's wonderful. So I just gave you a little introduction, but I would love for you to tell everybody more, because you've been doing this work for a long time now, and you have just such a depth of experience and capability, and I would just love for you to share a little bit more about who you are and just anything you want to say.

Karen: Okay. Thanks. I have been doing it for a while, long enough to figure out a whole bunch of things that don't work, and a few things that do work amazingly well. And so that's part of why I love this topic so much, because as I've been a serial CISO for multiple large brands, and served as you know, Kip, because we worked together when we were at SRI, in the consulting arena, working for some of the top brands around the world, that was just the most amazing experience. And before I came to SRI, I did cybersecurity research and engineering. And so I have had an amazing opportunity to do a lot of different aspects of this industry and see what other people are doing, and so that's kind of, I guess, why I have opinions.

Kip: Well, and...

Karen: Especially why I like to share them.

Kip: Yeah. And I would say they're well-informed opinions. I think people could accuse you of having opinions, but I don't think they're empty-headed ones, not by any stretch of the imagination. Well-informed. So now you're at VMware, and would you just tell us a little bit about this role that you're currently in? What's it about?

Karen: So as a senior cybersecurity strategist at VMware, I wear a few hats. I get to obviously learn a lot from the product team and roadmaps and product development and technology. I work on customer advisory boards, so I get to hear from customers. And I do a lot of going out with the inaudible teams and speaking one-on-one with customers. Plus I get to do podcasts and events and so forth. We have our own podcast that we do as a live stream on LinkedIn once a month, called Ask the Howlers. So it's pretty much trying to share cybersecurity thought leadership, I guess, very similar to what we did at SRI.

Kip: Yeah. But also stay in touch with what's actually happening in the market space, which I think is wonderful to be able to talk to people, practitioners who are actually out there trying to move the needle on this stuff, and just see, what are people struggling with, what are they succeeding at? I love that perspective, to be able to have that. Sounds like that's what you've got right now. Is that right?

Karen: It is. And I think in looking back to our shared roles that we had when we were back at SRI, it's very similar in that having the customer inform you about what's top of mind for them and what are they struggling with and what's going well for them is everything.

Kip: Yeah. It's so powerful. And then we did that in the I-4 organization, which was just so wonderful. By the way, Jake, you and Karen I think have something in common in your backgrounds, because Karen as an undergrad, weren't you a biology major in college?

Karen: I was. Actually, I got a double degree in biology and chemistry.

Kip: Right. Okay. Well, Jake worked in a hard science area, right, Jake? Remind us.

Jake: I did, yes. I also have a double degree, but mine's in philosophy and molecular biology, so pretty similar.

Karen: Oh, wow. An ethicist.

Jake: Yes. I'm listening to this and I'm really curious where we're going to take this conversation, because things like due diligence and defensible standard of care, those sound really legal to me. They become really... Those are really legal concepts. You hear that around negligence, liability. And I don't know that half an hour or 45 minutes is going to be nearly enough to cover this topic, but I want to kick off that part of the discussion by asking, what do we mean when we say demonstrating due diligence? And maybe let's just start with that, because I think due diligence is a term that gets tossed around an awful lot, and I know lawyers know what it means, but what about... For everyone else who's not a lawyer, let's just quickly talk about, what does that mean?

Karen: And you know what? I love that we're doing this together, because if there's something that you think I should tweak about the way that I talk about this, I really want to hear about it from you. So I developed this partly when I was working with Donn Parker at SRI, and that was the germ of it. We can talk a little bit about how that happened. But basically the way that I use the terminology is that duty of care is part of it. Duty of care means that there has to be an understanding of the impact that I have on another party, and I need to exercise a reasonable level of care, and that is my due diligence, before I act in a way that could impact them adversely.

And so it's really important to us to think about that now in the cybersecurity arena, because third party risk is such a big problem. It's a big problem now because we didn't pay attention to it sooner. And it's kind of a freight train that is running away, and there isn't really any good solution for it. So I do think it will start to become more of a legal issue. So that's my short term definition. And I'll tell you, while the germ of it happened in the '90s when we were at Stanford Research Institute, I developed it a lot further when I was working as a chief information security officer. Because I realized early that I could be deposed by hostile counsel. We actually used to do a session that was called welcome to your deposition.

Jake: Excellent. Everybody should do that in a position of authority, who might get deposed.

Karen: Yes, they should. It's a gigantic eye-opener, I think you'd agree.

Jake: I was going to inaudible it's rather eye-opening. Yes.

Karen: But my imaginary scenario and use case was, if I'm being deposed by hostile counsel after a breach that exposed millions of records, what I wanted to be able to do was to say, "I can tell you that the controls that we have in place are the appropriate controls for our business and for our constituents, and that I can prove to you that that is the case, and I can demonstrate to you that they are in place and operational 100% of the time. Next question." That's my goal. I like to win those arguments.

Jake: Yeah. inaudible

Karen: And so that's where I'm coming from.

Jake: Kip, I don't know if you shared with Karen my first principle, which is to... The way I approach cybersecurity from the legal perspective is to work backward from the trial. And what do I say to clients? I say, "My goal isn't to make sure that you have perfect cybersecurity, because that's not possible. It's to make sure that you have a defensible position from which you can defend yourself." And I think that that sounds a lot like what we're talking about here, what your concept is. And so there's so many ways that this fits into a cyber risk management strategy, and I have a feeling we're probably not going to be talking about the details of firewall settings and whether or not we have DMZs set up appropriately and whatnot, but what are some of the things that you focus on, and how does this fit into a cyber risk management strategy?

Karen: Well, now let's go back... And Kip, I'd love to share a little bit about our shared experience with our inaudible.

Kip: Sure.

Karen: Because Donn Parker had this very interesting, very provocative title that he used for a presentation at an I-4 seminar once. And it was called "Risk Assessment Is Hooey." And it got everybody's attention, coming from him. And he had a particular point of view, and he wasn't completely saying that risk assessment has no place in it. But one of the things that was happening at that time was that people were going into these endless cycles. And we saw this when GLBA came out, and other kinds of regulatory requirements came out, requiring risk assessment. We went into these endless do-loops of refining a risk assessment, when actually there's... It's not a hard science, it's an art form. And so we ended up with these very squishy, very malleable, and very changeable risk models for the environment, and it ended up not accomplishing a whole lot.

And so that's kind of the foundation of where I started with this, is there actually... And so at the same time that that was going on, there was an effort underway to create a standard. And there were a bunch of very top tier companies that came together led by American Express at the time. Ken Cutler at American Express did it. And they wrote what they called... What was it? It was a cybersecurity best practice document. I'm trying to remember what the name of it was now. It escapes me.

Jake: And I'm just curious, was this... This was obviously pre Cybersecurity Framework? It was pre... inaudible

Karen: This became ISO 27000.

Jake: Okay. So that's where we are.

Karen: Yeah.

Kip: Yeah. Then that was, wasn't that BS, British Standard 7799?

Karen: It first became... Yeah, it was... It was called the Code of Practice. That's what it was called.

Kip: Code of Practice.

Karen: It was Information Security Code of Practice. That was what its working title was. And then it became BS 7799, and BS 7799 ultimately became ISO 27001, and the others that followed from there. So with that framework happening in the background, and it was the first framework, first and only for some time, Donn's premise, the one that I got tutored under and actually got the opportunity to apply, so I know that this works, was: don't go bottom-up. Don't try to risk assess this from a bottom-up perspective. Risk assessment has its place, but if you already know what the code of practice ought to be, you have a standard for what due diligence looks like.

Jake: And when you say risk assessment, just to be clear, are you hinting at, as quantitative as you can make it in that sense? Because Kip and I have strong feelings on quantitative risk assessment and how it probably doesn't work. Although sidebar, the insurance industry now has data, more data than this industry, than cybersecurity strategists has ever had.

Kip: And they paid dearly for it.

Jake: They had paid dearly for it. And maybe that's something we can talk about later, because I find it really fascinating how theory has intersected with reality, kind of like how we all thought that the Russian army was unstoppable, and clearly turns out not. This is almost at the core of what I try to do as well, so I'm super fascinated by it. And sorry, my question was, when you say risk assessment, what do you mean there? Because I've run into this same problem.

Kip: Qualitative, quantitative.

Jake: Semi-qualitative, semi-quantitative. And how do you even do it? And I think that's what you're getting at, right? Which is, in a lot of senses you can't do it in a meaningful way.

Karen: Right. I have never seen it actually done in a meaningful way that actually translates into an actionable program that is sustainable and defensible.

Kip: And affordable.

Karen: Well, the affordable thing, we'll come to that.

Kip: Yeah. The reason why I think that's important is because I think a great example of what we're talking about here is the FAIR method, where highly quantitative... And I know some enterprises do it, like Starbucks. Starbucks is head over heels in love with FAIR, and if it works for them, I think it's great. I have no problem with it working for organizations that can make it work. But at the same time, they have a whole team of people that work on just doing risk assessments using FAIR, and it's extremely math and advanced statistics dependent, so you've got to have people that understand that. And then it depends on senior decision makers respecting advanced statistics and being able to understand it, or being able to at least report what FAIR is putting out in a way that senior decision makers can understand.

And my bottom line on it is, garbage in, garbage out. If you don't spend the money to get the best, highest quality data to put into your algorithms, then you might as well just be on the back of a napkin. It's cheaper, and you're going to get just as good of a result, I think, anyway. That's how I think about this.
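To make Kip's point concrete, here is a minimal sketch of the kind of Monte Carlo simulation that FAIR-style quantitative analyses rest on. Every parameter below is an illustrative assumption, not data from the episode:

```python
# A minimal sketch of a FAIR-style Monte Carlo risk simulation.
# All parameters are illustrative assumptions; the output is only
# as good as these inputs ("garbage in, garbage out").
import random

def simulate_annual_loss(trials: int = 100_000) -> list[float]:
    losses = []
    for _ in range(trials):
        # Assumed loss-event frequency: roughly 2 events per year.
        events = sum(1 for _ in range(10) if random.random() < 0.2)
        # Assumed loss magnitude per event: lognormal, ~$100k median.
        total = sum(random.lognormvariate(11.5, 1.0) for _ in range(events))
        losses.append(total)
    return sorted(losses)

losses = simulate_annual_loss()
print(f"Median annual loss: ${losses[len(losses) // 2]:,.0f}")
print(f"95th percentile:    ${losses[int(len(losses) * 0.95)]:,.0f}")
```

Change the assumed frequency (0.2) or the spread (1.0) even modestly and the tail estimate moves dramatically, which is exactly the point: the percentiles are only as trustworthy as the distributions behind them.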

Karen: I agree with you completely. In fact, I was going to say garbage in, garbage out myself. But I think that... So here's what I am suggesting, and what we did at AT&T Wireless, because I like to do security top-down as an outcome-based exercise.

Jake: That sounds like the CSF.

Kip: Yeah. And certainly it is... That's the Cybersecurity Framework, and it's certainly the way that we use it at Cyber Risk Opportunities. When we do a CR map, that's exactly what we do, is we start from the top and then we drive down and we focus on the top five cyber risks, because there's an endless parade of cyber risks coming at you, but you have a limited budget.

Jake: The terminology here matters. Literally the smallest, the lowest set is called outcome. That's what it's called.

Kip: Yeah. It is. Yep.

Jake: Sorry, I keep interrupting you, but this is super fascinating.

Karen: No, no. inaudible

Kip: We're excited.

Jake: We're very excited.

Karen: Well, like we keep saying, Kip and I got really tutored in the same place, same watering hole. And here's what I mean by an outcome, as an example. I need to be able to demonstrate to an auditor, beyond any reasonable doubt, that every entry that hits the general ledger is accurate to within two cents.

Jake: In a financial sense, just to be clear.

Karen: That's a financial example.

Jake: Financial example. Yeah. No, that's important.

Karen: But think of how that applies in many other cases. I need to show that I don't have leakage of confidential data, and that I have a tolerance for it inaudible and this is where the DoCRA standard gets interesting because they've defined how to do that in kind of a qualitative way. Because one of the things I found that's really difficult is people's risk tolerance level is super hard for them to define.

Jake: It is. And I was going to ask if you knew Chris, but clearly you've heard of him.

Karen: Yeah.

Jake: Yes. inaudible

Kip: And we've done an episode on DoCRA before, actually. It was episode 54, I think it was. No, 59. 59. And I'll put a link to that so you can look at the transcript, if anybody wants to check it out.

Karen: So I'm excited about DoCRA, because it takes the conversation on duty of care a little further when it comes to cybersecurity. But the idea of being able to say... This is how we used it at AT&T Wireless, because we had a very limited amount of time to do an entire top-to-bottom overhaul of the IT environment and change the security.

Jake: Oh my God, that sounds...

Karen: And we had 10 months to do it, from start to finish.

Jake: That's immense. Absolutely immense.

Karen: And so I started writing blank checks to the... Or started writing checks to the bank account in the sky, because there wasn't any other way to get it done. We hired a lot of people, but in order to get everybody very focused on what it is we had to do, we designed it as outcome-based security, and we used defensible standards. So I needed to be able to show that I wasn't inventing my own idea of what constituted duty of care and due diligence. First of all, we had just come off of Enron, and so we knew that there was going to be a heavy emphasis on the financials. And of course we had Sarbanes-Oxley. And so it's like, I have standards out there. I have COBIT, I have ITIL, I have ISO 27000. And those are the ones I used.

And we created our framework. Everything would map back to that. But instead of saying, "Here's the requirements," and making a prescriptive set of requirements, we said, "What are the controls in the entire end-to-end work stream that have to be in place in order to accomplish a less than two cent variance on every transaction against the general ledger? What does that look like?" And it meant that we had to do a pretty good analysis, in fact an excellent analysis, of the business process, all the way back to where the financial organization granted accounts to authorized people to do certain things. Because I needed to demonstrate things like separation of duties. I needed to be able to demonstrate that every account that was being used for that was authorized and in the hands of the right person. I needed to inaudible there was a whole bunch of stuff.
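As a rough illustration of the outcome-based approach Karen describes, here is a hypothetical sketch that starts from the auditable outcome and checks whether every supporting control has evidence on file. The control names, standard mappings, and evidence labels are invented for the example:

```python
# A hypothetical sketch of "outcome-based" security: start from the
# auditable outcome and enumerate the controls, each mapped back to a
# defensible standard, that must be demonstrably in place.

OUTCOME = "Every general-ledger entry is accurate to within $0.02"

CONTROLS = [
    # (control, standard it maps back to, evidence required)
    ("Accounts granted only to authorized finance staff", "ISO 27001 A.9", "access-review records"),
    ("Separation of duties between entry and approval", "COBIT", "role-matrix sign-off"),
    ("Change control on all ledger-touching systems", "ITIL change management", "change tickets"),
    ("Single authoritative system of record", "audit requirement", "architecture document"),
]

def audit_ready(evidence_on_file: set[str]) -> bool:
    """The outcome is defensible only if every control has evidence."""
    missing = [c for c, _, ev in CONTROLS if ev not in evidence_on_file]
    for control in missing:
        print(f"GAP: no evidence for: {control}")
    return not missing

print(audit_ready({"access-review records", "change tickets"}))  # flags two gaps
```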

And so we kind of did what you described, Jake, is that we began with the end in mind, saying, the most rigorous test that an auditor is going to put us through, we have to pass. And just for a little bit of background info, the pressure that was on this team was, the largest cash acquisition in U.S. history depended on the IT team re-engineering the security in IT, getting it 100% with no deficiencies on the very first audit, and we had 10 months to do it. And we were starting from a place where there were things like... There were 36 different identity and access management systems, with four inaudible I think there might have been nine systems of record, I forget exactly what the number was anymore, but it was a huge number.

Kip: It's more than one.

Jake: That's not how that works.

Karen: More than one.

Jake: There should be one. One system of record.

Karen: That's the audit requirement. So when we realized where we were at, we had to get everybody absolutely crisp on, we are going to work on these things, and if there's something else you're thinking about working on, the answer's no. inaudible

Kip: I love it. Priority.

Karen: We had to, otherwise we inaudible we had one shot. inaudible

Kip: It sounds like a inaudible without any trials or...

Karen: Yeah. Without testing it to see if it worked.

Jake: And I'm curious, just because... What was the upcoming audit in that instance?

Karen: So the background is... The upcoming audit, all four of the big audit firms were involved. The audit was a financial controls audit. But because the financial controls, the underlying... This is what people forget when they talk about privacy and financial controls, is that if the underlying security is broken, the financial controls are actually not based on anything solid, so you're not going to pass the audit.

Jake: Yeah. inaudible

Kip: It's like thinking you have privacy without a good information inaudible.

Jake: Without security. Yeah. Yeah. Got it.

Karen: Right. So we had to go back to COBIT and ISO and say... And we used ITIL a lot, because a lot of the basic fundamental blocking and tackling of IT happens in the things like change control and problem management and all of that. And we overhauled all of it.

Kip: And it was all done for an acquisition, right? Isn't that what you said? inaudible

Karen: When AT&T Wireless was being acquired by Cingular, which at that time was a joint venture between BellSouth and Southwestern Bell. Yeah.

Kip: Yep. So the whole deal hinged on this integrity, the system... inaudible

Karen: The AT&T brand hinged on it. The Death Star, the... inaudible

Kip: Right. Reputation.

Karen: It was an experience.

Jake: Yeah. Interesting. Okay. So I think that is... That was intense, I'm sure. And so how has this demonstrating of due diligence to the defensible standard of care evolved over time? Because Cingular was late '90s? Early... inaudible

Karen: 2003, four. Yeah.

Jake: Yeah. So it was the early 2000s.

Karen: Mm-hmm.

Jake: Wow. inaudible 20 years.

Karen: I know. It's shocking. So how has it evolved? Well, first of all, that was the once-in-a-lifetime greenfield experience where I got to use that approach, and with the belief that it was going to serve us. So I guess that became the next building block in my understanding, and in my venture to talk about this everywhere: you can do this as an outcome-based approach. Obviously, you're going to have to find out what your risk tolerance is along the way, for how much you ratchet things down. I want to come back to that in a second.

So as I've been talking about it, shortly after that, I was invited to go speak at the University of Washington, to Barbara Endicott-Popovsky's cyber assurance program. And we were talking about risk assessment, and the thing that popped out at me when we were going through that conversation, and it was like an open seminar with the students, was that we were focusing on risk from a standpoint of, what's the risk to me? What's the risk to my organization? It was very inwardly focused. And what I realized is, we were whiteboarding things and interacting together inaudible oh my gosh, duty of care means I have to care about others. And that's not built into our risk framework. At the time, that wasn't built into anybody's risk framework.

Jake: No. It's still not as widespread as it should be, I would say.

Karen: It's not. We still tend to think about it as, what's the risk to my company? But we have to realize that we're the caretakers and custodians of things that affect other people's lives in a very big way, more and more and more. If we're not convinced of that yet, there's no hope.

Kip: Yeah. So one thing I tell customers is, you really need to think about how your actions affect the online community as a whole. Because if people lose faith in the ability to conduct commerce over the internet, where does that leave your business? How does that affect you? So it's funny, because the only way I can really get them to pay attention to the community is to bring it back to, if the community's not doing well, you're not doing well. But I think that just goes back to, what's in it for me? It's a default human question that just rattles around in everyone's brain. Should I pay ransom? That's a great choice for me, because I get my stuff back and I can go back to business. But as the insurance companies have found out, that's just feeding a beast, and then that beast ultimately came back and bit them really hard. CNA had to pay something like $30 million of ransom, because they got...

Karen: I think the top payout right now is $300 million.

Kip: Who was that? I don't know about that one.

Karen: I don't think they named names.

Kip: Okay. But still.

Karen: But that came out. NetDiligence published a report recently and that was in there.

Kip: That's awful. That's awful. Just feeding a beast. But to your point, Karen, it's more than just my assets. It's also, what am I doing to take care of other people? And that's missing completely from the dominant conversation.

Karen: Yeah. I think that's the evolution. And that's why I got excited when I read the DoCRA draft standard, because now it is saying, what is the risk tolerance to my constituents who are potentially affected by me? And if we were doing more of this, our third party risk management processes would simplify tremendously. Because right now we're saying, somehow it's up to me as the contractor to catch the problem with everybody else. What the DoCRA standard and a duty of care due diligence standard would do is essentially say, I have to consider those who are a hop away, or even three hops away, how I might impact them. And I need to be able to put that into my model and show that I'm doing the right thing.
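Karen's point lends itself to a small sketch. This is not language from the DoCRA standard; the 1-to-5 scales, the acceptance threshold, and the risk names are illustrative assumptions. The idea it shows is that the binding impact is the worse of the impact to yourself and the impact to the people you could harm:

```python
# A rough sketch of the duty-of-care idea: score each risk's impact
# not just to your own organization but to the constituents you could
# harm, and accept it only if the worse of the two stays in tolerance.
# Scales and threshold are illustrative assumptions.

ACCEPTABLE = 6  # assumed tolerance: impact x likelihood must not exceed this

def evaluate(risk: str, impact_to_us: int, impact_to_others: int, likelihood: int) -> None:
    # Duty of care: the binding impact is the worse of the two.
    impact = max(impact_to_us, impact_to_others)
    score = impact * likelihood
    verdict = "acceptable" if score <= ACCEPTABLE else "must treat"
    print(f"{risk}: score {score} -> {verdict}")

# An inwardly focused analysis would under-rate the first risk:
evaluate("Vendor portal exposes customer SSNs", impact_to_us=2,
         impact_to_others=5, likelihood=3)   # score 15 -> must treat
evaluate("Internal wiki outage", impact_to_us=2,
         impact_to_others=1, likelihood=3)   # score 6 -> acceptable
```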

Kip: Yeah. It's an externality, right? It's like pollution. It's like a plant dumping chemical waste into a nearby river and going, "Job done." And not really considering the consequences of, what's going to go on downstream because of what you just dumped into the river? And it costs them nothing to do it, and they're not going to really suffer for having done it. Other people will suffer for them having done it. And this is one of the things that I think about when I wonder... There's so much digital pollution. There's so much digital exhaust going on right now. And it's in nobody's individual best interest to do anything about it. So are we going to need an Environmental Protection Agency type function for the care of our digital environment? I don't know. I don't know if that's the answer, but the parallels seem really strong.

Karen: I think we ought to unleash the lawyers.

Kip: Oh, no.

Jake: It's starting to happen anyway.

Karen: It's a negligence issue. And Jake, I want to hear what you have to say about this, but I want to just do one thing and pivot off of what Kip has just said, because there's a very common example about anybody who's got a swimming pool. If you have a swimming pool in your backyard, it's an attractive nuisance. If a child falls in and drowns, it's your fault. There's even a biblical principle in the Old Testament, in the Hebrew scriptures, that says if you dig a hole and you do not cover it up so that nobody falls in, and somebody does fall in, you are liable. That's ancient.

Jake: That's very ancient. Yep.

Karen: And so I think what we are doing is digging pits all over the internet and not covering them up.

Jake: Or even putting up a fence or a warning sign or anything.

Karen: Not even putting in a... And there used to be some really interesting cases that were settled out of court that helped guide us on some of that. Because as an example, if a company wants to make a claim that they have trade secrets, and they haven't put the necessary controls in place to protect a trade secret as if it's a trade secret, and somebody comes in and steals it... There was a case like this with a very, very large Fortune 50 company, more than 20 years ago. And a former employee left, knew that he could get back in, came back in, stole the information, took it to market, competed with his former employer, and they sued. And he won, because they didn't put the necessary protection around it. That's a little bit of a different thing. That's more the inwardly focused problem. But it's all part and parcel of the same cybersecurity approach. We're not talking about a completely new approach to doing cybersecurity. We're talking about just do the cybersecurity.

Jake: And I think that's one of the things that's tough, is when you're advising a client... And let's be fair and honest, a lot of clients, they don't necessarily know where to start. And so what happens is they go and they buy what Kip and I have long called blinky light security, because it's at least easy.

Karen: Right. Shiny object security.

Jake: If I can show, look, I've got cybersecurity, I've got a bunch of switches and firewalls and appliances, and they all have blinky lights going, and it looks very impressive.

Kip: Yeah. I can point to it, you can see it, here are the invoices.

Jake: Exactly. Yep.

Karen: It's complicated, therefore it must be good.

Kip: That's right.

Jake: Yep. But obviously we know that that is not true. And I think one of the words that we've mentioned here a few times that I think is so important is demonstrate. How is it demonstrable? And I think one of the things that's I think underappreciated about GDPR, for example, is that as a regulation, it really, really focuses on evidence and demonstrating that you're complying with it. And I think that's very wise, because if you don't have... And the classic example, by the way, is the very first thing that a European data protection authority is going to ask anyone for is, show us your records of processing activities. And let's just say it's not uncommon for the response to be a slow blink and being like, do we have those?
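For readers who have not seen one, here is a hypothetical sketch of a single record of processing activities entry. The field names paraphrase GDPR Article 30(1); every value is invented:

```python
# A hypothetical sketch of one GDPR Article 30 "record of processing
# activities" (RoPA) entry, the artifact a data protection authority
# asks for first. Field names paraphrase Article 30(1); values are
# invented for illustration.
ropa_entry = {
    "controller": "Example Corp, dpo@example.com",
    "purpose": "Payroll processing",
    "categories_of_data_subjects": ["employees"],
    "categories_of_personal_data": ["name", "bank details", "salary"],
    "categories_of_recipients": ["payroll processor (EU)"],
    "third_country_transfers": None,
    "retention": "7 years after employment ends",
    "security_measures": "encryption at rest, role-based access control",
}
```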

And I think it's... One of the things I see in my practice all the time is, the classic example is, we downloaded a written information security policy or an incident response plan from the internet and didn't even fill out the blanks. That's not doing cybersecurity. And honestly, that demonstrates the opposite of what you want it to demonstrate. But I'm curious, when you're implementing...

Kip: That's an auditor repellent.

Jake: It is. Well, it's a different type of auditor repellent. When you're implementing security using this approach, what does that look like on the ground?

Kip: How do you know? What do you see?

Jake: How do you know?

Karen: There's two things that I learned, and I use this mantra today all the time. There's done, and there's done done.

Kip: Interesting. And then there's inaudible.

Jake: inaudible

Karen: And if you don't, then inaudible. Yeah. So the done is what we saw all the time. Go to the sysadmin. Did you get that thing put in place? Yeah. Is it tested? No. Okay. That's not done. The second mantra is, if it's not written down and I don't see proof, it doesn't exist.

Jake: I share that one. I love that. That's true.

Karen: And so that was the thing that actually, when we went and we got the audits done 20 years ago, we passed all four Big Four audits with zero deficiencies, not even a documentation deficiency. Because we followed the done done rule, and if it's not documented and tested and written down, it doesn't exist. And I think that's what people actually really don't get, and I think they're overwhelmed by the idea, but... inaudible

Jake: It is overwhelming.

Karen: It's overwhelming, but listen, in 10 months, and I grant you I spent $10 million doing it, we did it. It's not that it's impossible. It probably takes a force of will and everybody focused on getting it fixed, but it is doable.
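Karen's "done versus done done" rule reduces to a simple predicate. A minimal sketch, with illustrative field names: a control only counts when it is implemented, tested, and the proof is written down.

```python
# A minimal sketch of the "done vs. done done" rule: a control only
# counts when it is implemented AND tested AND evidenced in writing.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Control:
    name: str
    implemented: bool = False
    tested: bool = False
    evidence: Optional[str] = None  # "if it's not written down, it doesn't exist"

    def done_done(self) -> bool:
        return self.implemented and self.tested and self.evidence is not None

c = Control("MFA on admin accounts", implemented=True)
print(c.done_done())  # False: merely "done" is not "done done"
c.tested, c.evidence = True, "mfa-test-report.pdf"
print(c.done_done())  # True: tested, with written proof on file
```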

Kip: Yeah. inaudible

Jake: And for all those thinking, oh my God, but that cost $10 million 20 years ago, that's like $50 million in today's dollars. Perhaps, but what was the size of the deal, by comparison?

Karen: The deal was $41 billion cash.

Jake: Yeah. So the $10 million was... inaudible

Kip: inaudible

Karen: But that's why I say this will never happen again. Do you know what I'm saying? The circumstances that led to everybody being laser-focused on making sure this happened will never inaudible.

Kip: But to be able to extract the lessons learned is just... It's wonderful, because like you said, it's a once in a lifetime thing, so the money was spent, but now how can we get the most benefit from it, not just from the folks involved, but what can we share with everybody else so that they can benefit from it too?
I'm keeping an eye on the clock because I don't want this episode to go as long as it probably would go, if we organically just allowed it to do what it's doing. I just want to say a couple of things and then I think we should wrap up. One thing I want to say is, it's kind of depressing as a practitioner to think that over the last 20 to 25 years, this is the conversation. And not much has changed. Not much has changed. The practice of cybersecurity hasn't really evolved very much, even though we're aware of all these things. And so that's a little depressing. The good part here for me is that these ideas still resonate. We've got DoCRA. There are people who are still pushing it forward, and that's encouraging.

Jake: You should say what that means, by the way.

Kip: What?

Jake: DoCRA, duty of care risk analysis. We've used DoCRA repeatedly, but we never actually spelled it out for those who don't know.

Kip: We didn't, but we did tell them episode 59.

Jake: We did. That's true.

Kip: They'll get more DoCRA than they can handle if they just go listen to that episode. So there's something in here to be disappointed about, but there's something in here to be excited about, and I want to also say that Donn Parker... "Risk Assessment Is Hooey." If you want to be more informed about Donn's opinions, and God rest his soul, he's not with us anymore... I mean, I think he was retired when I showed up. But he's just...

Karen: Oh, no.

Kip: But he had a rich, long career.

Karen: He had. Yeah.

Kip: I don't mean retired as in checked out, but he had done so much, by the time I showed up at SRI, he just had this amazing body of knowledge that he was drawing from. He was wisdom in a pair of shoes walking around. It was fantastic. But he has publications. You can go to Communications of the ACM, the March 2007 issue, and he's got a whole article in there called "Risks of Risk-Based Security," and it's a version of "Risk Assessment Is Hooey," and he also published in the ISSA Journal. Really, if you just search his name and "risks of risk-based security," you're going to find writings. He actually had... I found comments on different blogs where people talked about his paper, and then he actually came in and was part of the threaded conversation. So you can actually go out and see Donn talking to people in these blog posts and in the commentary, still read him explaining why his approach makes sense, and rebutting people who say he's out to lunch. I just think he was so far ahead of his time in his understanding of this.

Karen: He was. In his computer crime book, which I highly recommend, he invented the term "crimoid," which was automated cybercrime.

Kip: Which we have now.

Karen: Which, hello ransomware. Hello zero-day. Hello Stuxnet.

Kip: Yeah. Ransomware as a service.

Karen: At the time that he wrote about that...

Kip: It was unknown.

Karen: Everybody was like, what?

Kip: Yeah. It was mind blowing. I remember reading about that and then talking with him about it, and it just really did seize up my brain, because I just had a hard time imagining what that would look like. But he was so farsighted. Oh my gosh. So farsighted.

Karen: Yeah. You're right about the depressing thing, and I just think we can't let that get ahold of us.

Kip: Right. No, we can't.

Karen: We did do the summit, remember, the cyber secure... Or what do we call it? The Information Security Summit that we did in Menlo Park, and we held it at SRI. Who was there? Donn Parker. Peter Neumann. The former chief information security officer of Microsoft. The head of security... The chairman of Bank of America. The head of security for Bank of America.

Kip: Yep. Everybody who was inaudible.

Karen: The chairman of SRI. It was a Who's Who list... And we had the Secretary General of Interpol. And we all stood on the stage one after the other and said, "We have to get ahead of this, because if we don't get ahead of this now..." The internet had just become commercialized in the previous five years. "If we don't get ahead of this now, we are going to really pay for it and really feel the pain." So here we are.

Kip: We're paying for it.

Karen: And I still have a box of pictures from that event, Kip, and I hold them up, and I say, every single one of them has one caption. We told you so.

Kip: Yeah. But nobody was listening. It's awful. Too bad.

Karen: It was inconceivable, but I think that's where we are right now, which is we are at an inflection point. We could do a whole nother session on this.

Kip: Was that a Princess Bride quote just now? Inconceivable.

Karen: inaudible

Kip: Sorry. inaudible

Karen: Inconceivable.

Jake: I don't think that means what you think it means.

Karen: Yeah. That might be embedded in my brain.

Kip: Yeah. I was thinking that it might be. Again, this episode could go on for a lot longer than we have time for. Karen, we'd love to have you back and unpack this a little bit more.

Karen: inaudible

Kip: Maybe do something that's related to it, but we are out of time. Jake, any final thoughts?

Jake: Yes. We didn't even talk about the duty of loyalty.

Kip: I know.

Karen: I know. Please, let's do that one next.

Jake: Yeah. That's a...

Kip: Yeah. Okay. Let's come back and do another episode on duty of loyalty. I think that'd be fantastic. There's so much to talk about.

Jake: Actually, Kip, didn't we do an episode about the fiduciary duties of executives at one point? I feel like we did.

Kip: I don't know if it's by that name, but I will go dig around in the editorial calendar archives, and if I find one, I'll put it in the show notes. Fair enough?

Jake: Fair enough.

Kip: Okay. Any final words other than that, Jake?

Jake: No, let's wrap it up.

Kip: Okay. Karen, any final words?

Karen: inaudible

Kip: You want to tell us where... Where can our listeners go to find out more about you and your work?

Karen: Oh, gosh. I love to connect with people on LinkedIn, and we run a live stream podcast called Ask the Howlers once a month. We're going to be talking about this later this month.

Kip: Great. Ask the Howlers.

Karen: Ask the Howlers. Look us up.

Kip: I'll put a link to that in the show notes as well, so people can follow you over there. Fantastic. Karen Worstell, thank you so much for being here today. And that wraps up this episode of the Cyber Risk Management Podcast. Today we explored the idea of demonstrating due diligence to a defensible standard of care as a risk management approach, and we did that with our guest, Karen Worstell, who's a senior cybersecurity strategist and a CxO security advisor at VMware. We'll see you next time.

Jake: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at cr-map.com. Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is an information security expert with 20 years of experience and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and a litigator.