
EP 36: The emerging “Reasonableness Test” for cybersecurity
About this episode
September 17, 2019
Kip Boyle, CEO of Cyber Risk Opportunities, and Jake Bernstein, JD and Cybersecurity Practice Lead at Newman Du Wors LLP, discuss "The Sedona Conference Draft Commentary on a Reasonable Security Test."
Episode Transcript
Voice-over: Welcome to the Cyber Risk Management podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, CEO of Cyber Risk Opportunities, and Jake Bernstein, cybersecurity counsel at the law firm of Newman Du Wors. Visit them at cyberriskopportunities.com and newmanlaw.com.
Kip: So, Jake, what are we going to talk about today?
Jake: Hey, Kip. Today, we're going to talk about another way to define reasonableness in cybersecurity, courtesy of the Sedona Conference Draft Commentary on a Reasonable Security Test. And let's go ahead and get started.
Kip: Okay. Obvious question number one, I'm just playing the role of a listener here, what is a Sedona Conference? Is that just an excuse for a vacation?
Jake: It is not. Well, it could be actually, but the Sedona Conference is a worldwide think tank that historically focused on eDiscovery issues. That's electronic discovery [crosstalk].
Kip: That's a very lawyerly thing, right?
Jake: It is a very lawyerly thing. It's about how you share information in a lawsuit between parties in a civil lawsuit specifically.
Kip: Are we about to go careening into a legal episode?
Jake: Kind of.
Kip: Okay. Well, you're warned, audience.
Jake: Yeah. You are warned, but it's definitely, it's relevant, so.
Kip: But it's relevant to the cyber risk managers, right?
Jake: Oh, absolutely. It's not just relevant. It's critical.
Kip: Okay. Stay seated, everybody. Let's go.
Jake: All right. So the Sedona Conference, it was originally focused on eDiscovery but has really branched out in recent years to include information governance, patent litigation, cross-border data transfer, and of course, data security and privacy.
Kip: How long have these people been a thing?
Jake: Oh, decades now.
Kip: Oh, so this is like a secret society that's being unveiled by us?
Jake: To this audience, yes, it is.
Kip: Okay.
Jake: And so here's some additional history: most, though not all, of Sedona's participants are attorneys. And the reason for that is that Sedona is really interested in providing practical guidance to practitioners, courts, and judges.
Kip: Wow. I like that.
Jake: And this is really important, right, because judges are no more enlightened about all of these new issues than anyone else. And they need help because they're expected to make decisions, make rulings, and write opinions on this stuff.
Kip: Wait a minute, wait a minute. Judges are not necessarily the most digitally literate people we have?
Jake: Not even digital. See, the thing that most people misunderstand about judges is that they think judges know everything. In fact, judges are called upon to work in so many different areas. They're the ultimate generalists, and so they rely on the parties to explain things to them so they can make a decision. And the Sedona Conference produces these commentaries and publications that are directed at judges who ...
Kip: Okay. Okay. Okay. So everybody knows Kip is not a lawyer. And gosh, maybe I've stepped into a courtroom once, but it wasn't anything like on TV. So are you saying judges are like super jurors because they have to have everything explained to them? They're like boss jurors, like in a video game?
Jake: Yes.
Kip: Okay. That's cool.
Jake: Well, and I think you were saying jurors, but see, do you know the other word for a judge is actually jurist? So-
Kip: Ahhh.
Jake: ... these words are all connected. You are expected to explain and back up your statements; that's how you win in court. And what happens is that the Sedona Conference's publications, particularly, like I said, on eDiscovery, can become effective law because judges will incorporate them into their opinions and decisions, so-
Kip: Wow. So they actually reference them directly?
Jake: They do. If you go into a legal research tool and just search for the Sedona Conference, you're going to find a lot of citations in published court decisions, so-
Kip: Wow.
Jake: So that's why Sedona matters. And really, you can forget all of that for the rest of the episode because we're going to talk about cybersecurity.
Kip: Okay. Okay. But still, that's pretty fascinating. I had no idea. And certainly from watching television, I mean, in movies, you wouldn't have any idea that judges don't already know all this stuff. Right? They're presented as a bit of an oracle, as being omniscient. But all right, so Sedona Conference, what do they have to say about cybersecurity?
Jake: Well, quite a bit. They have an internal Working Group 11 whose entire job is to focus on cybersecurity and privacy issues.
Kip: That's so funny, Working Group 11, that's like Public School 52 or something like that [crosstalk].
Jake: Yeah. Well, and they're kind of taking it from the-
Kip: So prosaic.
Jake: ... European system as well; working groups are a big thing over in the European Union.
Kip: Lawyers need marketing. Keep going.
Jake: Fair enough. And today what we're talking about is something that is really a draft discussion piece, and it's about a reasonableness test. And I want to emphasize that we're not going to be distributing this publication. It has not been officially adopted by the Sedona Conference, but-
Kip: So this is a sneak peek?
Jake: It's a sneak peek. It may, at some point become available for public comment, in which case this podcast episode will be useful for that. But what I want-
Kip: Am I violating any intellectual property by revealing this?
Jake: No, because we're not distributing or republishing anything. And what we're going to talk about actually are just ideas.
Kip: Okay, great.
Jake: And the ideas in this document are universal, and they've been debated and discussed much more widely. So what we want to talk about here is, what is reasonable? And there are so many ways to think about it but this is just one other method. So let's delve into it.
Kip: Yeah. Okay. So, right. So we already know that reasonable cybersecurity is something that the FTC says a lot about, and we've mentioned it on this podcast. We refer to it a lot in our work, but as we explain to our customers and the people that we talk to, reasonable is a moving target. And the FTC says that reasonable depends on several things, like the size and sophistication of a company, the cost of security measures, and the type and amount of data that they're collecting, processing, or storing. So conceptually, I think the FTC's done a pretty good job with reasonableness, especially as they bring the NIST Cybersecurity Framework into it, which puts more specificity on it. So what is the Sedona Conference saying? Is it different?
Jake: So it's really not different. There's not much point in completely disagreeing with the FTC on this. I mean, the FTC has existing precedent; it's gone through the court system before. And what we're looking to do when we talk about this type of thing is to find other ways to get at exactly how you're going to determine whether you're being reasonable or not. In other words, how do you look at the sensitivity of information, the availability of resources, and the costs and benefits of available controls to actually decide whether some act or omission was or was not reasonable?
Kip: So it's a bit of a test?
Jake: It's [crosstalk]
Kip: They're trying to create a standardized test that could be ...
Jake: That's right.
Kip: Okay. So will there be Scantrons and number two pencils?
Jake: Probably not, but the idea is that it's going to lend itself to that. I mean, whether or not the Sedona Conference ever adopts this particular test doesn't matter. The idea is that practitioners, and eventually judges, are going to need to understand how to determine if something is reasonable or not.
Kip: Right.
Jake: And for all [crosstalk].
Kip: Do reasonableness tests exist for other legal standards?
Jake: Oh yes, absolutely. The most famous one of all is the "reasonable man" test. But a lot of them come down to risk and cost-benefit analysis, and that's what we're going to talk about here ...
Kip: So Sedona is trying to flesh out cybersecurity, so it has something comparable to other reasonableness tests? Is that a reasonable way of looking at it?
Jake: That is a reasonable way of looking at it, and that's what Sedona does. I mean, that's really one of their main functions: "Hey, let's take these relatively abstract and confusing ideas and try to put some meat on the bone so that people don't have to fumble in the dark."
Kip: Great. Great. Let's hear more about it.
Jake: Sure. So what we're going to talk about here is a risk-based cost-benefit analysis that you would perform any time some person or entity is alleged to have failed to provide reasonable security. And it doesn't matter if it's personal information or corporate data. The idea here is that someone is harmed, and we're going to call that person the claimant, and there is some entity that did or did not do something it should have done. And what the analysis looks at is the incremental cost of implementing and maintaining the additional or alternative security measures the claimant says you should have adopted, weighed against any reduction in the risk of harm that may have resulted, taking into account the magnitude of the potential harm and the probability that it would have been incurred.
So what you can see is that we're talking about costs and benefits. And in order to assess the benefits, we have to weigh magnitude and probability, just like we did with the costs. And in order to do that, we're probably going to need to figure out some method of measuring. So, what do you think so far?
Kip: Well, until you started that last little bit, I was feeling very good about this, but ... Okay. So first of all, my brain went numb halfway through what you just described. It was really thick, and I just felt like my cup couldn't contain it all. And then you said a key word, which just hopped right out at me and woke me right up, which is probability. And I thought, "Oh my goodness, we're going to have to do quantitative risk analysis." And in my experience, most cyber risk managers are not set up for success to do that, so is that what this is all about?
Jake: So, it's not. You are right, quantitative risk analysis is very difficult to do. And the Sedona Conference isn't saying that you have to do quantitative analysis. There is a suggestion that if a quantitative analysis is possible, it can be done. But really what we should be doing is looking at existing industry standards and practices, and we're not going to get precise figures. So we're going to have to use something else instead, which of course begs the question, if we're not using-
Kip: What else?
Jake: ... numbers, what are we going to use? And-
Kip: Yeah. So does Sedona say what we should use? I mean, you said in the opening that they try to make things practical. And so far it looks like they're steering us off a cliff, so how do we not do that?
Jake: So we're going to use qualitative terms and-
Kip: To do a quantitative risk analysis? No, that's not what you're saying.
Jake: Nope. We're going to use qualitative terms to do a qualitative risk analysis. And what I love about this draft is the terms that they have chosen. We're going to evaluate costs, and actually benefits, as tolerable, intolerable, or catastrophic. And then-
Kip: Wait, wait. Don't go any further. I want to make sure I understand. So costs have three potential settings, tolerable, intolerable, and catastrophic. Those are the three labels. That's it, just three?
Jake: Just three.
Kip: Okay. And have they defined those terms?
Jake: Not specifically. They're meant to carry their standard, everyday-language definitions. So in other words, a tolerable cost is one that you can pay, even if you'd rather not. You may not want to pay any cost, right?
Kip: Right.
Jake: I think that's the default; nobody wants to pay a cost. But a tolerable cost is one that you're not going to think too hard about incurring because it's [crosstalk].
Kip: So these at this point anyway are not obscure legal terms, these are Webster's dictionary words-
Jake: Yeah.
Kip: ... for now? Okay. All right. So we have three types of costs. And what about the benefits? How many of those, and what are they called?
Jake: So we're actually going to use the same three terms, but we're going to apply them to the harms suffered by the claimants, as a way to think about the benefits. So that sounds weird. Let me explain. A benefit is going to prevent a harm. Right?
Kip: Mm-hmm (affirmative).
Jake: That's our definition and the way we're thinking about this: a benefit is something that mitigates or prevents a harm. So in order to figure out the utility of that benefit, we're going to look at the harm prevented. So the same idea: is the harm tolerable, intolerable, or catastrophic [crosstalk]?
Kip: Oh, those clever Sedona-ites.
Jake: Yes. Yes, indeed. And then there are only three other words, or I should say three other phrases. For probability, we're going to go with not foreseeable, foreseeable and imminent.
Kip: Okay.
Jake: And let's break those down. We're not going to ask someone to say, "Well, I think that's five percent likely to happen, or 50% likely to happen." In fact, you could assign a percentage chance to these three terms, but they don't need it at all to function. Something that is not foreseeable means I just didn't see that coming; that is not something I would have ever thought would be a problem. And exactly what type of situation that covers is maybe open to interpretation.
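[A minimal sketch, for readers who think in code: the draft defines only these six labels, not an algorithm, so the enum names and the decision rule below are one illustrative reading in Python, not anything the Sedona Conference has adopted.]

    from enum import IntEnum

    # The draft's three magnitude labels, applied to both costs and harms.
    class Magnitude(IntEnum):
        TOLERABLE = 1
        INTOLERABLE = 2
        CATASTROPHIC = 3

    # The draft's three probability labels.
    class Probability(IntEnum):
        NOT_FORESEEABLE = 1
        FORESEEABLE = 2
        IMMINENT = 3

    def security_was_unreasonable(cost: Magnitude,
                                  harm: Magnitude,
                                  harm_probability: Probability) -> bool:
        """One plausible decision rule: failing to adopt a control is
        unreasonable when the control's cost was tolerable while the harm
        it would have mitigated was worse than tolerable and at least
        foreseeable."""
        return (cost == Magnitude.TOLERABLE
                and harm > Magnitude.TOLERABLE
                and harm_probability >= Probability.FORESEEABLE)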
Kip: Well, I think all three of these, or, sorry, all six of these labels in all three of these areas are going to be subject to quite a bit of interpretation because, for example, everybody's risk posture or risk attitude is a little bit different, and organizations have different risk attitudes or risk appetites. So something that for example, would be intolerable to a bank might be perfectly tolerable to a fashion retailer.
Jake: Well, and that's what we call a feature, not a bug. We want something that is going to automatically adjust to the industry that's being examined.
Kip: So it's one size fits all. It's unisex. It is fashion.
Jake: It is, but it's more like they're all stretch clothes; everything stretches depending on what it needs to do. And that's ...
Kip: It's now comfortable.
Jake: Right. Comfortable, that's the point. And I would say there's really no better way of doing it, because if you try to do it any other way, you're going to instantly get into quantitative systems that, as you've already said, oftentimes don't work.
Kip: Right. Hmm. Okay. So, all right. So my test for whether this test is good comes back to practicality. I'm very practical about this stuff, based on my personal temperament and the work that I've done over the course of my career. And what I liked about the Sedona Conference, from what you said in the beginning, is that they're focused on practicality. So I'm just absorbing this for the first time. Does it strike you as satisfying the practicality requirement?
Jake: I think it's too early to say exactly. I think that what we have right now is not much guidance from anywhere. And I think the way that you can start to look at this is, let's take some examples.
Kip: Okay.
Jake: And I think that when you start to apply it, you find that it's definitely practical. But one of the questions I have is: it may be practical, but how often will it actually conclude that something was reasonable? I think that's where it gets interesting. So let's just go ahead and do this, right?
Kip: Mm-hmm (affirmative).
Jake: So here's an example. I'm going to make this up as I go.
Kip: Watch out folks, he's walking without a net under him.
Jake: That's right. So let's say that we have a situation where two businesses enter into a contract. One of them is a vendor to the other, and this vendor has access to, and actually for its business purposes needs to maintain a copy of, sensitive information belonging to its customer and-
Kip: Okay. So let's pretend that's a bank and their outside bank statement printer, right, because they would need to have the data to make statements.
Jake: Sure. That makes sense. We can do that. And let's say the bank has a contract, which the printer signs, that says the printer is going to take reasonable steps to prevent the unauthorized access or disclosure of the bank's confidential information. And let's say that the printer signs off on this and everything is going along just fine until it comes out, as things often do in the news, that the bank has suffered a data breach, and that a list of all of its customers, their bank account numbers, and their balances has been leaked to a [inaudible] on the internet.
Kip: Okay. Like that's ever happened before.
Jake: Right. So people are freaking out, as is likely to happen.
Kip: And the bank's logo is above the fold on all the newspapers.
Jake: Exactly. And as soon as you dig in, you find: oh, wait, it actually wasn't the bank that was breached, it was this printer. The printer, as you said, needed to print all the bank statements, and the files that contained the information that would allow the printer to print the bank statements were intercepted after they left the bank and were on the printer's systems. So let's say now we have our breach, we have our pain. I'm a victim of this. I'm someone whose bank account was perhaps attacked or drained based on this, and I'm going to sue everybody, right? I'm a [crosstalk].
Kip: I'm a claimant.
Jake: I'm a claimant. I'm going to sue the bank and I'm going to sue the printer, assuming I know about them, and we're going to see what happens. And my main claim is going to be that the bank should have taken more reasonable steps to protect my information when it sent it off to be [inaudible] to one of its processors. Right? That's what the printer is, a data processor.
Kip: Yep. Yep.
Jake: And I'm going to say that the bank should have run constant penetration tests, or at least should have required an annual penetration test on this-
Kip: The printer, the data processor
Jake: ... on the printer, this vendor. And that the bank should have audited the security practices of its vendor on at least an annual basis [crosstalk].
Kip: Okay. So those are two things that it should have done.
Jake: Yeah. And we're just going to take those two things. And let's assume as well that the bank did neither of those things, and that the contract would've allowed the bank to do so; the bank just didn't do it. Right?
Kip: So they didn't practice those things that they [crosstalk].
Jake: So the contract language was there, but they didn't take advantage of their rights. And this is not an entirely unreasonable situation. Or I should say, this is not an entirely unrealistic situation. In fact, it's probably quite realistic-
Kip: Yeah, it's very.
Jake: ... where a company thinks, "Oh, our contracts are pretty good, they gave us all these rights." But then they don't actually use those rights.
Kip: They reserve them but they don't keep the reservation.
Jake: Right. They just don't actually use it. Right? It's there, but they've done nothing with it. So let's go ahead and try to figure out how this looks under this proposed test.
Kip: Okay. So here comes Sedona, right, after the fact?
Jake: Yep.
Kip: And it's helping us determine, did the bank practice reasonable cybersecurity given this claimant's lawsuit, right?
Jake: Yep, exactly. So the first thing we're going to do is say, "Okay, let's take one item at a time. They should have requested, and actually looked at, a penetration test for their data processor, the printer."
Kip: Okay.
Jake: Now, the cost of that needs to fall into tolerable, intolerable, or catastrophic. And already, I think you can begin to see how this is going to play out.
Kip: Well, okay. So I'd say tolerable, in the sense that I know that a penetration test costs between four and five figures, typically, in my experience, and a bank should be able to afford that.
Jake: Exactly. It certainly is not catastrophic. Right? [crosstalk]
Kip: It would not put the bank out of business.
Jake: Exactly, it's not going to put the bank out of business. So then, is it a tolerable or an intolerable cost? And I think this is where there's going to be litigation, of course. But when you get down to it at the end of the day, it's not going to be an intolerable cost, which to me means one where it's not that we're going to go out of business, but we just can't afford it. That's not the case here.
Kip: Right. Like, we don't have the cash flow. Like, we'd have to lay off people. We wouldn't go out of business but we'd have to lay off people in order to do the penetration test.
Jake: Right. And I don't think the facts would ever show that to be the case for a bank, so we're going to have to go with tolerable.
Kip: Okay.
Jake: Okay. So we have the cost as tolerable. Now let's look at the benefits side. And remember, we're going to examine the harm to measure the benefit. So the harm, which is publication of your bank account number and your current balance: is that a tolerable harm, an intolerable harm, or a catastrophic harm? And again, I think we can start off immediately by saying it ain't tolerable.
Kip: Okay. All right. So what you're saying is the claimant's case is that the harm of this data breach is not tolerable?
Jake: Yes, it is [crosstalk].
Kip: Is intolerable.
Jake: It is at least intolerable, possibly catastrophic.
Kip: Okay. So it's A or B. I think we can agree that it's at least intolerable. Okay, we're on the same page there.
Jake: Okay.
Kip: What would it need to be in order to qualify as catastrophic, I wonder? Would somebody have to lose their home, or would they have to file bankruptcy? Is that catastrophic?
Jake: I think catastrophic for individuals would be the draining of your bank account with no possibility of recovery.
Kip: What if your bank account's $100?
Jake: Well, I think that's going to end up ... If you only have $100 in your bank account and assuming that's all the money you have, that is catastrophic for that person.
Kip: Okay. All right. Well said. Well done, counselor.
Jake: Yeah, no problem. And so-
Kip: Can I hire you?
Jake: Okay. So the important thing here actually isn't whether it's intolerable or catastrophic. It's that it's not tolerable.
Kip: Okay.
Jake: So already we're starting to see that it looks like you probably should have done this. Right?
Kip: It was a tolerable cost and the harm was intolerable because it actually occurred, so now let's have fun with probability. How does that work?
Jake: So the first thing we should look at is: was it not foreseeable at all that the printer could have been ... Let me back up a second. We as the bank know that we are sending this information to the printer. Right?
Kip: Mm-hmm (affirmative).
Jake: Okay. So the question we have to ask ourselves is, what was the probability that this information could have been stolen from the printer? In other words, was that not foreseeable at all? Was it foreseeable? Or was it imminent? And I already know where I fall.
Kip: Okay. But I think the answer to this in part depends on which hat you wear, right? If you wear the banker's hat, okay, they're not going to say it's imminent, but I think they might try to argue that it was not foreseeable, based on how they conducted their due diligence on the printer, the data processor, and on the fact that they had a very good contract in which the data processor made all kinds of warranties as to its ability to protect that data.
Jake: Well, this is where I get to lawyer you. The question wasn't whether the bank was being reasonable with its due diligence process, or whether or not the printer made representations. The question was very specific: given that we are sending this data to the printer, is it not foreseeable, or foreseeable, that the data could have been stolen from the printer?
Kip: Well, now I'm thinking that instead of saying foreseeable ... now I'm thinking possible, like possible as a synonym for foreseeable. Like, is it possible or not possible? Should I do that? I mean, in my mind, should I substitute possible for foreseeable?
Jake: I think you could, but the problem is that I don't think not foreseeable is the same as impossible.
Kip: Okay. Okay. That's a good point. But let me ask you this: could a banker who's not tech literate say, "No, I couldn't possibly have foreseen it. I don't understand that stuff"?
Jake: No, because the bank is not just one person.
Kip: Aha. Okay. Okay.
Jake: So I think it's pretty clearly in the foreseeable category.
Kip: Well, you're painting me into that corner anyway. Good job.
Jake: Yes. Well, that's what we do. So what you've got here is a foreseeable, at least intolerable harm. And now the question is, okay, so we have this [crosstalk].
Kip: A tolerable cost ...
Jake: We have a tolerable ...
Kip: ... that could've prevented it.
Jake: Right. Then the question is, what's the probability that it would've prevented it? Is it not foreseeable, foreseeable, or imminent that it would've prevented it? Well, that's a tougher question, I think. But I think that if you're going to do pen tests and ensure that they actually get done, it may not be imminent that it's going to stop the harm, but it certainly isn't not foreseeable, right? Like, if I do a pen test and I'm making them do this stuff, it's foreseeable that it could prevent the harm, so-
Kip: Interesting.
Jake: I think you have a foreseeable, at least intolerable harm, and a tolerable cost ... sorry. A foreseeable intolerable harm and a foreseeable tolerable benefit. I think I'm confusing myself now, Kip.
Kip: But it's all new. It's a new framework. You get some grace.
Jake: And I think playing with these ideas is how you figure out if they're going to be useful or not. Right?
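[Plugging the bank-and-printer facts into the earlier sketch, reusing the Magnitude and Probability enums and security_was_unreasonable from above; again, this is one illustrative reading of the test, not the draft's own method:]

    # Annual pen test on the vendor: a tolerable cost for a bank.
    # Leaked account numbers and balances: at least an intolerable harm.
    # Theft of the data from the printer: foreseeable, per the discussion.
    verdict = security_was_unreasonable(
        cost=Magnitude.TOLERABLE,
        harm=Magnitude.INTOLERABLE,
        harm_probability=Probability.FORESEEABLE,
    )
    print(verdict)  # True -- under this reading, skipping the pen test
                    # looks unreasonable.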
Kip: Yeah. For sure. I mean, I would love to take some real cases and just for fun, just take real cases and then just run them through this Sedona framework. I don't know. Is it a framework? What would you call it?
Jake: It's a test. I mean, it's a-
Kip: A test, right.
Jake: ... conceptual test, so-
Kip: It's a test, right?
Jake: Yeah.
Kip: A Sedona test? Does it have a name yet? Is it the Sedona test? Or what is it called?
Jake: Oh, not yet. I mean, right now, like I said, this is very much in a conceptual [crosstalk].
Kip: It's the working group test. It's the Working Group 11 cybersecurity test. Okay. Very practical name: socks, black, wool,
Jake: Exactly.
Kip: ... two.
Jake: Yes.
Kip: Reminds me of when I was in the Air Force: "Here we go, these are your socks." Okay, cool. Well, this is really helpful. I actually do think that this could be useful. I have one final question. You had said that the test was designed, and I'm trying to remember your words here, but it was designed to be used in a post-harm situation. Is that right? And if so-
Jake: That is right.
Kip: ... wouldn't you expect people to use this in a pre-harm discussion?
Jake: Yes, I would. I mean, I think that what you're going to see is that by definition, you actually can't do this test without a claimant, so it has to be post-harm. But here's the funny thing, you have to do the analysis before the harm occurred to understand if something was foreseeable, not foreseeable, et cetera.
Right.
Kip: So [crosstalk].
And so for our audience, this is really important. And this sort of comes to the point of why we're bringing it up for our audience, which is: if you are a cyber risk manager, and if this Sedona test is going to become something that judges are actually going to use and potentially cite in their decision-making processes, then you're going to want to start from the lawsuit, work your way back, and use this.
Jake: Exactly. And just to be clear, this test, this concept of cost-benefit analysis, and you're going to love this, actually comes from a famous, long-dead judge whose name, and I am not making this up, was Learned Hand.
Kip: That was the judge's name?
Jake: The judge's name [crosstalk].
Kip: [crosstalk] that was-
Jake: No, the judge's name is Learned Hand. That is his name.
Kip: But that's the name of a legal concept too, isn't it?
Jake: Only because you may have seen references to Learned Hand concepts, but that's the name of the judge who came up with them.
Kip: Oh, I had no idea that was actually some man's name.
Jake: That is a man's name, Judge Hand.
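[For reference: the cost-benefit idea Jake describes traces to Judge Hand's opinion in United States v. Carroll Towing Co., 159 F.2d 169 (2d Cir. 1947), where the negligence test is commonly summarized as B < P × L: an actor is negligent if the burden B of taking a precaution is less than the probability P of the harm multiplied by the magnitude L of the loss. The draft's qualitative labels map loosely onto the same comparison, with tolerable/intolerable/catastrophic standing in for B and L, and not foreseeable/foreseeable/imminent standing in for P.]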
Kip: Oh my goodness, this just gets more and more funny. All right. Now I wish I was a lawyer. This is just too good.
Jake: Yeah. That name, I think, always draws some level of chuckle. So anyway, this discussion, I think, shows just how squishy and difficult it is to wrap your mind around some of this. I'm not promoting or not promoting these ideas, but I do think that it's super fascinating to play with them and see how they might be used. And as you said, you put it very well: whether or not this test has a name, whether or not it ever goes anywhere, the idea should be that you probably should be thinking about what you're doing in terms of cost and benefit, and whether or not you could defend the decision not to do something based on the cost versus the budget.
Kip: Right.
Jake: Right?
Kip: Right.
Jake: And what's really interesting is that there's actually research in the wrongful death arena showing that when you use overly quantified mechanisms to decide whether or not to do certain things that could prevent death, the jurors don't care about the numbers the way you do. Right?
So you might be thinking, "Well, this just isn't that useful because these terms are so squishy." But this is probably the better way of thinking about it: you don't necessarily have to use tolerable, intolerable, and catastrophic, but you should use something to figure out, is it going to happen or not?
Kip: Yeah.
Jake: I have another kind of troika of words I like, which goes the other direction, but it would be: I know something's going to happen, I think something's going to happen, or I hope something's going to happen. Right? That's kind of another way of thinking about this.
Kip: Interesting. Well, I think this is super useful. And I'm glad that you brought us this sneak peek of this secretive society and their Working Group 11, and what they're up to. Because even if this doesn't ever end up seeing the light of day, I think it was helpful for our audience to think about: how can I take reasonableness, as required by the FTC, and turn it into something that's a bit more practical?
Jake: Yeah.
Kip: So thank you, Jake.
Jake: Yeah. You're welcome. And just to be clear, the Sedona Conference actually is open to the public, and they have lots of great resources. Everyone should feel free to dig in, visit their website. Join, if you like. It's actually far from secret.
Kip: Oh man, here I am trying to throw a little dash of drama in.
Jake: I know. I just crosstalk.
Kip: Lawyers need marketing.
Jake: Yep.
Kip: Okay. Well that wraps up this episode of the Cyber Risk Management podcast. Today, we talked about other ways to conceive of reasonableness as applied to cybersecurity, courtesy of the Sedona Conference. See you next time.
Jake: See you next time.
Voice-over: Thanks for joining us today on the Cyber Risk Management podcast. Remember that cyber risk management is a team sport and should incorporate management, your legal department, HR, and IT for full effectiveness. Management's goal should be to create an environment where practicing good cyber hygiene is supported and encouraged by every employee. So if you want to manage your cyber risks and ensure that your company enjoys the benefits of good cyber hygiene, then please contact us and consider becoming a member of our cyber risk business strategy program. Find out more by visiting us at cyberriskopportunities.com and newmanlaw.com. Thanks for tuning in. See you next time.
YOUR HOST:
Kip Boyle
Cyber Risk Opportunities
Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).
YOUR CO-HOST:
Jake Bernstein
K&L Gates LLP
Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.