EPISODE 125
Applied Security Design Principles



About this episode

February 14, 2023

There are many security design principles we can use to build and evaluate products and services. Can we use them to understand the LastPass incidents from late 2022? Let’s find out with your hosts Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.


Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at cr-map.com and klgates.com.

Jake Bernstein: So Kip, what are we going to talk about today in episode 125 of the Cyber Risk Management Podcast? I don't know why, but 125 seems like a cool number.

Kip Boyle: Yeah. Yeah, it is a cool number. I mean, once we went over 100, I don't know, it was some kind of a milestone for me. I feel like we're grown up now.

Jake Bernstein: Yeah. No, totally.

Kip Boyle: Anyway, so 125, that's a great number. What we're going to do today is we're going to talk about an incredibly useful toolkit, as I think of it, that I don't see cybersecurity people using enough these days. And I don't really know if it's taught the way that it used to be or ... I'm just not sure why I don't hear people talking about security design principles as much, but ...

Jake Bernstein: Oh, well, I can tell you; it doesn't sound like a blinky light box to me.

Kip Boyle: No, it's not. It's definitely not an easy button, but I think it's an incredibly helpful way to think about all kinds of things, whether you're designing systems or whether you're analyzing the ability of a system to keep you safe. Or maybe if a system fails, it's also a good lens to use to ask yourself, "What went wrong?"

Jake Bernstein: Well, yeah. So I remember these. As you know, I took the CISSP exam fairly recently, and these are now in Domain 3 of the CBK, what we call the Common Body of Knowledge; Domain 3 is Security Architecture and Engineering. And I think it's ... You know, first principles are first principles for a reason, and I think people need to remember all of this stuff. So I think this is a good thing to talk about. And having read the script, I know where we're going to go with this. But let's keep the audience waiting a little bit on that and let's just kind of talk a little bit about what this is. So what do you mean when you say security design principles?

Kip Boyle: Well, there's actually quite a bit to it. And so I think our challenge right now is to discuss them without getting too bogged down, because there's actually quite a number of these things. If you look in the official (ISC)2 guide to the CISSP Body of Knowledge, there's actually a book, which wasn't around when I went through that process, though you probably looked at it. In that book, there are two sources cited in domain three.

One of them, and I'd like to unpack these, is an ISO/IEC technical specification: 19249 is its number, and it was released in 2017. And it's got a nice clunky name: 'Information technology - Security techniques - Catalogue of architectural and design principles for secure products, systems and applications.' That's a mouthful.

Jake Bernstein: That's a great name there. I'm heavily inspired to read it.

Kip Boyle: I know, right? And to pay for it, because no ISO standard that I'm aware of is free. You've got to pay for all these things, and [inaudible]

Jake Bernstein: Yeah, don't get me started on how much I paid for ISO 27001 only to discover it's like a 15-page PDF.

Kip Boyle: Yeah. I cannot fully understand what that's all about. I think it's an unnecessary barrier to interacting with these standards. But anyway, I won't go there. But that's one of the references. Now the other reference is fascinating, and I think this other reference really helps to prove the point that these security principles are timeless and relevant and will remain so. It's a paper that was published in the Proceedings of the IEEE, and it was called 'The Protection of Information in Computer Systems' by Saltzer and Schroeder, and it was published in 1975.

Jake Bernstein: I like it already.

Kip Boyle: So I think we should quickly review the principles in each one of these documents. Do you want to take us through the ISO standard?

Jake Bernstein: Yeah, let's quickly go through those. So a lot of these, I think people are going to recognize the name. And again, when I was reviewing this, I was like ... A lot of these phrases get kind of tossed around a lot still, but I wonder how many people remember where they came from and what they really, really mean. So I think this is great. Okay, so the five ISO principles: Number one, least privilege. This one, people do understand. I think this one still gets used pretty often.

Kip Boyle: It's pretty popular.

Jake Bernstein: It's pretty popular. Keep the privileges of an application user or process to the minimal level that is necessary to perform the task.
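The definition Jake gives can be sketched in a few lines of Python. This is a minimal illustration (all names are made up): the consumer gets only the read operation it needs, rather than the whole store.

```python
class ReadOnlyStore:
    """Expose only the operations a consumer actually needs."""

    def __init__(self, data):
        self._data = dict(data)  # private copy; no write methods exposed

    def get(self, key):
        return self._data.get(key)


# The full dict supports writes, but the report code gets reads only.
store = {"q1_revenue": 1000}
view = ReadOnlyStore(store)
assert view.get("q1_revenue") == 1000
assert not hasattr(view, "set")  # no write privilege to misuse
```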

Kip Boyle: It's difficult to practice-

Jake Bernstein: It is.

Kip Boyle: ... I'll tell you.

Jake Bernstein: No, it is, for sure, but I think that's a good one. Next, number two is attack surface minimization: disabling or blocking unneeded services and ports; using IP whitelisting to limit access to internal API calls that need not be publicly accessible; disabling and/or removing unneeded services and components; and so on. Obviously some of that stuff did not ... Well, this is the 2017 ISO standard, not the 1975 paper.

Kip Boyle: That's right.
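The IP whitelisting Jake mentions can be sketched briefly. A hedged Python illustration (the network ranges are placeholders): any caller outside an approved internal range is rejected, shrinking the surface exposed to the public internet.

```python
import ipaddress

# Hypothetical policy: only internal callers may reach this API.
ALLOWED_NETS = [ipaddress.ip_network("10.0.0.0/8")]


def is_allowed(client_ip: str) -> bool:
    """Admit a request only if it originates inside an allowed network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETS)


assert is_allowed("10.1.2.3")            # internal caller: admitted
assert not is_allowed("203.0.113.7")     # public internet: rejected
```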

Jake Bernstein: So yes. Yeah, no, that's a good one. Number three, centralized parameter validation. When dealing-

Kip Boyle: That's awkward.

Jake Bernstein: That is awkward, but here's what it means. When dealing with user input or input from systems to which users input data, invalid or malformed data can be fed to the system either unwittingly by inept users or by malicious attackers to cause an exploit. So a good example might be ... I think kind of at this point, classic cross-site scripting tends to be a failure of centralized parameter validation, right? It's when-

Kip Boyle: Yeah, and also, I think ... Wouldn't you agree that SQL injection would be another example?

Jake Bernstein: Yes. In fact, isn't cross-site scripting a form of SQL ... ? No, it's different, but yes, that's the same idea. SQL injection may be what I was thinking of. So yeah, that's a good one. Number four, centralized general security services. By implementing commonly used security functions once, it's easier to ensure that the security controls have been properly reviewed and tested. Do this at the operating system level, network level, et cetera.
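The SQL injection failure mentioned just above is the classic parameter-validation breakdown. Here's a minimal Python sketch of the standard countermeasure (table and values are illustrative): user input is bound as data through a placeholder, never concatenated into the SQL text.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")


def find_user(name: str):
    # Placeholder binding: the input is treated as data, never as SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()


assert find_user("alice") == [("alice",)]
assert find_user("' OR '1'='1") == []  # injection attempt matches nothing
```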

This is really ... This one I think is forgotten way too commonly these days. How often ... ? And in some ways, the entire cybersecurity industry almost goes against this in some ways. I mean, how many different tools are out there that one can buy and install on endpoints and servers and your switches? Maybe some of them come with their own appliances. I'm not casting aspersions on any particular product or anything like that. I'm just simply pointing out that complexity is the enemy of effectiveness sometimes.

Kip Boyle: Right.

Jake Bernstein: And this is a reminder of that.

Kip Boyle: Right. And later on, we're going to apply these principles later on in our episode. We're going to apply one of these principles to an actual case. And I think this one is relevant to that discussion. Bookmark that. Keep going. We have one more.

Jake Bernstein: One more. Okay. Number five, preparing for error and exception handling. Systems must ensure that errors are detected and appropriate action is taken without leaking sensitive information. Yeah, that's important.

Kip Boyle: Yeah. That one's tightly related ... I'd say it's a cousin or a sibling to centralized parameter validation, because if you do get malformed data fed into a system because you didn't prevent it, well, then you want to have great error and exception handling so that it doesn't actually result in an exploit. Right?
So I think those two kind of reinforce each other.

Jake Bernstein: Yep. Exactly.
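The error-handling principle can be sketched in a few lines. In this illustrative Python fragment (the field names are invented), the detailed failure stays in a server-side log while the caller gets only a generic message, so nothing sensitive leaks out in the error path.

```python
import logging

log = logging.getLogger("app")


def handle_request(payload: dict) -> dict:
    try:
        amount = int(payload["amount"])
        return {"ok": True, "amount": amount}
    except (KeyError, ValueError) as exc:
        log.warning("rejected payload: %r", exc)          # detail stays server-side
        return {"ok": False, "error": "invalid request"}  # generic to the caller


assert handle_request({"amount": "5"}) == {"ok": True, "amount": 5}
assert handle_request({"amount": "DROP TABLE"})["error"] == "invalid request"
```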

Kip Boyle: Okay. So those are the five out of the ISO standard. And I agree with you, a couple of them are mentioned quite a bit. I think that's a good collection, but it's not everything, because if you think of popular security principles, there's more, right? And so now let's look at Saltzer and Schroeder's 10 principles. And let me go through this pretty expeditiously.

The first one is economy of mechanism. This means that the protection mechanism should have a simple and small design. And that's where you were saying complexity is the enemy of security, and I think that's exactly what this principle is getting at. The next one is fail-safe defaults, which means that the protection mechanism should deny access by default and grant access only when explicit permission exists. And we violate this one all the time, where things don't fail closed, they fail open, right?
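The fail-safe defaults idea, deny unless an explicit grant exists, might look like this minimal Python sketch (users and resources are made up). Note the safe failure mode: anything the access list doesn't mention is denied.

```python
def can_access(user: str, resource: str, acl: dict) -> bool:
    # Deny unless an explicit grant exists: the safe failure mode.
    return resource in acl.get(user, set())


acl = {"kip": {"payroll"}}
assert can_access("kip", "payroll", acl)
assert not can_access("kip", "hr_files", acl)     # no grant: denied
assert not can_access("unknown", "payroll", acl)  # unknown user: denied
```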

Jake Bernstein: Yeah.

Kip Boyle: I mean, how many times when a security ... ? Like when a firewall fails, everybody screams bloody murder if it fails in a way that denies a service. They want it to fail in a way that is open. So I've certainly struggled to implement things that are consistent with fail safe defaults. That's been-

Jake Bernstein: Yeah, no, it's definitely-

Kip Boyle: ... Definitely difficult. Number three, complete mediation. That means that the protection mechanism should check every access to every object. But in practice, that is very resource intensive, and so we have shortcuts like cookies. So if you go to a website and you log on and you access a resource, the website will drop a cookie on your machine and you don't have to log on again to access subsequent resources. So it's a compromise between the principle and reality, and we see this over and over again.
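The cookie compromise Kip describes is often implemented as a signed, expiring token: every request is still checked, but the check is a cheap signature verification instead of a full re-authentication. A rough Python sketch, with a hypothetical secret and lifetime:

```python
import hashlib
import hmac


SECRET = b"demo-key"  # hypothetical server-side signing key


def issue_token(user: str, now: float, ttl: int = 900) -> str:
    """Issue a token valid for ttl seconds after login."""
    exp = str(int(now) + ttl)
    sig = hmac.new(SECRET, f"{user}:{exp}".encode(), hashlib.sha256).hexdigest()
    return f"{user}:{exp}:{sig}"


def check_token(token: str, now: float) -> bool:
    """Cheap per-request mediation: verify signature and expiry."""
    user, exp, sig = token.split(":")
    good = hmac.new(SECRET, f"{user}:{exp}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, good) and now < int(exp)


t = issue_token("kip", now=1000.0)
assert check_token(t, now=1200.0)      # inside the window: accepted
assert not check_token(t, now=2000.0)  # expired: must re-authenticate
```

The expiry bounds how long the shortcut diverges from true complete mediation, which is exactly the compromise between the principle and reality.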

It gets us into a lot of trouble sometimes, but one observation that I have is anytime you compromise on one of these principles, you end up weakening your system, and you usually end up regretting it and having to do a lot of rework in order to strengthen, using other controls, the design decision that you made. Okay, so that's three. Here goes number four: open design, which is that the protection mechanism should not depend on attackers being ignorant of the design to succeed. It may, however, be based on the attacker's ignorance of specific information, like a password or a cipher key. Open design is really, really important. It's the basis for all successful modern cryptography, symmetric and public key alike.
Take AES. There was a huge bake-off when NIST was choosing the Advanced Encryption Standard. All of the candidate algorithms were completely laid bare and were attacked and scrutinized.

And a deep understanding of AES does not directly lead to its exploitation. And I think this is, in particular, a principle that is very, very difficult for people to believe in. I find that there's a lot of ... Oh gosh, what would the word be? People get twitchy about releasing their design on whatever their security is.

Jake Bernstein: Very much so.

Kip Boyle: Yeah. Okay, that's number four. Number five, separation of privilege. The mechanism should grant access based on more than one piece of information. So you shouldn't just be able to come up and say, "Hi, I'm Kip Boyle. Let me have that thing." The mechanism should say, "Okay, hold on, just declaring that you're Kip Boyle isn't enough. Tell me something or show me something that only Kip Boyle is, has, or does. Some kind of factor of authentication." So I think that's what that's getting at. All right, so we're halfway through. Let's keep going. Number six, least privilege. The protection mechanism should force every process to operate with the minimum privileges needed to perform its task.
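The separation-of-privilege exchange Kip acts out, requiring more than one independent piece of evidence, could be sketched like this (the credential store and the one-time-password check are stand-ins; a real system would hash passwords and verify a TOTP):

```python
def authenticate(claimed_user, password, otp, creds, otp_valid) -> bool:
    # Two independent pieces of evidence must BOTH check out.
    return creds.get(claimed_user) == password and otp_valid(claimed_user, otp)


creds = {"kip": "s3cret"}                        # illustrative only
valid = lambda user, otp: otp == "123456"        # stand-in for a TOTP check

assert authenticate("kip", "s3cret", "123456", creds, valid)
assert not authenticate("kip", "s3cret", "000000", creds, valid)  # one factor fails
assert not authenticate("kip", "wrong", "123456", creds, valid)
```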

Jake Bernstein: So that's obviously a repeat from-

Kip Boyle: Yeah.

Jake Bernstein: Or no, this is probably where it came from actually.

Kip Boyle: Exactly. Yeah. So the ISO document had this in there, and absolutely, I think they were inspired by Saltzer and Schroeder. And again, least privilege is a great idea and is very difficult. I'll never forget a place that I worked at one time where I had recently joined and I was learning how new people got their computer accounts, and they were using a method called 'model after.' And I said, "Oh, that's interesting. What does that mean?" And they said, "Oh, well, you just joined, and so when we made your account, we modeled it after Joe's account." And I said, "Well, why did you pick Joe?" "Oh, well, Joe's been here for 20 years and Joe's access is completely sorted out, so we used his account as a template to build your account."

And I said, "Joe's been here for 20 years?" "Yeah, yeah, yeah." I said, "How many different jobs has Joe had?" And he told me five, six, whatever it was. And I said, "Did you ever roll back any of Joe's access as he changed roles?" "No, it's all in there." So every permission that Joe had accumulated over 20 years, I instantly received, even though I didn't need 80% of it.

Jake Bernstein: Yeah.

Kip Boyle: But everything worked. And that's kind of what they were focused on. And that was kind of a head slapping moment where I was like, "Okay, we're not doing this anymore. We're going to change that."

Jake Bernstein: No, no, no. Definitely not.

Kip Boyle: Okay. That's number six. Number seven, least common mechanism. The protection mechanism should be shared as little as possible among users. Do you know what this one means? That's kind of weird, isn't it?

Jake Bernstein: It is. Protection mechanism ...

Kip Boyle: ... Should be shared as little as possible-

Jake Bernstein: What does that mean, Kip?

Kip Boyle: ... Among users. Okay, this one's kind of a really interesting one to unpack, and I want to do it exactly right, because if I explain this wrong, then it's just going to create-

Jake Bernstein: Someone's going to call us out.

Kip Boyle: ... More confusion. Okay. So I'm pulling up this reference here. Hang on. Okay. Let's see here. Okay, here's an example. A website provides e-commerce for a major company. Attackers want to deprive the company of the revenue they obtain from the website. They flood the site with messages and tie up electronic commerce services. Legitimate customers are unable to access the website as well, and they take their business elsewhere. So here, the sharing of the internet with the attackers causes the attack to succeed, and the appropriate countermeasure would be to restrict the attackers' access to the segment of the internet connected to the website.

So by flooding the shared connection, they're able to take down the entire e-commerce site. So least common mechanism says that we should be able to block the attackers without blocking the legitimate users. Does that make sense?

Jake Bernstein: Yes. That does make sense.
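The countermeasure in Kip's example, cutting off the attackers' traffic without touching legitimate users, can be sketched as a per-client limit so that one flooding client can't exhaust the shared mechanism (the addresses and limit are illustrative; a real limiter would also reset counts per time window):

```python
from collections import defaultdict

LIMIT = 3
counts = defaultdict(int)  # requests per client in the current window


def admit(client_ip: str) -> bool:
    """Throttle each client individually instead of sharing one budget."""
    counts[client_ip] += 1
    return counts[client_ip] <= LIMIT


assert all(admit("198.51.100.9") for _ in range(3))  # normal traffic passes
assert not admit("198.51.100.9")                     # the flooder gets cut off
assert admit("192.0.2.10")                           # other users unaffected
```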

Kip Boyle: Okay. All right. Now don't ask me any more questions about that.

Jake Bernstein: Okay, fine.

Kip Boyle: Okay. Number eight, psychological acceptability. This is huge. This is so overlooked. And I think that especially cybersecurity people think that psychological acceptability is somehow a weakness if we abided by it, but it means that the protection mechanism should be easy to use. It should be at least as easy as not using it. Could you imagine if every control had maximum psychological acceptability and it was easy?

Jake Bernstein: Everyone would use it.

Kip Boyle: Everyone would use it.

Jake Bernstein: Yeah.

Kip Boyle: And so this is where you get into situations where you secure systems so much, you put so many controls in place, that people find it psychologically unacceptable. And instead of using your email system, which has multiple two-factor authentications, for example, they just start using their Gmail account for all their official business because they find your security unacceptable.

Jake Bernstein: Unacceptable. Yeah. No, that's definitely an issue.

Kip Boyle: Yeah, it is. And we don't-

Jake Bernstein: I mean, this is the fundamental driver of shadow IT.

Kip Boyle: Oh yeah. That's a huge part of it, for sure. And that leads to a false sense of security among us, right? Because we think, "Oh, we have five layers and we have two-factor authentication. This is a super secure system, and you never get any incidents and nothing bad ever happens." And it's because the thing's not being used. So of course nothing bad can ever happen, because no one's actually using it. And you have no idea that they're using Gmail to get their work done because you're not paying any attention to that. So you've got this massive false sense of security.

Jake Bernstein: And you also may have no visibility into it.

Kip Boyle: Right. Yeah. So I think a lot of things go off the rails because we don't pay attention to that one. Okay, so we're getting close to the end here. Number nine, work factor. Consider the cost of defeating security with, A, the value of the asset being protected, and B, the anticipated resources available to the attacker.
We do talk about this a little bit. We don't call it work factor. I think a phrase that I use sometimes is I don't want to build a thousand dollar fence to protect a hundred dollar horse.

Jake Bernstein: Exactly.

Kip Boyle: But there's other ways to think about this too. So we also say, "Well, who's our threat?" Is it nation states, because they have limitless resources and it's not practical to defend against them? Versus hacktivists, script kiddies? That work factor there is kind of in our favor as defenders.

Jake Bernstein: Yeah. Totally.

Kip Boyle: Okay. And the last one, compromise recording. And what this means is that in situations in which preventative controls are unlikely to be sufficient, we should consider deploying detective controls so that if security is breached, if there is an incident, A, the damage might be able to be contained or limited by a prompt incident response, and B, evidence of the perpetrator's identity might be captured. And this is centralized logging, for example. Are we logging all the events? So those are the 10 that come directly from Saltzer and Schroeder's 1975 paper. And I allege that these all are so fresh and relevant to the work that we're doing today. It's stunning that a body of work so relatively ancient in internet years ...
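The compromise recording idea, centralized logging that preserves evidence for incident response, might look like this minimal sketch (event names invented; a real system would ship entries to an append-only log service rather than an in-memory list):

```python
import json
import time

audit_log = []  # stand-in for a centralized, append-only log service


def record(event: str, **fields):
    """Append a timestamped, structured entry to the central log."""
    entry = {"ts": time.time(), "event": event, **fields}
    audit_log.append(json.dumps(entry))


record("login_failed", user="kip", src_ip="203.0.113.7")
record("vault_export", user="admin", rows=10000)
assert len(audit_log) == 2
assert "vault_export" in audit_log[1]  # evidence survives for responders
```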

Jake Bernstein: Well, I think it just goes to show that security is not only about keeping up with the most modern technology solutions. In fact, it's really not.
And I think you can go back even farther and just look at ... I mean, how often do people use the castle metaphor, right? But castle design from the Middle Ages is ... You know, the principles there still apply to modern security, even modern computer security.

Kip Boyle: Yes. Yes. And that's one of the reasons why I think this is a useful toolkit and why I wonder why it's not used more often in various circumstances.

Jake Bernstein: Speaking of ancient history, Kip, are there any other design principles you've thought of that would be worth mentioning?

Kip Boyle: Yeah. Actually there's many. And there's three in particular that I think we should mention in our episode here, because they're very, very relevant.
So the first one is called defense in depth. Now, anybody listening to this episode, I wonder if you've already said to yourself, "Why haven't they mentioned defense in depth yet? Why is that not in the ISO standard? Why isn't that in the 1975 paper? When are they going to mention it?" Well, here we go. I'm going to mention it now. Defense in depth, as you said, is an ancient concept. It actually goes back to the Roman Army, to at least the third century AD. But that's not when I learned about it. When I learned about it was in a book from O'Reilly and Associates.

O'Reilly ... Remember these are the people that publish books with the animals on the covers? Right? So this book, it was called Building Internet Firewalls. It was published in 1995, and the authors are Chapman and Zwicky. And they actually have a huge collection of security design principles in that book. And that was a really important book for me to cut my teeth on, and that's where they talk about defense in depth. And I think it's still relevant. I think it's very relevant. It might be the most popular security principle ever.

Jake Bernstein: Well, and I think for good reasons why, right? I mean, if you don't have defense in depth, then you're relying on essentially one point of failure every time.

Kip Boyle: And sometimes that's okay, because if you go back to compromise recording, which was the 10th principle in the Saltzer paper, remember, it said that if you have a preventative control that's unlikely to be sufficient, then you want to use detective controls and you want to collect evidence of the perpetrator's identity. Because sometimes it's not economical to do defense in depth for everything. And so sometimes you do have to pick and choose, but-

Jake Bernstein: Well, and I'm honestly not sure that I would consider ... I'm not sure that I would say you're not doing defense in depth when you're deploying detective controls and evidence gathering. I mean, I think defense in depth as a concept is just the idea that you have multiple chances to catch the bad guy or find the bad guy or notice him or stop him. I mean, it's really ... You know, it's the whole moat, wall, inner bailey, boiling oil thing.

Kip Boyle: The way I think of it, and I'm not saying I've got a monopoly on this, but the way I think of it is defense in depth is kind of belt and suspenders. That was another little turn of phrase that I learned early on in my career. And I was confused by that when people would say, "Well, let's use a belt and suspenders approach." I remember thinking, "Why the hell would I wear a belt and suspenders? That's ridiculous. It's one or the other." And people who were at this longer than me said, "Well, what if your belt fails and your pants drop? Don't you want some suspenders there?"

Jake Bernstein: Yeah.

Kip Boyle: It's like, "Okay. All right."

Jake Bernstein: You know, at the risk of going off the rails just slightly, I think I want to actually take a moment and push back against anyone using the belt and suspenders concept. And I have a reason for it.

Kip Boyle: Okay, let's hear it.

Jake Bernstein: If I'm a skeptical, thrifty modern business executive and my security guys tell me that they want to take a belt and suspenders approach, I might be like, "Yeah, I don't really see the value in that."

Kip Boyle: Hmm-mm.

Jake Bernstein: But if they say, "Well, we need to take a defense in depth approach because that is how you defend everything and anything," I'm much less likely to push back.

Kip Boyle: Yeah.

Jake Bernstein: And this is just a good way of saying that language and metaphor matters in the business context, because you don't have to ... Just think of the difference that would make. And you can actually see it. You can just envision the conversation going two completely different directions, right? If you're the security engineer trying not to speak in zeros and ones to the CFO, if you say defense in depth versus someone else who says belts and suspenders, I bet there'll be different results.

Kip Boyle: Probably will. But the reason why I brought that up, Jake, is because defense in depth to me means you're going to wear a belt and suspenders in case one fails; that the controls, they're very similar because they're trying to solve the same problem, which I don't want my pants falling down. Now, if you go back to compromise recording, where it says, "Hey, if you've got a preventative control, and it may not work all the time, now you should consider adding a detective control and other controls", to me, that's more of a diversity of defense because you're using different types of controls. And that's the point that I wanted to make.

Jake Bernstein: Interesting.

Kip Boyle: And diversity of defense is actually another principle in the Chapman and Zwicky book. But yeah, I [inaudible].

Jake Bernstein: That's good to know because I would say that I unintentionally squished those two concepts together. I would think of defense in depth as not just redundancy, but diversity as well.

Kip Boyle: Yeah.

Jake Bernstein: But maybe those are separate. I think that's a good point. Okay.

Kip Boyle: I think they can be, and that was the only reason why I wanted to make that point.

Jake Bernstein: Yeah.

Kip Boyle: So defense in depth from the Building Internet Firewalls book ... Which by the way, if you go on the internet and you search for that book, Building Internet Firewalls by Chapman and Zwicky, it's actually free now. So you can read it. And I would recommend that you do so. It's really good. Okay. Now the second additional principle that I want to mention is the one called assume breach. And this is where we manage security on the assumption that one or more of our security controls have already been compromised; even though we love them and trust them, we're just going to assume that somebody's already defeated them. And this principle was popularized beginning around 2000 by our good friend Kirk Bailey, who's the CISO at the University of Washington, who also led the Agora for many years, which we enjoyed attending, and which I am sad no longer exists.

Jake Bernstein: Indeed. I think this one is increasingly important just because there's so many different ways to get into a system, I think.

Kip Boyle: Oh yeah.

Jake Bernstein: I think it's probably not particularly controversial at this point.

Kip Boyle: It isn't, but when it came out, it was. I remember people saying, "Why should I assume breach? Aren't I just giving up on everything that I've done? What sense does that make?" It was a real different way of thinking about security and a lot of people were really uncomfortable with the idea. But I do think it's been normalized. And it has been so normalized that I would even allege that it's led to the third additional principle that I want to mention, which is zero trust.

Jake Bernstein: Oh no, that's a buzzword. Careful, man.

Kip Boyle: Well, it has become a buzzword, hasn't it?

Jake Bernstein: It has.

Kip Boyle: The marketing people have gotten a hold of it. But let me de-buzzword it for you and let me show you how to stay grounded on this principle.
First of all, I want to say that the principle invalidates the idea that we can have an internal trusted network. That's a lot of what zero trust says. And so it's actually taking the assume breach principle and taking it further, I think. Now, where did this come from? Well, I remember back in 2004, there was this group called The Jericho Forum, which came out of Europe, and they talked a lot about de-perimeterization.

Jake Bernstein: Ooh, there's a nice word.

Kip Boyle: Yeah. And I remember hearing that and I was like, "What the hell are you talking about?" And so what they were saying was, even back in 2004, they were saying, "We don't have a perimeter. Our perimeter as an approach to network security has failed because it's just way too easy for people to get in through phishing and social engineering and lack of patches." And they were just saying, "The perimeter has too many holes in it and we're pretending it doesn't leak, but it leaks all the time." And that was another really big source of cognitive dissonance for those of us who were doing the work at that time.

Jake Bernstein: Yeah.

Kip Boyle: Because the failures were often silent. So in the lack of evidence, we were just scratching our head going, "What are you people talking about?"
But I think they were right. They persisted, and eventually what happened is that this whole idea took hold: "We shouldn't trust anything. We shouldn't trust and then verify; we should instead flip the model." And zero trust says that, first, we have to verify that you are who you say you are, and that you deserve to have access to the thing that you're requesting, based on where you're coming from and the profile of the device that you're using, and so on and so forth.

Jake Bernstein: Yeah.

Kip Boyle: So that's what zero trust really is. And I really encourage people to study NIST Special Publication 800-207, because that is a vendor agnostic description of zero trust. Whereas all vendor publications on zero trust I think are biased in favor of their product sets. And no surprise.
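The per-request verification Kip describes, checking identity, device, and entitlement on every access instead of trusting network location, could be sketched like this (all field names are hypothetical, loosely in the spirit of NIST SP 800-207):

```python
def authorize(request: dict, trusted_devices: set) -> bool:
    """Verify every request on its own merits: no 'inside the network' shortcut."""
    checks = [
        request.get("user_verified") is True,               # strong authentication
        request.get("device_id") in trusted_devices,        # known, healthy device
        request.get("resource") in request.get("entitlements", ()),  # explicit grant
    ]
    return all(checks)


req = {"user_verified": True, "device_id": "laptop-42",
       "resource": "payroll", "entitlements": ("payroll",)}
assert authorize(req, trusted_devices={"laptop-42"})
assert not authorize({**req, "device_id": "unknown"}, {"laptop-42"})
```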

Jake Bernstein: Well, and this is [inaudible]

Kip Boyle: But the products are not able to do zero trust in the pure way right now. So please be careful, everybody.

Jake Bernstein: Yeah. That's interesting. And I'm resisting the urge to philosophize about how zero trust is actually just an outgrowth of the concept of defense in depth and defending a castle.

Kip Boyle: Yeah.

Jake Bernstein: Just because you have been waived through the outer gate doesn't mean you're not going to be challenged again later on, particularly if you want to go someplace sensitive, like the keep.

Kip Boyle: Right. Yeah. If you want to go over to the Crown Jewels store or you want to go to the powder room or whatever ... I mean, the gunpowder room.

Jake Bernstein: Yeah, that's what I thought.

Kip Boyle: Yeah. And I want to say one other thing here too, which is that this whole idea of de-perimeterization I think is really crucial to the idea of zero trust.
And when you mention a castle ... What I tell people these days is, and this is very apt given what's going on in Ukraine right now, the style of warfare that's being fought; the war in Ukraine is all about drones, right?

Jake Bernstein: It is.

Kip Boyle: And imagine that you're fighting a defender who lives in a massive fortress, and all you need to do is fly your drone several hundred feet over the moat, over the thick walls, and then drop an explosive on the head of somebody walking around down there. And that, I think, really does explain why network perimeters don't work anymore; because our adversary has drones, has digital drones, and they're just having a field day.

Jake Bernstein: Yep. Castles have not been an effective thing for a long time.

Kip Boyle: Yeah. Not in the real world and not digitally.

Jake Bernstein: Exactly. Okay. So is the episode over or are we headed somewhere to an additional destination with all of this?

Kip Boyle: I hope people haven't abandoned us yet because we've just been reading lists. We are headed somewhere with this. We're not done. We're almost done, but we're not. We're going to try to stick within our typical time boundaries here. So remember, I opened the episode with a comment that I just don't see cybersecurity people using security design principles enough these days. There's a few they throw around, but they really don't draw from all of the tools in the kit. And I want to give you an example.

Jake Bernstein: All right. Is it a good one? I hope it is.

Kip Boyle: Well, you can judge and the listeners can judge whether it's a good one or not, but it has to do with the LastPass incident. At the time of this recording-

Jake Bernstein: Oh boy.

Kip Boyle: At the time of this recording ... I mean, it just seems like every day they drop another shoe. I don't know how many shoes they wear, but it seems like they have way too many. And at the time of this recording, what's happening is we're realizing that everybody who had a vault in LastPass has learned that the vaults have been taken, and they're all now offline; they're static. So whatever you had in your vault at the time of the incident is now in the hands of the attackers. You don't have the ability to change the password on it anymore because it's offline, and there's no multifactor authentication, because that's a mechanism that is enforced outside of the vault.

And so I want to look at this because there's a lot of hysteria on social media right now. People are saying things like LastPass as a product and as a company is bankrupt and they're awful and evil, and that this whole episode proves that the era of online password managers is over and that it's an awful idea and that you should abandon LastPass immediately. And I don't think that that hysteria is justified. I think what's lacking is an objective assessment against these design principles. And so what I want to do-

Jake Bernstein: Well, if you're asking people to slow down and think through something instead of just offer hot takes on social media, Kip, I think you're living in the wrong era.

Kip Boyle: I know. I realize I'm standing and facing the wind. Or you could say I'm punching the rain. That's fair. But still, at the same time, I just got to think there are people out there that are making big decisions right now with enterprise networks who are using LastPass (I use 1Password), and I'm concerned that they're going to make a knee-jerk reaction that isn't going to be really reasonable for them. And so I want to take the open design principle that we talked about, and I want to look at what's going on with LastPass just through that one lens of open design.

But I would allege that if you want to do a really good job of understanding what's going on and whether it's reasonable, you should probably look through each of the lenses of these principles that we've talked about and ask yourself, has LastPass violated any of these principles? And as a cybersecurity vendor, I would hold them up to a very high standard on this. I wouldn't give them-

Jake Bernstein: Oh, sure.

Kip Boyle: ... An easy pass on any of this stuff. I would expect them to know.

Jake Bernstein: Yeah. Okay. So let's do it. And you mentioned open design. That's the one you want to unpack in particular. And I think LastPass claims to use an open design that's been independently reviewed. So let's assume that. What do you think?

Kip Boyle: All right. So if that's accurate, and I think it is, then I would say their product is actually working as designed in this post-breach scenario. And I have to make a comment here that I'm using breach knowing that the DBIR defines breach and incident differently. And so I'm using breach with a little B. Because breach by the DBIR means that the data confidentiality has been violated, and actually, that's not what's going on here at the time of this recording.

Jake Bernstein: Right there, I think there's a lot of people who are going to see LastPass breach headlines and just assume immediately that all passwords are now unlocked and everyone has them, and it's just the end of the world. But that's not true, is it?

Kip Boyle: No, it's not. Words matter. So I wanted to point that out that people are throwing around this breach word and it really isn't a breach.

Jake Bernstein: So for our listeners who haven't, or at this point, by the time this episode airs, may have forgotten some of the details, why don't you just quickly summarize what exactly ... ? And of course, we don't have all the details at this point.

Kip Boyle: No.

Jake Bernstein: It might be interesting to do a postmortem on this one in a year or so. And you can correct me if I'm wrong, but what I think, as you said, is that the vaults have been stolen, right?

Kip Boyle: Mm-hmm.

Jake Bernstein: But the passwords at least ... And we can quibble inaudible Actually, okay, I'll be fair. There are some issues with some of the metadata fields not-

Kip Boyle: Yeah.

Jake Bernstein: They were never encrypted.

Kip Boyle: Some of them, yeah.

Jake Bernstein: Some of them were. But all the password fields are fully encrypted. So even though this information is out there, it's fully encrypted. And I'll just play lawyer for one second here.

Kip Boyle: Okay.

Jake Bernstein: Under most, if not all data breach notification laws that I can think of at the moment, if the data that has been stolen is encrypted, then for purposes of the law, it's not a data breach.

Kip Boyle: Correct. Correct.

Jake Bernstein: And that's kind of what you're getting at here, is that-

Kip Boyle: Yes.

Jake Bernstein: ... We're using the B word, but it's ... Well, one, there is some confidential information, because it was never encrypted, that has been compromised. So one could argue that that makes it a breach. But the most sensitive data, the passwords themselves, are still encrypted. Now, there is an issue there, Kip, and I believe it can be summarized as a question: how strong is your master password?

Kip Boyle: Yes, that's exactly where I want to go to next, which is not only have the vaults been stolen, but we know also that some of the source code for the LastPass product has been stolen as well. And so if you think about open design, then that shouldn't bother you. It should not bother you that some of the source code has been stolen. Because remember, open design implies that your product doesn't rely on the secrecy of the design to protect itself.

Jake Bernstein: Which is important.

Kip Boyle: Yeah, it's crucial, right? Not every product does this. I hear about so many vulnerabilities where the coders are keeping the private keys in the source code, or the private keys are somehow in GitHub and can be retrieved. Right? So that's where designs fail, is where knowledge of the design actually gives the attacker the ability to defeat the controls. Here with LastPass, that's just not what's happening. They've got source code, they've got the vaults, but again, at the time of this recording, no shortcuts have been found in the source code to compromise the stolen password databases.
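The anti-pattern Kip describes, and the open-design alternative, can be sketched in a few lines. This is an illustrative sketch, not LastPass code; the variable and function names are ours:

```python
import os

# Anti-pattern: a secret baked into the source. Under open design, you must
# assume the attacker will eventually read your code, so this line is a breach
# waiting to happen:
#
#     API_PRIVATE_KEY = "sk_live_..."  # never do this
#
# Open-design-friendly alternative: the code can be public because the secret
# lives only in the runtime environment, supplied at deploy time.

def load_private_key() -> str:
    """Fetch the key from the environment; the source reveals nothing."""
    key = os.environ.get("API_PRIVATE_KEY")
    if key is None:
        raise RuntimeError("API_PRIVATE_KEY not set; refusing to start")
    return key
```

The point is Kerckhoffs's principle in miniature: stealing this source code tells an attacker how the system works, but hands over no credential that defeats it.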

Jake Bernstein: And just to be clear, what that means for everyone is that ... Go find the current iteration of the password strength chart that shows how long it takes to brute force through any given length of password or passphrase using current modern technology. And as long as your master password is strong enough, then this shouldn't matter.
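The arithmetic behind the chart Jake mentions is simple to sketch. The guessing rate below is an assumed figure for an offline GPU attack, chosen only for illustration:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(length: int, alphabet_size: int,
                      guesses_per_second: float) -> float:
    """Worst-case years to exhaust every password of a given length."""
    keyspace = alphabet_size ** length
    return keyspace / guesses_per_second / SECONDS_PER_YEAR

# Assumption: 10 billion guesses/second against a stolen, offline vault.
RATE = 10e9

# 8 random printable-ASCII characters (~95 symbols): falls in days at this rate.
short = brute_force_years(8, 95, RATE)

# 16 random characters from the same alphabet: effectively forever.
long = brute_force_years(16, 95, RATE)
```

Every added character multiplies the attacker's work by the alphabet size, which is why a long master passphrase is the whole ballgame for a stolen vault.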

Kip Boyle: No, it really shouldn't. And I'll caveat that too, Jake, by saying, not only should it be sufficiently lengthy and complex, but you should not have used it anywhere else. If your-

Jake Bernstein: Oh, definitely not this one.

Kip Boyle: Yeah. If your master password for your LastPass vault is the same as the password that you used on the genealogy site for your log on, you're in trouble, because there are at least a billion credentials out there. You can go to the haveibeenpwned website, and that website is populated by publicly released credentials, username and password pairs, and if your credentials for getting into your LastPass vault are in haveibeenpwned, well, you'll be pwned from a LastPass perspective.
But that's not LastPass' fault, is it? That's part of the shared responsibility model of cloud security, isn't it? LastPass did a lot, but they cannot force you to set a master password that is super long and complex or whatever. I mean, that's under our control. Right? That's what we have to do.
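You can check a password against haveibeenpwned without ever sending it over the wire, using the Pwned Passwords range API's k-anonymity scheme: you send only the first five hex characters of the SHA-1 hash and match the rest locally. The endpoint is HIBP's documented API; the helper function names here are our own:

```python
import hashlib
import urllib.request

def sha1_parts(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 hex digest into HIBP's 5-char prefix and 35-char suffix."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def count_in_response(body: str, suffix: str) -> int:
    """Parse the 'SUFFIX:COUNT' lines HIBP returns for a hash prefix."""
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0  # suffix absent: not in any known public credential dump

def times_pwned(password: str) -> int:
    """Query the range API; only the 5-character prefix ever leaves your machine."""
    prefix, suffix = sha1_parts(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        return count_in_response(resp.read().decode("utf-8"), suffix)
```

A nonzero count means that exact password appears in public breach corpora and should never be used as a vault master password.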

Jake Bernstein: And let's be even more reasonable here and think about this just logically; everybody agrees that perfect security is impossible, right?

Kip Boyle: Edward Snowden taught us that.

Jake Bernstein: Yes. And I think the open design principle is really just a realization of that phrase, which is ... And here's the thing too; there's only maybe three or four cloud-based password managers that are really popular in the world.

Kip Boyle: Yeah.

Jake Bernstein: So if you're those companies, you've got to know that you're probably under constant never ending forever assault-

Kip Boyle: Yep.

Jake Bernstein: ... Because you're a big target. You're a juicy target. And I think that contrary to the initial hysteria, I think LastPass might eventually come through this looking pretty good, assuming ... And again, there are things that you could argue about. I know you and I started arguing even before we started recording the show, about, "Well, surely LastPass could have requirements for the master password." And they could, but it's ... And I guess maybe I'm even asking myself the question, what is their place and their responsibility? I think that they probably should have a minimum length requirement of eight characters at least, maybe even 10. I don't know that they don't have a minimum length requirement. I'll say this; it would be surprising if I could use a four digit number as my master password in LastPass. That I would put more on them if they allowed that.

Kip Boyle: Okay. You'd say that was, per se, unreasonable.

Jake Bernstein: I would say that's per se unreasonable. But I would say that beyond putting a minimum ...

Kip Boyle: Yeah

Jake Bernstein: And I guess this is kind of the point where people can disagree, but ultimately, your master password is your responsibility.

Kip Boyle: Mm-hmm. I think so.

Jake Bernstein: And I think that can't be overlooked here.

Kip Boyle: No. Not at all. And I also don't want to overlook the fact that LastPass ... I'm not trying to say that they couldn't have done better. I think they could have done better. They're a seller of cybersecurity products and services. My expectations of them are higher. And I mentioned in passing that I don't use LastPass. We used to use LastPass. We actually started using LastPass at Cyber Risk Opportunities a few years ago, and we recommended it to our clients. And we did that because we thought that LastPass was a particularly good team-based password manager at the time.

But what happened was that we watched them over time and we really realized that while they were adhering to some of the principles like open design, there were other principles that they were not following. So for example, a couple of years ago, we found out that they had added seven different embedded third party trackers into their Android mobile version of LastPass. And I just felt that was an unnecessary increase in their attack surface. So there's a bonus-

Jake Bernstein: Bonus design principle. Yeah. There you go.

Kip Boyle: ... Violation. But that's what I want people to realize, is that when you see a vendor violating these principles, and even if nothing bad happens in the moment, that should pop a red flag for you. And so that popped a red flag for us. And that, along with some other things, we just said, "You know what? We really don't think the people running LastPass right now are doing a very good job of adhering to these design principles. And even though nothing bad's happened yet, we're getting off this platform. We're going to go to a different one."

And 1Password at that point had implemented some additional functionality that made them a much, much better choice for small teams to use. And so we went to 1Password. And I scrutinized them in the same way. I asked myself, "Are they using an open design?" and so forth. "What's their attack surface like?" And they've made compromises as well, but I can live with their compromises better than I can live with LastPass' compromises.

Jake Bernstein: Yeah. Well, I mean, it's all a compromise, right? There's always compromises. It's just inherent in security.

Kip Boyle: Oh, absolutely.

Jake Bernstein: It's always going to be that way.

Kip Boyle: Yeah. So let's take Windows for a real quick example. In Windows, their mechanism is called the 'security reference monitor.' Anytime you try to access a resource, the security reference monitor is going to check you and ask, "Are you allowed to have access to this resource?" Well, the security reference monitor used to be implemented in one part of the operating system in a way that actually crippled performance because it was being called way too much. And so, as a sacrifice to improve performance, they re-implemented it in a different space.

And I'm not going to go into Windows' internals to explain all the details for you, but just suffice it to say that in order to increase performance, they actually made a compromise to their central security functionality. And I remember when they did that, and I remember thinking, "I do not like that at all. I do not appreciate that. That's a compromise I don't like." Anyway, so yes, it's happening.

Jake Bernstein: And Kip's been a Mac user ever since.

Kip Boyle: Yeah.

Jake Bernstein: Actually I have no idea what the history is there, but that is funny.

Kip Boyle: Actually my first Mac was an SE/30 back in ...

Jake Bernstein: I don't even know what that is. Wow. Okay. There you go.

Kip Boyle: Yeah. Yeah. It was a Mac SE/30. That was a ... Go look it up, kids. It was fun. But anyway, so over at Cyber Risk Opportunities, we don't use LastPass, but it's not because the vaults got out of control, because I still don't see any evidence that people should abandon LastPass just because some source code got stolen and the vaults got out of control. Because if you have a really great master password set, you're fine. You're fine.

Jake Bernstein: At least until the Quantum Revolution renders all modern encryption useless.

Kip Boyle: Yeah.

Jake Bernstein: But that's a different issue entirely.

Kip Boyle: Yeah.

Jake Bernstein: We'll save that.

Kip Boyle: So we're going to keep using cloud-based password managers because that's the best option for us as a team. I know other people think they're anathema. Great. I'm glad you have options so that you can use stuff where your password vault is never online. Choose what's right for you, but don't choose out of hysteria.

Jake Bernstein: Yep. And keep in mind too that whatever compromises you have to make to accept a cloud-based password manager, they're still better than no password manager at all.

Kip Boyle: I think so.

Jake Bernstein: Almost guaranteed.

Kip Boyle: Yeah. There's no silver bullet here. We just have to choose the best that we can come up with of all the competing options.
I don't have anything else to say.

Jake Bernstein: No. Well, we took that one to the edge of our longest time. So let's go ahead and wrap up this episode, Kip.

Kip Boyle: All right. I am now officially wrapping up this episode of the Cyber Risk Management Podcast because today we talked about an incredibly useful toolkit that I don't see cybersecurity people using enough these days, and that's the security design principles that we reviewed. Thanks, everybody, for being here, and we'll see you next time.

Jake Bernstein: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at cr-map.com. Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cyber-security consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and a litigator.