EP 149: The Tools and Rules of Digital Trust

About this episode

January 16, 2024

How do you take a very important, yet ethereal, idea like digital trust and make it more concrete and actionable? Let’s find out with your hosts Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.

Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities, and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at CR-Map.com and Klgates.com.

Jake Bernstein: So Kip, what are we going to talk about today in episode 149 of the Cyber Risk Management podcast?

Kip Boyle: It's almost episode 150, can you believe it? I mean-

Jake Bernstein: I can't believe it actually, but we can save our self-congratulatory back-patting for next episode.

Kip Boyle: All right. All right, all right. Okay, but in episode 149, what we're going to do is take a look at this ethereal idea of digital trust. It's coming up more and more. I've actually given a couple of presentations on digital trust to a couple of ISACA chapters at this point. It's apparently a theme of the year for them, and what I've realized is that we really need to make digital trust something a bit more concrete for people, because it's a very, very important issue, but it's very difficult to get your arms around. And I feel like I've been doing a pretty good job explaining this lately, so I thought I would share it with our audience.

Jake Bernstein: That sounds great. Now, Kip, before I can even have a conversation about this, I need to know what you mean by the term digital trust. And I'm laughing because you wrote this script, and I'm realizing that we're now able to write a script that sounds like the other person pretty much exactly. I mean, come on, you do know I have to define terms. It is, in fact, what I do.

Kip Boyle: I can hear your voice in my head when I write these things.

Jake Bernstein: Me too, actually. It's kind of funny. I guess that's what 149 episodes will do.

Kip Boyle: It can in the best circumstances.

Jake Bernstein: Absolutely. So, okay-

Kip Boyle: So of course I expected this reaction from you.

Jake Bernstein: Of course.

Kip Boyle: And I'm ready. I'm ready.

Jake Bernstein: That's good.

Kip Boyle: Okay. Now we're going to have to build up to this because digital trust is, it's cloudy. It's like, "What is this?" It's this concept, it's a think piece. So let's start making it more real by listening to a couple of rhetorical questions, and you can respond if you want. So two rhetorical questions. Here we go, everybody. First of all, how do you trust an algorithm that's making thousands of decisions a second when you don't even know how it works? All right, think about that. And question number two, how do you trust a company that is silently tracking your movements every day, collecting data about you, and then not telling you what that data is or what they do with it?

Jake Bernstein: And I would go one step further with that second question and say, not telling you what they do with it in any meaningful detail. Because I think, unfortunately, I'm not sure these can be rhetorical, even though they kind of... I think we need to be able to answer these somehow, someday.

Kip Boyle: We do. Yes, yes, yes.

Jake Bernstein: And I-

Kip Boyle: And I'm saying rhetorical because I don't want the episode to be a deep dive on these specific questions, but instead I want the episode to be a deep dive on, how do you answer questions like this?

Jake Bernstein: No, that's exactly it. I mean, I think they are rhetorical, at least right now, particularly the first one. But really, Kip, how do you trust? If you just stop right there, it's already a complicated question and answer. And trust is one of those things, I'm not going to say it's unique because I haven't thought deeply enough about it yet, but trust takes a long time to build and only a microsecond to lose. And it's temporal, so it happens over time. It's something that usually has to be earned. It isn't necessarily freely given, although some people can freely give trust. And it scales in funny ways. Oftentimes, trust matters most with respect to just a couple of other people on the planet for an individual. Yet at the same time, we also have to trust huge numbers of people and organizations doing stuff that affects us on a daily basis.

Kip Boyle: Yeah. Institutions, right?

Jake Bernstein: Institutions-

Kip Boyle: I'm sure you've seen the reports from the polling organizations that say, "I'm going to show you a list of institutions, you tell me how much you trust them." And consistently over the last several decades in the United States, what you see is Congress at the bottom of the list, least trusted, politicians, that sort of thing. And who's most trusted? Well, it tends to be scientists, educators, military officers. So you've got a-

Jake Bernstein: All of which may-

Kip Boyle: ... really interesting continuum.

Jake Bernstein: None of which may still be trusted in this current day and age.

Kip Boyle: Well, trust is an issue, and now-

Jake Bernstein: I feel like we're in a low trust world right now.

Kip Boyle: Absolutely. Absolutely. And if you go look at those polls, I'm not even saying that the people that are considered most trustworthy are even getting high marks, they're just... Nobody's trusted more, but-

Jake Bernstein: It's relative.

Kip Boyle: ... but they're not fully and completely trusted without question. So if trust in the real world is so difficult, hard to earn, easy to lose, and all of those characteristics that you named off, well, how much more difficult is digital trust? Because you can't even see these algorithms, you can't see the databases and the data stores and that sort of thing. It's all masked from view. When you check out of the grocery store, you just feel like, "I'm buying food." But it's more than that, there's actually-

Jake Bernstein: I was just going to say, you're actually trusting a whole bunch of machines to, one, get the prices correct. A lot of the time you just assume they match what you saw on a tag. I'm literally realizing just now that there's a lot of implied trust. I don't necessarily double-check to make sure, unless it's a sale item, where I'm like, "Oh, did you get the coupon?" That you might do, but there's nothing-

Kip Boyle: You don't even get a paper tape anymore, inaudible.

Jake Bernstein: Not necessarily, no. So again, you're trusting the machine to scan the products and add it up-

Kip Boyle: But then what are they doing with the data? Well, they're putting it into a huge data warehouse and they're crunching it and they're trying to understand, "Who is this person with this phone number?" It's Kip, but whatever. And then, "Should we be sending him coupons for this or that or the other thing?"

Jake Bernstein: Yeah, it's true.

Kip Boyle: I mean, so there's this massive amount of stuff going on in the background that's not even being revealed to you. So digital trust is opaque, and it's difficult to even realize that there's a trust situation happening.

But let me tell you a story about a real world incident. So we're going from a couple of rhetorical questions, now I want to give you a very specific example. So we're recording this in 2023, at the end of the year. But about six months ago, in July, the city of Hayward, California declared a state of emergency because they had a cyber attack and it degraded their emergency services dispatching capability. So if you're not in the United States and you're listening to this episode, here in the US when you need police or fire assistance or you have a medical emergency, you pick up the phone and dial 911. And that's supposed to work anywhere, in any physical place, you should be connected to a dispatch service.

But in Hayward, California in July, the cyber attack made calling that number and getting help extremely difficult because while you could get through, in this particular case, the dispatch center had none of its typical systems available. So they were really struggling to figure out how to answer these calls from citizens. And then, how do I get police or firefighters or EMTs to respond to these calls when I don't have my normal systems? Now, why is this digital trust? Well, because a cyber attack caused this issue, and the real problem was that citizens had no warning that the emergency services dispatch could fail in this way, or that it did fail. There was no way to estimate when the systems would be fully restored. We don't really know if anybody died or if anything really awful happened beyond whatever awfulness was going on when they called 911. We know in a heart attack situation, for example, the faster you can summon medical assistance, the more likely the person is to survive, and any delay severely decreases the chances that they survive the heart attack.

Jake Bernstein: I think we can assume, time is of the essence. If you're dialing 911, time is of the essence, period.

Kip Boyle: Absolutely, absolutely. Everything else falls to the wayside, I need assistance now. So with respect to digital trust, we expect that all these emergency services will use technologies reliably in order to serve us, and that they're going to do other things like protect private information that we give them, and that the operators of these systems are going to act responsibly and not abuse those systems. But in Hayward, California, digital trust failed because this life-saving service stopped working and could not function.

Jake Bernstein: Now, Kip, you must have known that it would not be difficult to get me going on this topic, simply because I see a lot for us both to discuss in our incredibly detailed script here. But I'm thinking about this and I'm thinking to myself, as you know, I have a philosophy background from college.

Kip Boyle: Yes, you are a scientist-philosopher.

Jake Bernstein: A scientist-philosopher, lawyer. And I'm thinking, "Okay, who was trusting what, and where was the digital trust in this example?" And I'm going to say something, and I'm curious if you agree. I do not think that the trust that the citizens of Hayward, California were putting in their 911 system was the digital trust that we're talking about. I think that was standard trust in a government service. However, I absolutely think there was unwarranted digital trust being placed by the 911 employees and the city of Hayward with respect to its 911 systems.

Kip Boyle: Well done.

Jake Bernstein: That is where the digital trust is. I'm not sure if you meant for me to get there or not, but that is what I see here: the digital trust isn't necessarily coming from the people who are using the services. And the reason I got there is I was thinking about this and realizing, I don't care how the 911 operator answers the phone or sends someone to get me, I just care that they pick up and someone arrives. It does not matter to me in the slightest if they're doing it with messenger pigeons, as long as those are fast. I care about speed and that's it. Now, naturally, this being 2023, it's quite normal that whatever old literal switchboard systems were in place in the '40s, '50s, '60s, whenever 911 was invented, I actually do not know the answer to that question, that's all gone now. These are all digital systems, voice over IP, all kinds of stuff. And yeah, there was a lot of trust there.

Kip Boyle: Absolutely. So yes, you've hit on something that I think is really important, which is that the people we trusted to run this dispatch center did in fact place trust in the systems that they had built to serve them. And inaudible-

Jake Bernstein: Unwarranted digital trust.

Kip Boyle: Unwarranted digital trust-

Jake Bernstein: Unwarranted digital trust.

Kip Boyle: I can hear you writing the complaint as we speak. Now here's the thing though, I would allege that trust has a characteristic called transitive trust. In other words, if I trust you, Jake, and Jake says, "Hey Kip, let's record the next podcast episode using this brand new service that I just found out about." And he's-

Jake Bernstein: Oh, you mean-

Kip Boyle: ... like, "I have no idea what this service is, but I trust Jake so I'm going to trust this service."

Jake Bernstein: I think you have this backwards, Kip. I believe what actually happened in the real world was Jake trusted Kip to use a new service and the first time, actually no, randomly the second time we used it, Jake had a God-awful echo to deal with the entire time. This is what happened.

Kip Boyle: It's true.

Jake Bernstein: It is true.

Kip Boyle: It's true. But nonetheless, we are talking about transitive trust, right?

Jake Bernstein: It's a real thing, absolutely.

Kip Boyle: It is a real thing. So you've got people running a dispatch service that deals in life and death trusting some technologies, but the people who are relying on the service, the citizens, well, they're also trusting it. They don't necessarily realize they are, but in fact they are through transitive trust.
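
To make transitive trust a bit more concrete, here is a minimal sketch in Python of the idea being described: if you treat direct trust relationships as edges in a directed graph, transitive trust is simply reachability. All of the names below are hypothetical and invented for illustration; this is not how any real 911 system models trust.

    # Minimal sketch: transitive trust as reachability in a directed graph.
    # All names are hypothetical, for illustration only.
    from collections import deque

    # Who directly trusts whom.
    direct_trust = {
        "citizen": {"dispatch_center"},
        "dispatch_center": {"dispatch_software", "phone_network"},
        "dispatch_software": {"cloud_host"},
    }

    def trusts(truster, target):
        """Return True if truster trusts target, directly or transitively."""
        seen, queue = set(), deque([truster])
        while queue:
            for neighbor in direct_trust.get(queue.popleft(), set()):
                if neighbor == target:
                    return True
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(neighbor)
        return False

    # The citizen never chose to trust the cloud host, but transitively they do:
    print(trusts("citizen", "cloud_host"))  # True

The sketch shows exactly the exposure the hosts are describing: the citizen's trust reaches systems they have never heard of and never consciously agreed to rely on.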

Jake Bernstein: You know Kip, this makes me want to record an episode on zero trust and how it really isn't zero trust, because transitive trust is necessary for the world to function.

Kip Boyle: It is. Oh my gosh, yes, absolutely. Yep. It happens all the time. I could even do a whole episode on how Windows authorization happens behind the scenes using transitive trust properties.

Jake Bernstein: Oh, yes. I mean there is-

Kip Boyle: I mean, it's embedded in all our systems.

Jake Bernstein: It's embedded in all systems. I mean, I think I get it. I think I know what we're talking about. We've established that it's really important that it does happen. I think there is a-

Kip Boyle: It's not just ethereal anymore.

Jake Bernstein: It's not just ethereal. I do think that one thing is for certain, there is a great deal of unwarranted digital trust out there. And-

Kip Boyle: And I'm going to give you some more examples.

Jake Bernstein: You will. And I'll just say that I think one other way to conceptualize our role as cyber risk managers is to say, our goal is to make your digital trust warranted, not unwarranted. I just thought of that, but that sounds pretty good, doesn't it?

Kip Boyle: It does. I like it. All right, so now let's get to the meat of where we're trying to get to, which is how do you answer rhetorical questions like the ones that I pitched at the top of the episode? So let's go a little bit further and let's talk about a formal definition of digital trust. Now this comes-

Jake Bernstein: Oh, goody, I love it.

Kip Boyle: I know you do. Satisfaction delayed, but satisfaction granted. The World Economic Forum provides us with a definition. Now, they're not the only ones who have tried to define it, but I like theirs. They say that digital trust is the expectation that digital technologies and services, and the organizations providing them, will protect all stakeholders' interests and uphold societal expectations and values. That's their definition. What do you think of that?

Jake Bernstein: It's a definition. I like it. I think it's an interesting definition because it's pretty broad. And I think there are actually a lot of ways you could define it, but I think this is an accurate one. It is the expectation that digital technologies and services, and through the transitive property, the organizations providing them, are going to protect our interests.

Kip Boyle: And so the story that I told you about the city of Hayward, California in July 2023, I think it-

Jake Bernstein: Oh, it's absolutely an example. It absolutely is. And it says here that digital trust involves several interconnected elements. The security of systems and data, that tracks. Privacy of data, absolutely. Transparency of operation, I'm going to come back to that one. Yes, but that's also fascinating. And then reliability, which, absolutely. So right away here we have, okay, this is super interesting, Kip: the CIA triad, confidentiality, integrity and availability. I think reliability and availability are, in this case, pretty much the same thing. I mean, not exactly, don't come at me, but I'm just going to say-

Kip Boyle: It's synonymous for our purposes now.

Jake Bernstein: Yeah, pretty close. I think that privacy is obviously confidentiality. Security is obviously all three of them. But what I'm pointing out here, which is interesting, is that the CIA triad does not include transparency of operation or, really, accountability when things go wrong. And that's fine. I mean, I'm not trying to make them all match up, but I think it's fascinating to note that when we talk about cybersecurity, we're almost always talking about the CIA triad. And there are two things here that are clearly important for digital trust, but they are beyond the experience or the expected expertise of security practitioners. And that is perhaps one reason the world is struggling a little with digital trust.

Kip Boyle: Well, that's one reason. Certainly that's one reason. And I love the fact that you saw that there's an overlap here between the work that we do and the characteristics of digital trust. But I would also say, think about ordinary citizens, the people who just call 911. And to your point, well made earlier in the episode, it's like, "I don't care how you do it, just send help. Whatever you do." So ordinary citizens have even less of a chance, I think, of understanding these things-

Jake Bernstein: They do.

Kip Boyle: ... which is what makes it more ethereal and difficult for them to grapple with. And I'm going to give you another example later in the episode where I think this is really going to shine.

Jake Bernstein: So, okay, let's move on. I know you have another example, a digital trust history lesson here. And I want to know: why aren't all of these things built into the internet? I think about how so many problems have arisen because the internet is just not built to be trustworthy.

Kip Boyle: And yet it's trusted, unwarrantedly.

Jake Bernstein: Oh yeah, definitely. What's going on here? Why are we in this position?

Kip Boyle: I think a lot of people in our audience probably are not surprised to hear you say that the internet was not built with security in mind. Most people know that already. But I want to expand on that a little bit. The internet actually started in the late 1960s as the ARPANET, run by ARPA, which later became DARPA, and it has gone through a lot of name changes and architecture changes since. In the mid to late 1980s it was still the ARPANET, and it was very small. In fact, the Domain Name System wasn't even in common use. So when you got on the network, if you wanted to know what systems were on it, there was no Google. All you did was download a hosts file, and listed in that file was every single computer connected to the network. So you had a static directory that the computers would use in order to find each other.
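
To see how primitive that static directory was, here is a small hypothetical sketch in Python. The addresses and hostnames are invented for illustration; the point is just that "resolving" a name meant scanning one centrally maintained file, with no network query at all.

    # Hypothetical sketch: pre-DNS name resolution from a static hosts table.
    # Addresses and hostnames below are invented for illustration.
    HOSTS_FILE = """
    10.0.0.5   mit-ai
    10.0.0.9   sri-nic
    10.0.0.12  lbl-csam
    """

    def resolve(name):
        """Resolve a name the pre-DNS way: scan the whole table locally."""
        for line in HOSTS_FILE.strip().splitlines():
            addr, host = line.split()
            if host == name:
                return addr
        return None  # unknown host: not on the network, as far as you know

    print(resolve("sri-nic"))  # 10.0.0.9

Every machine trusted that one file to be complete and correct, which only works in a community small enough that everyone knows everyone.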

Now, who was on the ARPANET at the time? Well, it was mostly researchers, people in academia and the military and military contractors. Now, I've mentioned The Cuckoo's Egg before, which is a book by Cliff Stoll. And I think his comments in that book about digital trust, though he didn't use those words, are relevant here to understanding how it is that we have all this unwarranted digital trust going on. And so I thought it would be useful to crack his book open.

So what happened was that as he's writing his book and talking about the intruders in his network, what he said was that there weren't that many of these systems. They were tied together on a national basis, internationally in fact. And the systems administrators didn't bother locking down their systems because, even though there were security capabilities of different types in these systems at the time, the possibility of bad actors using them just didn't cross their minds, because the community was small and everybody had a similar purpose.

Jake Bernstein: Well, and it probably didn't help matters that the skill set needed to do anything on that network would've been exceedingly rare at that time as well.

Kip Boyle: Mm-hmm, mm-hmm. So that's another barrier to entry. So the systems managers, the systems administrators, just couldn't conceive of anybody getting onto this network and then doing something bad, because in their minds, this was a rare and valuable resource. If I wanted to do data analysis back in the 1980s and I had a tape of all this data, let's say I'm an astronomer and I'd collected all this data from an observation and wanted to analyze it, I wanted to hang this tape on the fastest computer I could find, and maybe that computer was thousands of miles away. And there was a time when you would have to mail the tape or hand carry it. Now, when the internet starts to come into play, I can transfer the data and use a remote computer which maybe had a much faster processor than the one I have at my institution.

And so everybody was shocked when it became known that people were abusing the ARPANET and causing trouble, because it was such a valuable resource. And so Cliff wrote in his book about how violated he felt, and the way he wrote about it almost made it feel like somebody had broken into his house and used his toothbrush and put it back on the rack. I mean, he took it really, really personally. And so I really keyed in to his description. I read his book a long, long time ago, but when I started to dig into digital trust, it just popped right back into my head, because I thought he did such a wonderful job of describing: what's going on here? Why did we get into this situation? And not much has changed.

Jake Bernstein: Well, I mean, there's a very simple analogy to be made here too. The next thing you're going to say, or I guess I'm supposed to say it: it's difficult for trust to scale. And I think that is very, very important. And what it brings to mind is really small town America, even today, where you know everybody. I would be willing to bet that the people on ARPANET probably all knew each other by name.

Kip Boyle: They did. They did. Or by reputation.

Jake Bernstein: Or by reputation. And certainly at the very beginning by name, because there just weren't that many people doing it.

Kip Boyle: That's right. It was a small virtual town and nobody locked their doors.

Jake Bernstein: Exactly, that's exactly my point: when you have a small number of people, trust is easier to have because you all know each other, you're all much more motivated, I think, to maintain that trust, and there's a safety that comes with it, and it's mutually reinforcing. And I think tribalism is embedded in our genes. It's just there. So-

Kip Boyle: We stick together. We protect each other. We support each other.

Jake Bernstein: We stick together. Now, just like it didn't take us as humans very long to invent warfare, I'm sure back in the Paleolithic, Neolithic, different tribes were coming in, competing over certain things. It shouldn't have been a surprise that somebody would eventually misuse a network. After all-

Kip Boyle: Well, it was.

Jake Bernstein: Well, no, I'm sure that it... Yes, trusting a handful of users on ARPANET, though, is obviously much different than trusting tens, if not realistically hundreds of millions, billions really, of strangers on the World Wide Web. And I think what Stoll was calling us to do with his book is to take the threat of scams, misinformation campaigns, and cybercrime seriously. Because once digital trust erodes too much, how will we collaborate with each other online? How do we even use the internet?

And this reminds me of what I was thinking when you were talking about just going through the grocery line. We also trust that the computer is going to debit from my bank account only the amount of money that it shows on the screen, and that that number is correct. Imagine, how would you do anything if you couldn't trust that the grocery store math was correct, if you couldn't trust that the bank was only going to take the amount that it showed on the screen? I mean, literally, commerce would come to a screeching halt. We would go back to guarded little caves full of our gold pieces, carefully weighing them out on a scale, preferably with big armed guys hanging around all the time to protect it. We need trust. And-

Kip Boyle: And we have scaled trust in payments, we've scaled trust in different ways.

Jake Bernstein: We have. We have.

Kip Boyle: But digital trust, scaling digital trust is so difficult.

Jake Bernstein: Yes. And so we continue to live in many ways in a digital frontier, a nearly lawless community, at least in some sense, with little access to fully functional digital equivalents of the police, firefighters, or other protections that we enjoy in the real world. That is absolutely true. And I think that we forget that at our peril. And once again, I'm going to ask-

Kip Boyle: Yeah, we're like a giant city with a hundred million residents, but with no law. I mean-

Jake Bernstein: I mean, you didn't need police in ye old, quintessential tiny, small town because you all knew each other and you'd have maybe a sheriff who would take care of outsiders who came in, but you don't even necessarily need that. I mean, if you go back far enough, you can imagine how just a few of the strapping young men in any given small village would just keep an eye on a stranger. Once you get to a city, you can't live that way.

Kip Boyle: Right, you cannot. And so I think of The Andy Griffith Show from the 1960s. I mean, that's Mayberry. That's small town policing. It's a great example. But the internet is not like that. The internet is already this enormous metropolis. And without digital trust, I mean, we just can't get anything done.

Jake Bernstein: It's funny, we used to say that the internet is the Wild West, but the Wild West was mostly large open spaces with tiny pockets of people. I don't think that's an accurate representation of the internet, and it hasn't been for many, many years now. This is teeming 1980s New York, where crime is off the charts, policing is problematic at best, and people are scared to go out at night.

Kip Boyle: Yeah, I'd say it's less Mad Max and more Escape From New York.

Jake Bernstein: Yeah, yeah, totally. So we've talked in the past about how much cyber crime is going to grow into this huge industry. If that's true, how long can we take digital trust for granted before we begin to lose faith in our digital services? I mean, I would argue that the last couple of years are really about digital trust gradually being lost from the users and the market.

Kip Boyle: Yeah. And unfortunately, I think it's just getting worse. In 2015, the global losses due to cyber crime and cyber failures were something like half a trillion dollars. For 2025, the estimate is 10 trillion. And I just saw some new data the other day that said that in another three to five years, the estimate is going to double, from 10 trillion to 20 trillion.

Jake Bernstein: By the way, I just want to point out for listeners that even though we're recording this in December of 2023, that means that 2025 is a year and ten days away.

Kip Boyle: One year. Yeah, a year and ten days.

Jake Bernstein: Something along those lines.
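
As a back-of-the-envelope check on those figures (taking the episode's dollar estimates as given, not as independent data), the implied annual growth rates work out like this:

    # Sanity check on the growth rates implied by the figures quoted above.
    # The dollar amounts are the episode's estimates, not independent data.
    def cagr(start, end, years):
        """Compound annual growth rate."""
        return (end / start) ** (1 / years) - 1

    print(f"$0.5T (2015) -> $10T (2025): {cagr(0.5, 10, 10):.0%} per year")  # ~35%
    print(f"$10T -> $20T over 4 years:   {cagr(10, 20, 4):.0%} per year")    # ~19%

Interestingly, a doubling over three to five years would actually be slower growth than the roughly 35% per year implied by the prior decade's figures.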

Kip Boyle: So I mean, that's one way to measure digital trust, how much crime is going on, and it's only exploding. Okay, so bear that in mind as we continue here with what we do about this digital trust. Well, let's continue with the World Economic Forum's definition. They say that digital trust is built using two main components: one is called mechanical trust and one is called relational trust. I don't like those terms for what we're trying to do here because they're still a little too abstract. So instead of mechanical and relational trust, I want to just talk about tools and rules, all right? Tools and rules, because I think that's easy to remember. Now, what are tools? Tools, or mechanical trust, are mechanisms that deliver predefined outputs reliably and predictably.

All right, so think about the 911 service. When it was cyber attacked, it wasn't operating reliably and predictably. But I have another example that I want to give you because I think this one's going to be very visceral for everybody, because not everybody calls 911 every day, but something a lot of people do every day is drive their car. And so think about your car as a tool, it's mechanical and do you trust it to stop when you decide that you don't want to go so fast anymore? Speeding down the highway, you put your foot on the brake, you don't even really think about it any more unless you're a brand new driver and you just expect that pressing on the brake pedal will reliably and predictably slow your vehicle down. So that's mechanical trust. You have a tool, it's called a brake system, and when you want to use it, it functions. Am I doing a good job of-

Jake Bernstein: Yeah, no, I mean that's-

Kip Boyle: ... explaining it so far?

Jake Bernstein: That is absolutely accurate. It's making me recall a time almost 20 years ago where a firearms instructor told me that a gun is a mechanical device and it will fail you when you need it most. And I still remember that exact quote because that's really what, if you're in the military or a police officer, that's going to be pretty relevant to you.

Kip Boyle: It's got a critical need detector and it fails as soon as that critical need arises.

Jake Bernstein: But I think this is-

Kip Boyle: That's why you got to keep everything clean.

Jake Bernstein: Yeah. No, that's right. But this is not that different on the digital side. You can't have a... The defining property of a functioning cryptographic hash function, like SHA-2, SHA-1, even MD-

Kip Boyle: inaudible 56.

Jake Bernstein: Even MD5, though it's been cracked, is that it always delivers the same result, given the same input.

Kip Boyle: Even though it's a black box.

Jake Bernstein: Even though-

Kip Boyle: And even if you know what the algorithm is, you have secret keys and that sort of thing. But let's continue. Now, if a system is secure and it performs predictably and reliably, then individuals will be more willing to use it. But the outputs have to be predictable and reliable, and to the extent that they're not, then trust erodes, right?

Jake Bernstein: Yep.
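
Here is a minimal demonstration of the determinism Jake is describing, using Python's standard hashlib module (SHA-256, one member of the SHA-2 family he mentions):

    # Mechanical trust in miniature: same input, same output, every time.
    import hashlib

    a = hashlib.sha256(b"cyber risk management").hexdigest()
    b = hashlib.sha256(b"cyber risk management").hexdigest()
    c = hashlib.sha256(b"cyber risk management!").hexdigest()

    print(a == b)  # True: identical input always yields an identical digest
    print(a == c)  # False: any change to the input changes the digest entirely

That predictability is what lets us treat the function as trustworthy even though, as Kip noted, it remains a black box.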

Kip Boyle: So let's think about facial recognition. Let's think about the mechanical trust of facial recognition. So we can use it in a lot of different ways. I use it every day, multiple times every day to gain access to my iPhone because I've got the face-

Jake Bernstein: Face ID.

Kip Boyle: ... face ID enabled. And I really don't have any problems with it. Unless somebody picks it up, looks at it and it unlocks, and I'm like, "How did you do that? It should only unlock for my face, why did it unlock for yours?" But I haven't seen that and I've never heard anybody complain about it. So facial recognition in my iPhone seems mechanically to be predictable and reliable, and so I trust it. But in 2019, the city of San Francisco banned the use of facial recognition at a community scale because they didn't trust it. So what they said was, "Look, nobody can use facial recognition on people who are on public streets and public sidewalks. You cannot surveil them and try to discern their identity based on this digital technology."

Jake Bernstein: So Kip, I have to ask if you've been perusing the FTC news over the last literal few days? I bet you haven't. I bet you don't know yet what I'm about to tell you, which is-

Kip Boyle: Lay it on me.

Jake Bernstein: ... which is that this week, the FTC, in a first, settled with Rite Aid, the pharmacy chain, over its use of facial recognition biometric technology in stores to track shoplifters and-

Kip Boyle: To predict.

Jake Bernstein: ... and to predict. And that use has been banned for at least the next five years, along with a whole host of other requirements. So this is a really good example because it's happening all over the place constantly. And I think what's interesting is that now you have the Federal Trade Commission coming out and saying, "You can't use this." This is not just the city of San Francisco, now it's nationwide, basically.

Kip Boyle: That's right. So the city of San Francisco, 2019, it's now four years later, almost five years later, and we continue to say that facial recognition used on a community scale is untrustworthy. It's not predictable, it's not reliable. The New York Times had an article about a guy named Randal Reid from 2022. He lived in Atlanta, Georgia, but he got arrested because some cops in a completely different state, one he had never even visited once in his entire life, relied unwarrantedly on facial recognition technology to accuse him of committing crimes, stealing women's purses in particular, and they had the Atlanta PD arrest him and throw him in jail. And this guy had to fight against this misplaced trust that the police had put in a facial recognition system.

Jake Bernstein: Yeah, that's not going to work.

Kip Boyle: And so he had no transitive trust in it because he was innocent, and he was exonerated, he was released, blah, blah, blah. But I mean, wow. I mean, we are getting so many examples of digital technologies where we cannot establish mechanical trust.

Jake Bernstein: Yeah, absolutely. Okay, noting the time on this episode. Let me take a swing at explaining relational trust.

Kip Boyle: Okay, because remember, there are two things here for digital trust: rules and tools. I just talked about tools, which is mechanical. So you talk about rules, which is relational.

Jake Bernstein: And which is the heart of law, the rule of law. So these are the rules-

Kip Boyle: That's why I gave this one to you.

Jake Bernstein: I can tell. It's a good choice. So these are the rules for when and how to use the tools. If people don't believe that we're all playing by the same rules, even though the technology might be trustworthy mechanically, our trust in that system is going to be broken as we just saw in that facial recognition example. But let's also ask, what good are brakes on a car that work if we don't have shared agreements about when to use them on the road?

When drivers see a red light on a public road, the red light means that we have to stop. If people don't agree on that, then it doesn't matter whether our brakes are mechanically trustworthy or not, because we'll press the brake pedal and still get sideswiped or rear-ended by someone who doesn't agree with stopping at a red light. Therefore, we need similar shared agreements on our digital technologies. It needs to be clear when they'll be used, where they'll be used, why we're using them, how they're useful, and that they won't be abused.

Kip, this sounds an awful lot like regulation and law. In order to trust a technology like facial recognition, we will therefore need rules and agreements about how it can be used. And it's funny, we think that facial recognition technology is creepy, I think possibly even beyond creepy, scary, because at this point we don't trust, I would say, anyone not to abuse it, governments, the private sector, anyone. And I'm not-

Kip Boyle: Right. I mean, right now we're just talking about erroneous applications. What about abuse?

Jake Bernstein: So that's the thing is that's why these are separate, right? Facial recognition could work perfectly, and we still might not have the relational trust in order to ultimately have digital trust.

Kip Boyle: Right, right, which is why I gave the example of car brakes. We trust our car brakes now; in the 1950s, nobody trusted car brakes because they were just inherently untrustworthy. But now they're really, really good. Mechanically, we trust them.

Jake Bernstein: Is that true? They didn't actually work in the '50s?

Kip Boyle: Well, they were very unreliable. They would fail-

Jake Bernstein: That's crazy to me.

Kip Boyle: Yeah, they would... Trust me on this, I did the research.

Jake Bernstein: That's very interesting.

Kip Boyle: And that's why parking brakes were invented actually, was because the drum brakes weren't considered to be reliable enough to leave your car and to expect that it wouldn't roll away. Anyway, I don't want to unpack that.

Jake Bernstein: No, no, but wait one second though. So that actually explains why when you have to do your driver's ed test, you have to rotate your wheels into the curb and use the parking brake, because back in the day, your drum brakes might not... inaudible. I learned something new.

Kip Boyle: And transmissions, and transmissions too because you put it in park. Anyway. But yeah, so even though brakes are reliable, if we don't trust that everyone's following the same rules, maybe I'm not going for a drive because it doesn't matter how reliable my brakes are, I'm worried about the other guy.

Jake Bernstein: Well, you wouldn't, right? Because if you can't trust that people are going to stop at a red light, then you're literally taking your life into your hands anytime you go through a light.

Kip Boyle: Yeah, exactly. Okay, now continuing on, now that we understand digital trust in concrete terms, because hopefully everybody's still with us. We've taken something very ethereal and we've attached real world things to it, so now we know we're talking about rules and tools. Okay, now how do we solve the problem of trusting those algorithms? What do we do? What do we do with this?

Jake Bernstein: So I mean, I think our listeners are going to hope that we're going to provide some kind of concrete list of action items that could help establish or restore our digital trust. And you and I have talked for years about using the personal hygiene metaphor as a way to approach the need for good cyber hygiene. You wash your hands to not get sick with a virus, use virus scanners to not let your computer get sick.

Kip Boyle: Digital inaudible.

Jake Bernstein: Yeah. So if we think of that as the individual level, could we manage digital trust as we might think about or manage a public health problem?

Kip Boyle: I think that's the answer. I really do. And I'm going to give you an example. Anybody who's been alive for a few decades has seen construction sites where the people on those sites are wearing hard hats, gloves, high visibility vests of some kind. You probably see a board posted that shows the number of days the job site has operated free of injury, things like that. There's other stuff you don't see very easily, like steel-toed boots. When workers go into high places, they typically need to have safety harnesses. Anyway, there's a lot going on today when you go to a construction site, but it hasn't always been like that. In fact, this is a rare and recent phenomenon. In the United States in the year 1900, so about 125 years ago, 300 out of every 100,000 workers died in construction-related jobs. And now, in this day and age, we've gone from 300 to nine.

Jake Bernstein: Wait, you mean every year?

Kip Boyle: Yeah, every year.

Jake Bernstein: Wow, okay.

Kip Boyle: So every year, 300 out of every 100,000 workers would not go home at the end of their shift. Per 100,000 workers, that's about one a day. There'd be a family that would no longer have-

Jake Bernstein: That's crazy.

Kip Boyle: ... that family member.

Jake Bernstein: And what is the rate now?

Kip Boyle: Nine in 100,000.

Jake Bernstein: Wow. That's a more than thirtyfold decrease.

Kip Boyle: Yeah, it is. And how did we do that? Well, I mean, I am not the biggest fan of the world of regulation, I think it's not the first place we should go when we have problems, but it was the Occupational Safety and Health Act, which created OSHA in 1971, that I believe, based on my research, is the biggest reason why we've been able to make that kind of progress on decreasing worker deaths on construction sites.

And it's no longer the fault of an individual worker, like, "Well, he just wasn't careful enough," or this idea that they should somehow know what safety equipment they need to bring to work. We've eliminated that now. And the whole idea that, "Hey, you can't tell me what I should wear when I go to work. Stop being such a inaudible." Or, "You're not my parent." I mean, it used to be a big deal to be told what to wear, and you can go on YouTube and see videos of ironworkers and steelworkers walking around hundreds of feet in the air with almost no safety gear of any kind, and that was kind of a macho thing to do. And yet, somehow we have turned the corner: nowadays we have actually shifted the culture, and people on job sites understand the importance of it. In fact, if a job site isn't safe enough based on various measures, the construction site can be shut down.

Jake Bernstein: Yeah, absolutely. So, okay, unlike in construction safety, I think we're going to have to see individuals and organizations demand, or maybe this is like construction safety, people are going to demand stronger security and privacy measures. It's a trend we already see. Trust is going to have to be built with artificial intelligence systems and digital automation; these are all recent technologies that we have a hard time dealing with. A great example: the strikes by the Hollywood writers who expressed concern about AI replacing them have shown how people might be able to preserve their relational and mechanical trust in a technology. The last things we'll need are more laws and regulations, such as the GDPR in the European Union and the CCPA in California, and perhaps even a national set of privacy laws for the US or across the world. And it's funny that our upcoming episode is going to tell you all about the privacy laws that the states are passing. So there's a long list of things we need to preserve digital trust. And it's not going to be easy, is it Kip?

Kip Boyle: It really isn't, because this is an adaptive issue, which is to say that there's no body of experts that we can go to and say, "Please, please, please tell us how to trust all this digital technology." It's the same as when nuclear weapons became a thing: nobody really knew. There was no place that you could go to ask, "What are the rules for nuclear weapons?" We had to figure them out for ourselves. We had to talk with each other. Enemies had to talk with enemies to figure out, how do we prevent a situation from arising where we kill each other when that's not what we meant to do? And it was awful, and it took decades. And I think you could even make the case that it still hasn't been figured out.

And digital trust is going to be a similar thing because we don't know what the answer is. We have to work with everybody together to figure out what the rules are for these digital tools, and figure it out in a way that we can all live with. And facial recognition, I think, is going down in flames on a community scale because we can't seem to come up with enough mechanical trust to even get to the point where we can come up with rules about how it should be used. And in the United States, I think this is all going to be very difficult because we have a history of government distrust by the private sector, and we're going to have to figure that out in the same way that we're having to figure out how to do cybersecurity in a modern age when the government doesn't own the digital superhighways.

Jake Bernstein: Yeah. Yeah. I mean, we could probably go for another 45 minutes on this topic, but I think that instead we probably should wrap up the episode.

Kip Boyle: I appreciate that segue because this does wrap up this episode of the Cyber Risk Management Podcast. What did we do today? We did our best to turn an ethereal idea of digital trust into something more concrete that you can actually do something with. And why did we do that? Well, this is a huge, important issue. It's only going to get bigger, it's only going to stay difficult for the foreseeable future. So we hope that this helps you, and we'll see you next time.

Jake Bernstein: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cybersecurity hurdle that's keeping you from growing your business profitably, then please visit us at CR-Map.com. Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is an information security expert with 20 years of experience and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.