
EP 53: Post data breach requirements for law firms


About this episode

May 12, 2020

Kip Boyle, CEO of Cyber Risk Opportunities, and Jake Bernstein, JD and CyberSecurity Practice Lead at Focal Law Group, discuss ABA Formal Opinion 483, which sets out requirements for law firms who suffer breaches of client data.

Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help you thrive as a cyber risk manager. On today's episode, your virtual chief information security officer is Kip Boyle, and your virtual cybersecurity counsel is Jake Bernstein. Visit them at cyberriskopportunities.com and focallaw.com.

Kip Boyle: So, Jake, what are we going to talk about today?

Jake Bernstein: Today, Kip, we're going to revisit the topic of law firms and cybersecurity by taking a look at another formal opinion from the American Bar Association. And that is formal opinion 483 from October 17th, 2018.

Kip Boyle: Okay. So it's been a while since we talked about an ABA formal opinion. So I guess maybe things have changed a little bit?

Jake Bernstein: Not exactly. The previous formal opinion that we discussed was from way back in May of 2017. And it really focused on reasonable cybersecurity as an ethical requirement for lawyers.

Kip Boyle: That was a good one.

Jake Bernstein: It was a good one. And this new one here, 483, it actually discusses what a law firm's obligations are after experiencing a data breach.

Kip Boyle: What? Law firms have data breaches? No way.

Jake Bernstein: It's hard to believe, but yes. As we've discussed, law firms make excellent targets for hackers because of the data that they hold.

Kip Boyle: It's a treasure trove.

Jake Bernstein: It is a treasure trove. It's one of those high value targets.

Kip Boyle: And it turns out that law firms and managed service providers have that in common.

Jake Bernstein: That's true. That's very true. And in fact, though most of what we're going to talk about today really stems from lawyer ethical rules, I would say that it's actually fairly broadly applicable to any kind of trusted information holder, whether that's a law firm, an MSP, an accounting firm, or even an investment company.

Kip Boyle: Right.

Jake Bernstein: So though this is specific in a lot of ways to lawyers, I think it has much broader applicability.

Kip Boyle: Yeah. And that's what I love about being able to bring this into the current episode: most of the people in our audience are not lawyers. They don't work at law firms, but I think the concept here is broadly applicable.

Jake Bernstein: It is. And I think it's also fair to say that most of our listeners' companies have lawyers somewhere. So it's always kind of interesting, I think, to understand what the lawyers have to do when they experience a data breach.

Kip Boyle: Right. Okay. So, this is, I guess, an incident response plan for lawyers. That's what this formal opinion is opining?

Jake Bernstein: In a way, yes. But I would say that it's broader than that. And I think what this is really about, it's not super technical, rather the ABA's opinion goes into detail about first, the definition of data breach for law firms, which itself is actually kind of interesting. And then yes, it does go through a kind of a mini incident response from a high level, I would say.

Kip Boyle: Okay, well, at first, I'm just like, wait a minute, a data breach is a data breach. Why does the legal industry need its own definition, but that's all right. I'm sure we'll get to that. Let's dive in.

Jake Bernstein: We will. So the formal opinion, actually, one of the first things it does is define a data breach. And it says that a data breach is a data event where material client confidential information is misappropriated, destroyed, or otherwise compromised, or where a lawyer's ability to perform the legal services for which the lawyer has been hired is significantly impaired by the episode. So first of all, go ahead and react to my hilariously lawyerly version of data breach.

Kip Boyle: Well, okay. So immediately, from my practitioner's perspective, I think, "Oh, those poor people and their payroll information." Because this definition doesn't say anything about payroll records or firm proprietary information. It's entirely focused on the client. Am I right?

Jake Bernstein: So you are. And the reason is that this formal opinion is focused on the lawyer's ethical obligations to their clients, post data breach. So the very first thing we have to do is remember that everything else that we've ever talked about regarding a data breach still applies to law firms. They're still businesses. They still have the same challenges that everyone else does. The difference, though, and what we care about for purposes of this episode, is the significance of those ethical duties that lawyers have, which frankly are not widespread in other industries.

Kip Boyle: And that lawyers are fiduciaries, right? So they have special [crosstalk 00:04:40].

Jake Bernstein: Exactly. And that's where this comes from. So these are special rules that apply to lawyers. And that data breach definition is specific to a data breach that triggers a lawyer's ethical duties. Now, one thing that is really interesting, and we're going to illustrate this with some hypotheticals and examples, which of course, lawyers love. So let's go ahead and do that, and then I think we can talk about what makes this interesting.

So obviously, there could be many different cyber events that occur in lawyers' offices. And to illustrate that, let's say first that something happens to the computer system and the network, and there is no actual compromise of material client confidential information.

Kip Boyle: I think you better acronize... I can't even say that. Make an acronym, Jake.

Jake Bernstein: I think we'll call that MCCI.

Kip Boyle: Okay.

Jake Bernstein: And now I'm thinking we should call it Mickey, but I think we won't. So, okay, so-

Kip Boyle: No Mickeys allowed.

Jake Bernstein: Yeah, if there's no actual compromise of MCCI, then there's not a data breach, right? That's a simple one. Here's another simple one. Exfiltration or theft of MCCI, guess what? That's a data breach. Now here's where it gets fascinating to me. Ransomware that does not access MCCI at all, but blocks a law firm's ability to use information, guess what? That is considered a data breach under this formal opinion.

Kip Boyle: Okay, I think I know why that makes sense. Let me tell you why, though I'm not sure the drafters of this formal opinion were thinking this way. We know, based on the most recent news reports and forensics, that a purveyor of the most recent forms of ransomware will not only encrypt your files, but they'll actually take a copy of everything. And even though you, as the victim, have no direct evidence that that's been done, we know it's happening because the ransomware authors are taunting their victims on social media and releasing files as a way to get them to pay the ransom, even if they don't really need it from a technical perspective. Now, that's what I think when I read this. Do you believe that's what the writers of the opinion were thinking? Or what were they thinking?

Jake Bernstein: Actually, no, I think that what you just said would actually be an exfiltration or theft of MCCI plus a ransomware attack. And that's a clear data breach. I think what they're saying is actually a little bit different, and I think, in some ways, more interesting. They're saying that... Let's assume that your client confidential information is just ridiculously well-protected, but the file that controls the logins for your staff and lawyers is not. And that is what gets encrypted. What that ransomware did is, it didn't access the MCCI, but it blocked the ability to use that information.

Here's another way of thinking about it. Another example, rather. The destruction of IT infrastructure, even if there's absolutely no material client confidential information involved, is still going to be a breach under this opinion, if the lawyer cannot perform legal work.

Kip Boyle: Okay. Now I understand their criteria. It's about the fact that, whether the data's been stolen or not, the fact that it's not available for an attorney to serve a client, that's the issue.

Jake Bernstein: Exactly. So if we go back to our favorite acronym, CIA, obviously everyone understands that confidentiality and integrity are very important. But I think what this formal opinion is getting at is saying, look, if the availability of data has been negatively affected by some kind of data event, even though we don't historically call that a "breach," they're saying that is a breach, for purposes of lawyer obligations. And I found that really interesting because what it says to me is that lawyers have an obligation, not only to protect that information, the confidential client information specifically, but they really have an obligation to protect their ability to do legal work.

Kip Boyle: Right. Okay. Okay. Now this is really cool, and this is maybe a little aside here, but I can't help but mention this. So when I worked at Stanford Research Institute, one of my mentors was Donn Parker. And in 1998, which is about when I showed up there, he had just published a book. And one of the things that he was saying in his book, he was a very farsighted man, is he said, look, confidentiality, integrity, availability, yes, of course. But there are actually three more attributes that we need to be paying attention to. And in our modern day, I think this is where we really start to need these three additional attributes. And he said authenticity of data, possession or control of data, and utility of data are three more attributes that we really need to be paying attention to. And I think that we're starting to put our fingers on these new attributes.

So utility, for example. When the data gets encrypted, you still have it, right? But you can't use it because it's encrypted. So the data lacks utility. It's not that it's unavailable; it is available because it's right there. It's not like you can't access the data. You can access the data. It's just that it has no usefulness. And I just can't help but mention this right now because it's taken over 20 years for Donn's work to start to emerge in an extremely practical way for practitioners. So I don't know. I just thought this was kind of cool.
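
To make the "utility" point concrete, here is a minimal sketch (not part of the episode) using the third-party cryptography package's Fernet cipher. The data and variable names are hypothetical; the point is simply that encrypted data can remain fully in your possession and "available" while having lost all utility.

```python
# A minimal sketch, assuming the third-party "cryptography" package is installed
# (pip install cryptography). Illustrates Donn Parker's "utility" attribute:
# after ransomware-style encryption you still possess the bytes, but without
# the key they are useless to the firm.
from cryptography.fernet import Fernet

client_file = b"Material client confidential information (MCCI)"  # hypothetical data

attacker_key = Fernet.generate_key()                  # key held only by the attacker
ciphertext = Fernet(attacker_key).encrypt(client_file)

print(len(ciphertext) > 0)             # True  -> the bytes are still "available"
print(b"confidential" in ciphertext)   # False -> but the data has lost its utility
```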

Jake Bernstein: That is interesting. And I think the GDPR adds, I guess in this case, a seventh one, which is resilience, and I think the resiliency component of this is interesting. If you think about the original CIA triad, resilience has less to do with an attribute of the data itself. Instead, resilience is an attribute of the entity, the organization as a whole. So I think that's interesting. And I think that's what this is really getting at here. And we have a real-life example.

Do you remember our favorite cyber weapon gone rogue? It was the NotPetya attack. And one of the victims of NotPetya was the international law firm DLA Piper. And they were shut down for five days in the aftermath of that attack because their IT infrastructure was just shot. It didn't work. If my recollection is correct, that happened well in advance of the publication of this new formal opinion.

Kip Boyle: That was a 2017 event for DLA.

Jake Bernstein: It was a 2017 event, and this is a late-2018 publication. But I honestly do think that the authors of this ABA formal opinion were looking at the DLA Piper attack and saying, five days of having your lawyers unable to work on your stuff, that's a real problem. And how unique is it, in some ways, to make the IT infrastructure just as important, on an ethical basis, in a data breach scenario as the data itself? In some ways, that's a pretty big shift.

Kip Boyle: Yeah, no. So as a practitioner, again, I continue to think about the heavy emphasis on the lawyer's ability to serve their client as being a major focal point here. And I think that makes sense. I think that's actually very, very helpful. So how does a law firm know if it's been breached?

Jake Bernstein: That's a really good question, because one of the things that we've talked about many times is that the best hackers are silent. You don't know that they're in your system.

Kip Boyle: That's right.

Jake Bernstein: And so how does a law firm know it's been breached? And here's an interesting question. A law firm could have a couple of stances. They could treat themselves like every other major company that looks and pays attention and has intrusion detection systems. Or they could not. They could just ignore it. And if they don't know they've been breached, then really, their ethical obligations can't be triggered.

Kip Boyle: Well, okay, I'm going to go along with you on this. I'm not going to split hairs.

Jake Bernstein: Well, your reaction is the correct one. Obviously, and the ABA clarifies this just to be absolutely sure, yes, law firms must pay attention, and you have to monitor for breaches. And the reason for that is that if you don't employ... If the lawyers do not employ reasonable efforts to monitor their technology and office resources, then detection could be relegated to luck, to happenstance. And if you cannot detect, then you cannot respond. And if you cannot respond, then you cannot meet your ethical obligations. So it's pretty obvious really, in retrospect, but I think it's a question, particularly a question that lawyers will ask, because a lot of lawyers will look for loopholes in the rules. That is oftentimes their job.

And so, this is one where no, you have to look, you have to pay attention. You have to use monitoring. You have to know whether your employees are following your cybersecurity policies and procedures. And of course, you have all the other regulatory and legal rules that are in place. You have contracts. Now we have the California Consumer Privacy Act. You have-

Kip Boyle: state data breach law.

Jake Bernstein: State data breach law. You have GDPR, you have all these things. So you do have to monitor for breaches.

Kip Boyle: Well, I have to say, the fact that attorneys need to be reminded in a formal opinion like this tells me that attorneys are just like everybody else. Because my non-attorney customers struggle with this as well. And it's one of these things we don't really talk about, but I can feel it. It's sort of like, well, we don't really want to have to tell people that their data has been breached. So if we just don't pay attention to the fact that there's a strange person walking around in the warehouse, hopefully they'll just wander out on their own, and we won't have to ring any alarms and interrupt our work and risk our reputation. It's like, it was just a homeless person that kind of wandered through our warehouse, but that person's gone now. So we'll just pretend that never happened.

Jake Bernstein: So no big deal.

Kip Boyle: Yeah. So no big deal. It's kind of the digital equivalent of that, right?

Jake Bernstein: I have a different metaphor, or perhaps really more of an analogy for you, which is I think it's very fear based. Consider the kind of prototypical, stereotyped, 1950s-era American male, who is stubborn, doesn't want to go to the doctor, maybe is a little scared of finding out just how much arterial plaque he has that's going to eventually lead to a heart attack. And so he doesn't go, and suddenly, boom, he has this massive cardiac event. That is the medical equivalent of what a lot of people's cybersecurity stance has been, which is I don't really want to know. And then suddenly they have a major incident on their hands.

Kip Boyle: Yeah. If I don't know, then it can't hurt me. I think this is a very common, very human thing to do in the face of a lot of potentially difficult subjects, things that would make me have to change who I am or my daily routine. As a human being, I just don't like that.

Jake Bernstein: It is. And with cybersecurity, as we've talked about before on the show, it's just not intuitive to most people. And so I think even though it might seem obvious to us as cybersecurity practitioners and risk managers, I do believe that lawyers, as a group, need to be reminded about this, and they need to understand that their ethical duties require them to monitor.

And speaking of monitoring, it's worth taking a look into some of the other aspects of formal opinion 483. And one of those is, what about law firm employees? And this is really, really interesting because it turns out that there are two model rules, model ethical rules, that frankly, most states have adopted. And what it boils down to, and this is fascinating, is training is not optional. There is actually a duty to supervise and train other lawyers in the firm, non-attorney staff, and even third party vendors. So, we've talked about this numerous times, as a requirement for competent cyber risk management. But what I just said is that lawyers have an ethical duty to monitor their vendors and supervise and train their staff and other lawyers. Isn't that interesting?

Kip Boyle: Well, yeah. And I'm just sitting here right now, and I'm thinking, boy, if I went out and conducted a survey or some kind of a research project to find out how many law firms are doing this now, particularly with third party vendors, I'd be shocked if most of them were doing it already.

Jake Bernstein: I would imagine that a lot are not. I would hope, though... This was published just over a year ago, or more than that, back in 2018. So I'm hoping that this has percolated through the legal industry, but I don't know if it has. I think it hasn't necessarily gotten through. And if we keep going, I think the interesting question is, okay, I have this duty to supervise and train, and I have to monitor things, but Kip, aren't hackers pretty smart?

Kip Boyle: Well, that's what Hollywood tells us. They're also disaffected and loners. No, but they are smart.

Jake Bernstein: And they are smart. And remember, as we often say, we say the word hacker, and people will probably have Mr. Robot in mind. But in fact, we're also talking about trained professionals in the armies of nation states, right?

Kip Boyle: Oh yeah. Yeah. And cyber criminals [crosstalk 00:19:01].

Jake Bernstein: Cyber criminals-

Kip Boyle: Tony Soprano and his gang are digital now.

Jake Bernstein: And they're very well funded. And so, hackers are smart, and they do have resources. And so there might be this fear, if I'm a law firm, particularly if I'm not a very big law firm: how am I going to stop this from happening? Is it almost inevitable that I violate my ethical duties? And the answer actually is no. And this part is, I think, a bit of hopefulness for those feeling overwhelmed. This is what the ABA opinion says: ethical violations will occur primarily through inaction.

Kip Boyle: Not because you did the wrong thing, but because you did nothing.

Jake Bernstein: Not because you necessarily did the wrong thing or because you didn't do enough, but because you did nothing. And as we know, security cannot be made perfect. And once again, the requirement here is for reasonable efforts to avoid data loss, reasonable efforts to detect the cyber intrusion. And so I have made these little... I would call them formulas, except they're more just logic chains. And it goes like this. A lack of reasonable effort, which leads to a breach, leads to an ethical violation. But a reasonable effort, with the understanding that hackers are smart and may nonetheless breach you, that is not an ethical violation. So the really important point is that there's not going to be a special rule for lawyers that says you have to have perfect security. That wouldn't be fair.
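
The logic chain Jake describes can be written out as a tiny sketch (a paraphrase of the episode, not language from the opinion): the ethical violation turns on the absence of reasonable effort, not on the breach itself.

```python
# A hypothetical sketch of the "logic chain" described above: under Formal
# Opinion 483 (as paraphrased in this episode), being breached despite
# reasonable efforts is not itself an ethical violation.
def ethical_violation(reasonable_effort: bool, breach_occurred: bool) -> bool:
    return breach_occurred and not reasonable_effort

assert ethical_violation(reasonable_effort=False, breach_occurred=True) is True
assert ethical_violation(reasonable_effort=True, breach_occurred=True) is False
assert ethical_violation(reasonable_effort=True, breach_occurred=False) is False
```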

Kip Boyle: But you have to have some evidence of reasonable effort, right? And that's one of the things that we do for our customers, is we generate a lot of evidence for them so that they have records that they've been reasonable. And in the case where we share customers, and they've elected to protect those records with attorney-client privilege, it's even more powerful for them because they can release, in a discovery situation, just the records that they want, the ones that will absolutely show their reasonableness.

Jake Bernstein: Yeah. And keep in mind, too, that it's not just the evidence and the records that are critical; I think we've also talked about how the process itself is critical to being reasonable. When we talk about the programs that we have and how we work with clients, the creation of evidence is almost more of a side effect of what is necessary to do reasonable cybersecurity. Obviously, it's important, and it's one of the things that we tell people about. But we do that almost more because it's a strong point in favor of doing the process: hey, look what you get. But we also know that really, if you don't do the process, you can't get to reasonable security.

Kip Boyle: Right. Right. It's the process that really is the underpinning for your being reasonable. Okay. So what's striking to me is that when I think about the Federal Trade Commission and how it defines reasonable cybersecurity, in large part based on the five major functions of the NIST Cybersecurity Framework, I'm absolutely seeing them here in this formal opinion. We've talked about protecting assets. We've talked about detecting assets, or I'm sorry, detecting incidents when they occur, even if they're difficult. So is there an equivalent to responding and recovering here?

Jake Bernstein: There is. So, essentially, the next question is, okay, you've been breached. So what do you do? And this is obviously the respond and the recover functions. And the lawyers here, this is no surprise at this point, have an ethical duty to act reasonably and promptly, to stop the breach and mitigate damage resulting from the breach. And how are they going to do that, Kip? What's the standard way that you do this type of thing?

Kip Boyle: You have an incident response plan, maybe even a crisis management plan, and you have to practice it.

Jake Bernstein: Exactly. That's exactly right. That's what you have to do. And let's say you're a law firm, and you're like, okay, well, that sounds simple. What does that actually mean? Well, this is where it gets much more universal. An incident response process should identify and evaluate any potential network anomaly or intrusion. You have to assess the nature and scope, determine if data or information may have been accessed or compromised, quarantine whatever threat or malware may exist, try to prevent the exfiltration of data from the firm, get rid of the malware, and then restore the integrity of the firm's network, all within that reasonable framework.
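
For readers who prefer a checklist, the steps Jake just listed can be sketched as a simple ordered runbook, loosely mapped to the NIST Cybersecurity Framework functions mentioned earlier. The step wording paraphrases the episode, and the mapping is an illustration, not text from the opinion.

```python
# A hypothetical sketch of the incident response steps described above, loosely
# mapped to NIST CSF functions. Step wording paraphrases the episode.
from dataclasses import dataclass

@dataclass
class Step:
    csf_function: str  # "Detect", "Respond", or "Recover"
    action: str

IR_RUNBOOK = [
    Step("Detect",  "Identify and evaluate any potential network anomaly or intrusion"),
    Step("Respond", "Assess the nature and scope of the incident"),
    Step("Respond", "Determine whether data or information may have been accessed or compromised"),
    Step("Respond", "Quarantine any threat or malware that may exist"),
    Step("Respond", "Attempt to prevent exfiltration of data from the firm"),
    Step("Respond", "Eradicate the malware"),
    Step("Recover", "Restore the integrity of the firm's network"),
]

if __name__ == "__main__":
    for number, step in enumerate(IR_RUNBOOK, start=1):
        print(f"{number}. [{step.csf_function}] {step.action}")
```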

Kip Boyle: I've got to tell you, just listening to you name off all of those actions, all I can think of is just how incredibly time-consuming all that stuff is. And I think that's one of the reasons why managers who are responsible for cyber risk just sort of groan inwardly whenever somebody says, "I think my computer's got a virus," or "I think somebody's in the network," because you're talking about the diversion of substantial resources to do this. And I just don't think anybody really relishes that.

Jake Bernstein: They don't. And one thing I wonder, and this is kind of off the cuff, but hey, isn't that what podcasting is for? We have fire departments. And what is a fire department's job? They literally sit around in a building and wait for a fire. And then as soon as that fire happens, boom, that alarm goes off. They go. That is their job. Their job is to sit and wait for fires, so they can put them out. Now, what you just said is what triggered my thinking here, which is managers in charge of cyber risk groan inwardly because of, as you said, a diversion of resources and time. Now, what if... and maybe we're a ways away from this, but would it potentially make sense to have a fire department for cyber events at some point?

Kip Boyle: Right. So now let's think about that. So I think you put your finger on something really interesting here, which is the people who are dealing with incidents, when the incidents are not happening, are otherwise engaged in profit-making activities or support activities for the people who are out there generating revenue. And today, the only organizations that actually have their own fire departments are gigantic organizations like Boeing, or just big, big municipalities. But that's the thing. Fire departments are run as public utilities. And so that suggests to me that we need a public utility that specializes in incident response.

Jake Bernstein: Isn't that fascinating? Because a cyber attack is like fire, which of course makes me smile because of "Fire Doesn't Innovate." It is a public threat. Yes, a fire might burn down an individual building, but we long ago decided that fire is, nonetheless, a public threat because, hey, look, it can spread. Well, guess what? So do cyber threats. So though we have suddenly gone far afield of our original topic, and this may have to be bounced to a future podcast discussion, maybe there does need to be a cyber equivalent of a fire department.

Kip Boyle: Yeah. And actually, I was being a little rhetorical, but New Orleans is a recent example of a city that declared a state of emergency when they had the ransomware attack, and that allowed them to activate the National Guard. And it turns out that the National Guard has experts in cyber incident response and recovery, and they absolutely were able to get those experts to assist the city. And so I think that's an early example of incident response as a public utility. Now, it's not a civilian organization, it's a military organization, but I think that's the direction we're headed in.

Jake Bernstein: It could be. Because if you think about a law firm, one of the reasons I think that a lot of people aren't as proactive as they could be is the fear of costs that could substantially impair the company. And obviously, fire insurance and cyber insurance are now a thing. But the one thing that is lacking in that continuum, in that metaphor between fire and cybersecurity, is that the fire department doesn't charge you to come put the fire out. And the big problem, of course, is that forensic work on a cyber attack is incredibly expensive and time consuming-

Kip Boyle: Very. Which is why you should have insurance.

Jake Bernstein: Which is why you should have insurance. Exactly. But I do think that's a really interesting idea.
So, I think we could go further along the discussion here of what exactly it looks like for law firms, but I do want to quickly wrap it up. I think one of the questions here that lawyers have is, do I tell my client that I got breached?

Kip Boyle: Ouch.

Jake Bernstein: Yeah. Ouch, right? And this one actually is a little specific to law firms, because if you think about your typical corporation, unless it has gone public, most companies aren't particularly eager to advertise the fact that they got attacked or breached, right?

Kip Boyle: No. Nobody is.

Jake Bernstein: It's not really a top of the list type of thing. So with law firms, it turns out that, generally speaking, you probably do have to tell the client that something happened. Obviously, if the material client confidential information was actually accessed, disclosed, or lost, or there's some reasonable suspicion that it was, then yeah, you're going to have to notify your client. What about the confidential information of former clients? The information is old. It's unclear. The best practice, of course, is to agree on some kind of records return or destruction policy. Because I think if you have the records, and they get breached, you probably need to go back and tell that former client.

And then what do you need to tell the client? Well, you have to provide enough information so the client has enough of their individual agency to decide what to do next. And I think one of the best ways to handle this is to discuss some of these things beforehand. Put some contingency planning into the law firm engagement letters, and talk about it with clients ahead of time.

And so, that is probably the best takeaway from this opinion, as well as to remember that we all use this technology. It helps us do our job, but you have to monitor it. Managing lawyers, in particular, have to supervise other lawyers and their assistants and the vendors to make sure that these ethical duties are complied with, across the board. And when anything bad happens, you've got to keep the client informed.

Kip Boyle: Right. So much of this really, I think, resonates with the non-attorney world, in terms of, again, the overall five functions of the NIST Cybersecurity Framework. And I think it's also just basic expectation-setting with your clients and your customers, and making it clear to them: we try to prevent bad cyber events from happening, but they could still happen, and this is what we're doing to deal with that. It's about managing expectations. So many bad things happen when you don't manage expectations.

Jake Bernstein: Yep. I agree.

Kip Boyle: All right. Well that wraps up this episode of the Cyber Risk Management Podcast. Today, we talked about the ABA formal opinion 483 and what law firms must do in the case of a data breach. We'll see you next time.

Jake Bernstein: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. Remember that cyber risk management is a team sport, so include your senior decision makers, legal department, HR, and IT for full effectiveness. So if you want to manage cyber as the dynamic business risk it has become, we can help. Find out more by visiting us at cyberriskopportunities.com and focallaw.com. Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cyber-security consultant at Stanford Research Institute (SRI).

YOUR CO-HOST:

Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.