
EP 26: Computer Fraud and Abuse Act (Revisited)
About this episode
April 30, 2019
Kip Boyle, CEO of Cyber Risk Opportunities, talks with Jake Bernstein, JD and CyberSecurity Practice Lead at Newman DuWors LLP, about how the 35-year-old Computer Fraud and Abuse Act (CFAA) is a useful tool for today’s cyber risk managers.
Episode Transcript
Kip Boyle: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives become better cyber risk managers. We are your hosts; I'm Kip Boyle, CEO of Cyber Risk Opportunities.
Jake Bernstein: And I'm Jake Bernstein, cybersecurity counsel at the law firm of Newman Du Wors.
Kip Boyle: And this is the show where we help you become a better cyber risk manager.
Jake Bernstein: The show is sponsored by Cyber Risk Opportunities and Newman Du Wors LLP. If you have questions about your cyber security related legal responsibilities-
Kip Boyle: And if you want to manage your cyber risks just as thoughtfully as you manage risks in other areas of your business, such as sales, accounts receivable, and order fulfillment, then you should become a member of our cyber risk management program, which you can do for a fraction of the cost of hiring a single cybersecurity expert. You can find out more by visiting us at cyberriskopportunities.com and Newmanlaw.com. So Jake, what are we going to talk about today?
Jake Bernstein: Today, we're going to talk about the last several years of the Computer Fraud and Abuse Act, the CFAA. It remains the primary legal tool for prosecuting hackers, both criminally and civilly.
Kip Boyle: I feel like a lawyer only episode coming on.
Jake Bernstein: Not at all. No, no, no, no. The CFAA affects everyone involved in technology and the security industry. It's an important tool in protecting a company's assets from "inside jobs" and it actually sees the most use in an employment context. I think that everyone involved in cyber risk management should understand the CFAA and what it can and cannot do for them.
Kip Boyle: Okay. So the Computer Fraud and Abuse Act as I recollect is pretty old. Is it really relevant? I don't hear people talk about it too much.
Jake Bernstein: Well, it is pretty old. It was passed in 1984, which is not a coincidence. In fact, did you know that it's one year after the premiere of WarGames? And yes, I'm talking about the 1983 classic Matthew Broderick movie. That movie was actually discussed in the congressional record for the CFAA.
Kip Boyle: There's an apocryphal story about President Reagan watching that movie and being very disturbed by it. And then going on to call up members of Congress to have them do something about it. I wonder how true that is. It sounds like it might have been.
Jake Bernstein: It could be. I can tell you what is true, which is that there was a fear that computer crime would go unpunished, and so that's where the CFAA came from.
Kip Boyle: Well, fun little fact. We've talked in the past about a notorious wanted cyber criminal named Bogachev, who has a $3 million reward on his head from the FBI, and he was born in 1983.
Jake Bernstein: That is funny. We have 35 some odd years of CFAA to talk about. But it's been amended so many times that it's really best, I think to focus on just a couple of aspects today on this episode.
Kip Boyle: Great.
Jake Bernstein: And I think that one of the interesting questions about the CFAA has to do with immunity for the hardware and software industries. There are some who look at the CFAA and say that it is bad policy because it enables coders and companies to put out buggy, vulnerable software without being liable under the CFAA.
And I think we'll talk about that and I'll give you my view on that issue. The second question is much more relevant, I think, for today's cyber risk managers. And the question is, what does the CFAA do today for our audience, for cyber risk managers, for in-house counsel, vice presidents, et cetera?
Kip Boyle: That's great. That's our focus here on the podcast is trying to help cyber risk managers thrive in this day and age where they're on their own for the most part. And so the fact that there's a law that could help them I think is fantastic.
So this is great focus and I'm really curious about both of these issues, but let's start with the immunity issue because I do hear people talking about that, although they don't reference CFAA so much when they do, but they just say, isn't it time to stop giving software developers a free pass on releasing what amounts to a defective product?
Jake Bernstein: Here's what the CFAA says. It says, "No action may be brought under this subsection for the negligent design or manufacture of computer hardware, computer software, or firmware." First of all, that's very specific. It's not a broad grant of immunity across all of law.
Kip Boyle: It sure sounds like it to a layperson.
Jake Bernstein: Well, so the key phrase is, no action may be brought under this subsection. The CFAA is just one... It is actually primarily a criminal law. It has a civil action component, but it's not a product liability statute. And what we're looking for is generally the product liability and consumer protection type law. And there's really no reason to expect that the CFAA is meant to take over for product liability.
Kip Boyle: Well, that's interesting. There certainly seems to be a lot of confusion around this topic because, again, the popular conversation that I hear is people are interpreting it that way. That's like, "Hey, this thing's given people the chance to duck their responsibilities for putting out a quality product."
Jake Bernstein: Let's talk about that real fast because what would it mean to allow... First of all, you would never want companies or coders to be criminally liable. That's just not what the criminal law is for. And in fact, the section we just talked about is actually specific to the civil action component, so it's still civil liability.
And the civil liability under the Computer Fraud and Abuse Act is pretty far-reaching. From a policy standpoint, if you hold coders and companies liable, first of all, what's the standard of care? What even is negligent software coding?
I believe I've heard it said multiple times that it has been mathematically proven that writing perfect code is impossible. And I'm not even sure that perfect code has a meaning because there's so many different ways to express yourself through those-
Kip Boyle: Okay, just to play devil's advocate here, this is fun. When I first graduated from college, I learned how to do software development, and I went to a couple of software development conferences. This was when I was in the Air Force, so these were DoD-oriented forums.
And I remember there was a big conversation about writing good code and how we could write better code. And one of the models of good code that was held up as an example was the spaceflight software developed by NASA to support manned spaceflight.
And so it was interesting that when you've got a human safety issue, we're going to put astronauts on the top of a giant Roman candle and light it off, that software better be dang good.
Jake Bernstein: I think that when software was all handwritten by small groups, it was easier to have "better code" and I think as computing became ubiquitous, and Microsoft fulfilled that dream of a desktop on every desk, the calculus changed. The vast majority of code written does not affect human life, in the least.
In fact, at best, it's a minor inconvenience; at worst, it can crash a computer or cause data loss. There are things that can happen, but it's generally not about human life. And I think that-
Kip Boyle: I think in the big picture, you're right. The vast majority of code is not a human safety issue, although I think that's starting to change. And when you talk about medical devices, power grids and infrastructure. If infrastructure fails and power goes out, or a medical device gets so-called hacked and a patient gets seriously hurt or possibly even dies, I think that would certainly shift the calculus. Don't you think?
Jake Bernstein: It will. And I think what will happen is that, the way the law develops in our common law system, cases have to be brought with the right facts in order for the law to evolve. And I don't know that anyone has died as a result of a hacked medical device. I know that there's been plenty of discussion about how it's theoretically possible, but I can't think of an incident that I know of.
Kip Boyle: No, I don't think it's happened yet. I'm not aware of a case either. But I have spent a good amount of time following developments in medical devices, and there's a lot of attention being put on the hackability of medical devices like implantable defibrillators is one.
Gosh, I'm going to get all these facts wrong, but I remember a couple of years ago that there was somebody who found vulnerabilities in a medical device and turned it into a money making opportunity by shorting the stock of a company and then going public with it. We're not there yet, but it seems like we're headed in that direction.
Jake Bernstein: There's really no reason to think that this type of liability can't be handled under all the current case law and statutes for product liability. And the basic idea is that if you create a negligent product and put it out and someone is hurt by it, then there are plenty of causes of action. You don't need to go to the CFAA for that.
Kip Boyle: Got it. We're talking about two points today about the CFAA. The first point was, is there really immunity? Which is what we're talking about now. Could we confidently conclude right now that CFAA does not grant immunity?
Jake Bernstein: Let's be precise. The CFAA itself is not a mechanism to sue for buggy software. That does not, however, mean the whole industry is immune from legal action of any kind. You're not going to be sued under the CFAA, but product liability and consumer protection laws still apply.
If you look at some of the cases that the FTC has brought about buggy software and bad code, and vulnerabilities, a lot of the time it's about what you say about your product. If you say this product is the most secure router known to mankind, and it turns out that there is a super easy vulnerability that's been exploited, then you're going to get in trouble for that.
Does it really make a difference ultimately if you're getting in trouble for what you said about it or for whether the code was "buggy" or not? I would say that's a very open question. It's not an open question that you can't sue under the CFAA; that's clear. But otherwise, I wouldn't say that there is "immunity" for the hardware and software industries.
Kip Boyle: Okay. So when you hear people talking about how the hardware and software industries are immune from legal consequences when they release buggy products, that's not true?
Jake Bernstein: Correct.
Kip Boyle: Okay, great. That clears that up. Now, the second thing that we wanted to talk about was, how does the CFAA help cyber risk managers today?
Jake Bernstein: The CFAA is actually a very broad tool. And I'd say there are two major areas where it's going to help you out as a cyber risk manager.
Kip Boyle: Okay, and that's whether I'm a CEO or a chief information security officer. Does what you're about to say apply to really anybody in the C-suite who's making cyber risk decisions?
Jake Bernstein: It does, yeah. The first person it's going to apply to is whoever's in charge of HR. And the reason it's going to do that is that the CFAA at a very high level breaks down into two types of unauthorized access. One is completely unauthorized access. This is what is used to prosecute hackers.
They are an outsider. They come in, they never had authorization. They never will have authorization, but they come into the system anyway. That is not what I'm talking about with respect to HR. The other side of the coin is that someone exceeded their authorized access, and you can immediately see how litigious this can get. First you have to figure out what was their authorized access? And then did what they do-
Kip Boyle: Exceed it.
Jake Bernstein: ... exceed it? And those are questions that are pretty typical for litigation.
Kip Boyle: Okay. All right. So if I'm a cyber risk manager and I'm subscribing to the principle of least privilege. So I'm making sure that people only have enough access and authority on the systems to do their jobs. And if I ended up having to face a judge and a jury about somebody that I alleged exceeded their authority, what artifacts would you be looking at to figure that out?
Jake Bernstein: So I would look for the job function, a clear statement of what I'm supposed to do, what I'm not supposed to do.
Kip Boyle: As an employee?
Jake Bernstein: As an employee. I would look at the employment handbook and acceptable use policies. I would want to know, what does the system prevent me from doing or not doing? How good is the access control on the actual system? What guidance am I given on a day-to-day basis? All of these things are going to come into play here. The most important things are probably the job description and my access.
Kip Boyle: And would you also be looking at what kind of access other people with that person's job title typically have?
Jake Bernstein: Yes, in the absence of other evidence, I would go there. But the point for cyber risk management, and for the C-suite and HR managers, is that you don't want to leave it to inference. Looking at what other people in similar situations have as their access is a fallback.
There's really no reason to need it. If you are truly subscribing to the principle of least privilege, there should be no difficulty making it clear what someone can and can't do. And why would you care about this? Well, IP theft is one of the more common issues. People will take distribution lists, customer lists, straight-up code.
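The audit Jake describes, comparing the access an employee actually holds against what their documented role allows, can be sketched in a few lines. This is a minimal illustration; the role names and permission strings below are hypothetical:

```python
# Minimal least-privilege audit sketch: compare what an employee can
# actually do against what their documented role authorizes. Role names
# and permission strings are hypothetical examples.

# What each job role is documented as allowing (e.g., from the job
# description and acceptable use policy).
ROLE_PERMISSIONS = {
    "sales_rep": {"crm.read", "crm.write"},
    "sysadmin": {"crm.read", "crm.write", "server.admin", "user.manage"},
}

def excess_access(role, granted):
    """Return permissions granted beyond what the role authorizes."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    return granted - allowed

# A sales rep who was accidentally provisioned with admin rights:
extra = excess_access("sales_rep", {"crm.read", "crm.write", "server.admin"})
print(sorted(extra))  # ['server.admin'] -- flag this grant for review
```

Anything the function returns is exactly the evidentiary gap discussed later in the episode: access the employee holds that no written policy says they should have.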
Kip Boyle: Waymo and Uber recently got into it because I think Uber recruited somebody from Waymo, which is Google's self-driving car company, and that person allegedly brought hundreds of gigabytes of intellectual property with them.
Jake Bernstein: Yep, that's exactly what happened. And the CFAA could be used to prosecute that person, possibly criminally, but probably it's going to be a civil action. And then of course it doesn't really make a huge difference to the companies because they're still going to have to fight it out under a whole host of other legal theories. But for the employee, it is a crucial component of their understanding of their employment and what they can and cannot do.
Kip Boyle: Okay. So you'd expect to see those rules of what their limits would be in a job description. Would that be a good place to put it?
Jake Bernstein: Yeah, that would be a very important part.
Kip Boyle: And then thinking as a chief information security officer, the management team would want to take what's in the job description and create a policy, a standard operating procedure, or something like that. Some very clear instructions to give to the systems administrators, because that's where the actual implementation of this occurs. So it's not enough to write it up in a job description; there has to actually be a mechanism for enforcing it and enabling it.
Jake Bernstein: Correct. And that mechanism is the CFAA, that's kind of what we're going for here. That's what tells you this is illegal and the mechanism... Unless I misunderstood your question.
Kip Boyle: To be clear, it's one thing to state, these are the limits of your authority on our system, but then if a person gets provisioned an account on that system, and that account comes with administrator authority, which exceeds what the company has authorized them to have based on their job description, then that's a fail. That's a management fail.
Jake Bernstein: That's a total management fail, and the problem is that you have an evidentiary problem. On the one hand, you have the job description and written policy that say, this is your access. And on the other side, you have the employee's actual access, which is way too high.
Now, this actually does happen, and when it does, the usual result is a debate over which one controls. And if the company can show that the access given on a technical level was a mistake, but that the employee should have known anyway, then the employee can actually still be liable under the CFAA.
Kip Boyle: Oh, that's interesting. What I was trying to drill into here is, we're talking about how the CFAA can be a useful tool for cyber risk managers, and I wanted to be totally clear on what cyber risk managers are responsible for in order to have the CFAA on their side.
And you just made an important clarification: it's very, very important to have clear statements about what's expected, and to implement those expectations. But where they're not implemented quite right, that does not negate the CFAA's utility.
Jake Bernstein: No, it doesn't. And like I said, it's a question of evidence there. Let's just say that you don't put anything in writing about expectations and authorization and different types of access, and someone in IT makes a mistake and gives someone full admin access. In the absence of strong policy and other written guidance, the employee is going to say, "Well, look, I only used the access you gave me."
If you gave me too much access more than you wanted to, then you should have told me that, and that should have been made clear. And I think that's a good example of why you need to double check not just your technical controls, but also all the administrative ones.
Kip Boyle: Okay, all right. Cool. This is the second point that we wanted to talk about today. How does the CFAA support the work of cyber risk managers? And so we've certainly touched on a big part of that. Was there any other part that you wanted to get at?
Jake Bernstein: The other part that it's important to understand is if you are the victim of an unauthorized intrusion, you can use the CFAA offensively to recover damages. The problem is that you have to know who to sue, and that is usually not particularly easy.
Kip Boyle: Now you're talking about in the case of somebody from the outside coming in and exceeding their authority.
Jake Bernstein: Or if you've been hacked, then you can sue the attacker if you can find them.
Kip Boyle: And in the case of an insider, which is what we've been talking about, it should be pretty easy to find them, assuming that they've been using their issued account. Because one thing I encountered when I was a practicing CISO was that sometimes people would use other people's accounts to do bad things, and when you tried to hold somebody accountable, they would claim, "Oh, I wasn't using the account."
So even though it's an insider, it can still be a little murky getting that identity to actually take those log entries and say, "Yes, I know it was Joe. And here's how I know it was Joe." Even though Joe swears it wasn't him because his credentials were compromised. But in the case of an outside attacker, it becomes even harder.
Jake Bernstein: It does. Absolutely. It becomes nearly impossible. Like I said, usually the outsider has to make a mistake to be found. But it does happen, and it is something that should be considered when you are looking at how to respond to a cyber attack.
Kip Boyle: Let's take just a moment to talk about one aspect of this that I think is important. I was recently interviewed for a news article, and the reporter was trying to focus on what companies have to do downstream from a cyber attack to protect their evidence.
And one of the points that I made was, there's a lot of conflict that goes on in the mind of a systems administrator when something bad happens, like a cyber attack. Because the basic expectation for a systems administrator is to keep services running.
And so when you are cyber attacked and you all of a sudden realize, "Oh, maybe the best thing to do to stop the cyber attack is to smash the glass on the network cables and start pulling them." Oh, but wait a minute. If I do that and stop the attack, I'm going to disrupt services and that's going to cause a hit to revenue.
And so without clear guidance in advance to systems administrators about how management wants to handle the potential to prosecute somebody under the Computer Fraud and Abuse Act, there's a good chance that the evidence is going to get destroyed before you ever get to it.
Jake Bernstein: And that is definitely an issue. Destruction of evidence makes it impossible for you to win your case, so you have to build in solid forensic mechanisms to your response and recovery efforts so that you maintain the ability to make those types of decisions going forward.
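One concrete way to build the forensic mechanisms Jake mentions into a response runbook is to hash each log file before copying it into a separate evidence store, so its integrity can be demonstrated later. This is a minimal sketch under assumed conditions; the paths and manifest format are hypothetical, and a real program would add access controls and off-host storage:

```python
# Minimal evidence-preservation sketch: record a SHA-256 hash of a log
# file, copy it to an evidence directory, and append the hash to a
# simple chain-of-custody manifest. Paths are hypothetical.
import hashlib
import shutil
from pathlib import Path

def preserve(log_path, evidence_dir):
    """Hash the file, copy it to the evidence store, return the digest."""
    log_path = Path(log_path)
    evidence_dir = Path(evidence_dir)
    digest = hashlib.sha256(log_path.read_bytes()).hexdigest()
    evidence_dir.mkdir(parents=True, exist_ok=True)
    # copy2 preserves file metadata (timestamps) along with contents.
    shutil.copy2(log_path, evidence_dir / log_path.name)
    with open(evidence_dir / "manifest.txt", "a") as manifest:
        manifest.write(f"{digest}  {log_path.name}\n")
    return digest
```

Because the hash is captured at collection time, anyone can later rehash the preserved copy and confirm it was not altered, which is exactly the ability to "make those types of decisions going forward" that Jake describes.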
Kip Boyle: And that's really something you have to do with forethought. The idea that you're going to figure it out in the heat of the moment is not tenable, in my experience. So listeners, you might want to start thinking about it if you haven't already. Under what circumstances would you want to use the Computer Fraud and Abuse Act to hold somebody accountable for a cyber attack, whether by an insider or an outsider? And what guidance have you given your systems administrators so that they know when it's okay to preserve evidence even if that causes a service disruption to customers? That's a tricky subject, but-
Jake Bernstein: It is.
Kip Boyle: ... it's got to be dealt with beforehand otherwise you're going to lose your option. You may not want to take the option to prosecute somebody, but knowing that you have it I think is very important.
Jake Bernstein: Yep. I agree. If you don't have any idea about your tools in the toolbox, then you're never going to know you can use them. Simple as that.
Kip Boyle: Right. Okay, well, that wraps up this episode of the Cyber Risk Management Podcast. Today we talked about the Computer Fraud and Abuse Act, a 35-year-old law that has been amended quite a bit, and how it can help cyber risk managers practice reasonable cybersecurity today. We'll see you next time.
Jake Bernstein: Thanks everyone.
Kip Boyle: Thanks everybody for joining us today on the Cyber Risk Management Podcast.
Jake Bernstein: Remember that cyber risk management is a team sport and needs to incorporate management, your legal department, HR, and IT for full effectiveness.
Kip Boyle: And management's goal should be to create an environment where practicing good cyber hygiene is supported and encouraged by every employee. So if you want to manage your cyber risks and ensure that your company enjoys the benefits of good cyber hygiene, then please contact us and consider becoming a member of our cyber risk management program.
Jake Bernstein: You can find out more by visiting us at cyberriskopportunities.com and Newmanlaw.com. Thanks for tuning in. See you next time.
YOUR HOST:
Kip Boyle
Cyber Risk Opportunities
Kip Boyle is an information security expert with 20 years of experience and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cybersecurity consultant at Stanford Research Institute (SRI).
YOUR CO-HOST:
Jake Bernstein
K&L Gates LLP
Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and litigator.