EP 33: How the blame game that follows big data breaches affects defenders
About this episode
August 6, 2019
Kip Boyle, CEO of Cyber Risk Opportunities, talks with Jake Bernstein, JD and Cybersecurity Practice Lead at Newman Du Wors LLP, about what cyber risk managers can still learn from the 2012 data breach at the South Carolina Department of Revenue.
Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, CEO of Cyber Risk Opportunities, and Jake Bernstein, cybersecurity counsel at the law firm of Newman Du Wors. Visit them at cyberriskopportunities.com and newmanlaw.com.
Jake Bernstein: So Kip, what are we going to talk about today?
Kip Boyle: Today, Jake, it's going to be a history lesson. We're going to talk about the 2012 South Carolina Department of Revenue breach, the resulting theft of tax records, and then the resulting IRS fraud and identity theft that came from that.
Jake Bernstein: And I assume that we'll be talking about what we can learn from all of this because, as we know, those who don't learn from history are doomed to repeat it.
Kip Boyle: Yeah, that's right. I like history, but I don't really like dragging it into the podcast for our audience unless there's something really good that we can take away from it. And even though this data breach at SCDOR happened quite some time ago, seven years now, there are some really interesting things that we can learn from it, specifically around consequences, both the actual consequences as well as the lack of consequences that came from this.
Jake Bernstein: Okay. Then let's go ahead and just cover the basics, right off the bat. As you mentioned, this is a 2012 breach and it started, like many breaches do, with a phishing email.
Kip Boyle: Yeah, exactly. I think the statistic I see right now is that over 90% of all cyber attacks start with a phishing email, unless you're Mossack Fonseca or Equifax, in which case your terrible patching policies will make it much easier for people to attack you.
Jake Bernstein: That's not something to be proud of, though.
Kip Boyle: No, no. Not at all. But I mean, let's face it. It's easy to find out if you haven't been patching; it's harder to write a convincing phishing email. So just don't make it easy, right? But on August 13th, 2012, an employee at the Department of Revenue received an email. It was a phish. It contained malicious code. They clicked on it, they downloaded malware onto their official work computer, and the back door to the entire network was open.
Jake Bernstein: And so what happened is that, roughly two weeks later, that employee's username and password were used to remotely access her work account. Then the attackers installed additional malicious code throughout the South Carolina Department of Revenue network and began their reconnaissance. And they made sure, as many attackers do, to install a back door so they could keep getting access to the network, even if their credentials changed.
Kip Boyle: Yeah. The same thing happened at Equifax in fact, is you install a back door so that you don't get locked out. But once the attackers were in the network, they gathered more credentials; they continued to explore; they mapped everything out; they learned where the most sensitive data was located, and ultimately they began copying tax payer records into what you might call an internal staging directory. Which is to say, they started to accumulate all this data into a single point on the internal network.
Jake Bernstein: Yeah. And then, only after they had staged it, did they actually exfiltrate, or steal, the data by sending it out on the internet. Very briefly, it's worth mentioning that this is not an uncommon attack pattern, where data is staged internally. They do that because the exfiltration stage is the most important to them, and it's probably one of the times they have the highest risk of being detected, because information is moving from a network, through that network's firewall systems, out to the internet. But even with this careful planning, it only took these guys 32 days from the phish to grab almost 75 gigabytes of data. And then, just to make it harder for the good guys later on, they deleted all the collected files and tried to hide their tracks. This breach didn't get discovered until October 10th, when state IT officials noticed, and I quote, "suspicious activity," and sent the discovery up the chain. I said "quote" because it's almost always this "suspicious activity" that gets discovered. And I was going to ask, too: how common is it to have hackers try to clean their tracks?
Kip Boyle: Yeah. In this case, I would actually say cyber criminal because, yeah, I've noticed lately people tend to use the word hacker a lot. It's become this wildly overloaded term, and it's got a benign connotation to it, or even maybe a little bit of a fluffy connotation to it. Anyway, I like to say cyber attackers these days. But yeah, it is common for cyber attackers to clean up after themselves and stay really clean while they're performing reconnaissance in your network. In fact, one of the indicators of compromise that I talk to our customers about is: hey, if you go to install security updates on a machine that you thought didn't have patches on it, but it actually does, you should start to worry, because that could mean you have an intruder who's doing a better job of patching your systems than you are, because they don't want to get thrown out.
They really try to stay neat and tidy and clean up after themselves. The thing that's interesting about the way the discovery happened, though, is, well, two things. First of all, SCDOR had terrible detective capabilities, because you really should notice if 75 gigabytes are starting to collect somewhere on your network. I mean, that should be, I would think, an unfamiliar pattern, right? It should look very, very different from the familiar patterns of how you move data. And then another thing that I think is interesting is, in modern times, like right now, I don't know that a sophisticated cyber attacker would try to collect, stockpile, and then move 75 gigabytes all at once. We've seen cases where that information is trickled out almost as they acquire it, because companies have become better at noticing when large stores of data start to emerge in places that don't make sense. And especially if you've got what's called egress monitoring, that is, you're watching what's leaving your network, large bursts of traffic should set off alarms.
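The egress alarm Kip describes can be sketched in a few lines. This is a toy illustration, not a real detection system: the window size, the spike factor, and the hourly-byte-count input are all assumptions chosen for the example.

```python
from collections import deque


class EgressMonitor:
    """Toy egress monitor: flags an interval whose outbound byte count
    far exceeds the recent baseline average."""

    def __init__(self, window_hours=24, spike_factor=10.0):
        self.history = deque(maxlen=window_hours)  # rolling baseline window
        self.spike_factor = spike_factor           # how many times baseline counts as a spike

    def observe(self, bytes_out):
        """Record one interval of outbound traffic; return True if it looks like a spike."""
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            is_spike = baseline > 0 and bytes_out > self.spike_factor * baseline
        else:
            is_spike = False  # not enough history to judge yet
        self.history.append(bytes_out)
        return is_spike
```

A 75-gigabyte burst leaving a network that normally moves a few hundred megabytes an hour would trip even a crude threshold like this one; the real engineering work is in keeping the baseline honest and the false-alarm rate down.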
Jake Bernstein: Absolutely.
Kip Boyle: Yeah.
Jake Bernstein: Now, in this case, for the times, it was a pretty typical data breach.
Kip Boyle: It was.
Jake Bernstein: And the bad guys didn't get detected until well after the fact. And by the way, we haven't mentioned it, but to this day, no one knows who did this. I'm sure there are people who have ideas, but no one has even been arrested or accused of this.
Kip Boyle: But it still cost a ton of money, right?
Jake Bernstein: It did.
Kip Boyle: We're talking about a $14 million bill to the state of South Carolina. And what's really interesting, and I think this is a typical pattern for the expenses of a data breach, is that $12 million of the total $14 million, so the vast majority of that, was to pay Experian for one year of credit monitoring service for all affected taxpayers.
Jake Bernstein: Which is somewhat ironic given that, as we mentioned, it was Equifax who was famously breached, and Experian has dealt with its own security lapses as well.
Kip Boyle: And I, quite frankly, don't see a ton of value in credit monitoring anyway.
Jake Bernstein: No, not even... [crosstalk]
Kip Boyle: I mean, it seems like a band-aid on a pretty big gash of a wound. Anyway, it's ironic that that's the vast majority of the cost there. But then, half a million dollars to an incident response firm named Mandiant. Some of our listeners may have heard of them; they're pretty high profile. $150,000 to a public relations firm. And, I'm sure this will make you weep, Jake, a hundred thousand dollars to some lawyers.
Jake Bernstein: Yep. That sounds about right. In terms of what makes this breach interesting, it's a couple of things. I think the main point of talking about this breach so long after it occurred is to look at the aftermath with the benefit of hindsight and history. And specifically, the very intense blame game that occurred as a result of the breach, and then, related to your comment about the value of credit monitoring, how the tax records were used to perpetrate tax fraud.
Kip Boyle: Yes. Be prepared, ladies and gentlemen of the audience. This isn't really a conversation about the technological details of the breach. What we're going to talk about now is how people dealt, or failed to deal successfully, with the facts, and how those facts were used in this blame game. If you ever become involved in a data breach and everything downstream from it, what we're about to cover is something you can expect to experience for yourself.
Jake Bernstein: Yeah, exactly. And so we're going to start by discussing that blame game. Basically, every person, agency, and company that was involved and could possibly be blamed for this breach was blamed, and they were all being blamed by each other. It became this large governmental finger-pointing exercise. In addition to all of this finger pointing, which, as you can probably tell, was not particularly helpful, something even more insidious happened, which I hope people have learned from over the, gosh, seven years since this happened. But it is, I think, still a not uncommon trend in breach response, which is that everyone seems to want to find that single point of failure: that one thing that, if only they had done something about it, would have prevented the breach. And my thesis here is that this is a dangerous way to think. Kip, why don't you kick off our discussion by explaining why this... Well, first of all, whether you agree with me. And then, if you do, why you agree with me.
Kip Boyle: Well, okay. First of all, I agree with you. This idea that there was a single thing that, had we done it differently, would've made all the difference in preventing the breach: it's a very simplistic way of thinking, and I think the folks most prone to thinking like this are the folks who really don't understand the situation on the ground, as it were. In other words, these are people who are not very technical, who probably don't understand how very big, complicated data networks are put together and how they're operated and maintained over time.
Yeah, I think a big contributing factor here is that you've got people pointing fingers at each other and trying to blame each other, but they don't really understand the nuances and the details of what's going on. And so they just tend to think about it very, very simply, right? They'll say things like, "It's not our fault," right. You just get these blanket denials, and then you get blanket allegations like, "Well, you should have encrypted the data," or, "you should have had multifactor authentication enabled." And then you get the Department of Revenue itself saying, "Well, the IRS didn't tell us that we needed to do anything differently than we were already doing," right? Yeah, this is very simplistic thinking.
Jake Bernstein: It is. And let's unpack some of those. First, the single point of failure: that's the dangerous game of trying to find one single thing. And I think it's not only people who aren't necessarily technical who have a problem with it; even IT people who are not steeped in security thinking may be tempted to say, "Oh, if only we had encrypted that," or, "Gosh, if MFA had been enabled, then that person couldn't have gotten in." But that doesn't-
Kip Boyle: Yeah, I think that's true.
Jake Bernstein: But that doesn't... Yeah. And the failure there is to understand how the kill chain works. Every-
Kip Boyle: Right. Now, let's talk about what the kill chain is before we start dissecting it.
Jake Bernstein: Exactly. Why don't you go ahead and just describe what the kill chain is?
Kip Boyle: Right. Very simply, the kill chain is a common pattern for how cyber attackers commit data breaches. And we've really talked about it already as we described the history of this particular data breach. The thinking is that if you have a chain of events, and those events have to happen in a certain sequence in order for cyber attackers to get in, steal what you have, and get out again, then the defender's perspective is: "Okay, here's a chain of events. If I can sever the links in this chain in one or more places, then the entire attack fails, no matter how far you get. If you can get five steps into a six-step kill chain, but I can stop you on the sixth step by breaking that link, then the entire attack falls apart." That's the idea of a kill chain, and that's what we're talking about.
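That sever-any-link logic can be captured in a few lines. The stage names here follow the common Lockheed Martin formulation of the kill chain rather than anything specific to the SCDOR case, and the blocked-stage sets are purely illustrative.

```python
# Stages of the classic cyber kill chain, in order.
KILL_CHAIN = [
    "reconnaissance",
    "weaponization",
    "delivery",               # e.g., the phishing email arriving
    "exploitation",           # e.g., the employee clicking the malicious link
    "installation",           # malware and the back door being installed
    "command_and_control",    # attacker remotely directing the compromised host
    "actions_on_objectives",  # staging and exfiltrating the data
]


def attack_succeeds(blocked_stages):
    """The attacker must complete every stage in order, so blocking
    any single stage defeats the entire chain."""
    return not any(stage in blocked_stages for stage in KILL_CHAIN)
```

The asymmetry is the defender's one advantage: the attacker has to win every link, while the defender only has to break one. That is exactly why fixating on a single control misses the point; any of several links could have been the one to break.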
Jake Bernstein: Exactly. And another way to think about it in more business terms is that, this is just a multi-stage project that has dependencies. And at any given point, you might have multiple ways to complete a given sub-project, but there's also going to be... As you get toward the end, the options to complete your project get smaller and smaller and smaller. And that is very similar to what happens with cybersecurity and kill chains like this.
Kip Boyle: Yeah. And before you keep going, I want to make one more comment because I think this is important. We've said this many times in previous episodes, but it's very important for people in our audience to understand that when you're talking about a multistage project, you're not talking about one person attacking you; you're talking about a group of people attacking you. Just like it takes a team of people to make major changes to your computing systems or to complete a big multistage project, that's what's going on here as well. If you're still thinking that it's just a bunch of teenagers sitting around doing all this on their own, that's not true.
Jake Bernstein: No, it's not true. And to highlight what was going on after this breach: different parties, depending upon who they wanted to deflect blame to, were laser-focusing on specific parts of the kill chain and trying to argue that the points of failure were someone else's fault. It was just very unhelpful. And one of the things you mentioned... well, actually, all of the things we mentioned were said. IT people were saying it should have been encrypted. There was an outside security vendor involved, so the state blamed that group.
But then, the IRS being blamed by the governor of South Carolina at the time is very interesting. The reason she did that is that the IRS had rules and regulations in place for how states needed to transmit tax information to the IRS. This is necessary when you have a state income tax and you have to send those tax records to the federal IRS, and the IRS's compliance requirements actually didn't include encryption. So when the governor of South Carolina said, "Your standards were archaic, and this is your fault because you didn't make us encrypt," she was basically trying to deflect at least some percentage of the blame onto a completely separate organization that, I suppose, could have told them to do something differently.
Kip Boyle: Right.
Jake Bernstein: But, ultimately, what did all of this blame deflection accomplish?
Kip Boyle: Well, I want to make one more comment about why blaming is so prevalent, and it's because, I mean, who wants to be held accountable, right? Nobody wants to lose their job. So with a lot of this simplistic thinking, "Well, it's their fault because they didn't do this one thing," I think a good amount of the reason that happens is just plain fear. People just don't want to be pointed out as the one who didn't do X, and therefore we had this horrible thing happen.
Jake Bernstein: And the reason for that comes down to liability concerns. At the end of the day, as we said, this cost $14 million. Who has to pay that? Who's going to get sued? Who's going to potentially, of course, lose their jobs? Everyone wants to avoid responsibility for this. And I think, those acts of self-preservation have really served to diminish our collective ability to respond effectively to the changing landscape of cyber attacks. Particularly because one of the outcomes of this single point of failure focus is that, let's say, you activate encryption or MFA, there's a strong tendency to feel like, "Great, I'm secure now because I fixed what happened to me, and so that's not going to happen again." And you know what? That's probably true, but, Kip, does that make them secure?
Kip Boyle: No, not at all, because there's multiple ways to get things done, right? That was a point that you made earlier in the episode, is that you're talking about a... What'd you call it? A multi-stage project. And so there's more than one way to do this. Let's say you turn on multi-factor authentication. Well, great, you may not get breached in the future through the failure of multi-factor authentication, but you could have a failure because something else was compromised. So yeah, that's absolutely a fallacy in the thinking.
Jake Bernstein: It is. And one of the other things that's really interesting here, just finishing the thread of the intergovernmental blaming, is that Governor Haley at the time made a big deal of the fact that the Department of Revenue was IRS compliant, and it was. And she was blaming the IRS for lax security requirements. But there's a more important point here, which is the difference, as we've talked about on this podcast before, between compliance and actual security.
Kip Boyle: Right. And as we've also talked about, compliance is usually driven by some sort of a checklist. The problem with checklists is that cyber risk doesn't really follow checklists. Cyber risk is dynamic; it's constantly changing. Any checklist that you create today to deal with it is going to rapidly become stale and brittle, and will become even more so over time, because we're not dealing with a risk that is enduring, like fire. We've talked about that, right? Fire is just oxygen, fuel, and heat. That's what it is; that's what it'll always be. But cyber is never the same. It's always changing.
Jake Bernstein: Right. And I think the problem with this viewpoint, whether it's pinpointing, deflecting, hunting for a single point of failure, or focusing on compliance, is that it fails to appreciate how companies and organizations should manage cybersecurity, which is, of course, as we've said many times, as a risk. The risk-based approach expands your ability to see the whole picture, and it should allow you to get past that single-point-of-failure blame game and move on to far more effective cybersecurity risk management.
Kip Boyle: Well, I think that's one takeaway from this. I think another takeaway is that you're going to end up in a blame game anyway. Even if you don't want to be in a blame game and you're not an active participant, that almost opens you up to having everybody point at you and say, "Well, you're the one that's responsible for this, so you're the one that has to pay for it." It's unfortunate that the blame game is probably going to happen anyway; it still happens today in 2019, and it'll probably continue to happen in 2029. But the issue is that, even if you prevail in the blame game, you still have a bunch of cyber risk that you have to manage.
Jake Bernstein: You do. There's one more point that Governor Haley raised that I think is actually very, very important, which is some questions that she asked. "Okay, this happened. We think the IRS should have done more." But moving beyond blaming the IRS, she asked, "How much security is enough? Who gets to decide? And who is going to enforce it, and how?" We can't possibly answer those questions right now, but it's telling that, seven years later, we still don't have answers to them.
Kip Boyle: No, we don't. Not only do we not have good answers to these questions on an organizational basis inside our country, but from a national security perspective, we don't even know what the answers are. This is something that is confounding everybody at all levels of our economy and our government.
Jake Bernstein: Yep. And we promised, at the start, to talk a little bit about accountability and the lack thereof, so here's a story to raise awareness about the difficulty people had in holding anyone accountable after this breach. We've talked about how effective the blame game was. What happened was that a class action was brought against South Carolina just months after the breach was discovered, but this lawsuit was ultimately dismissed. And while some of this is changing because of either new laws or new case law, one thing does remain true, which is that linking specific cases of identity theft to any given breach is difficult to impossible, and-
Kip Boyle: This was a class action lawsuit brought against South Carolina by the citizens of South Carolina? Who brought the suit? I'm curious.
Jake Bernstein: It was, yes. It was individuals, individual citizens, who were victims of the breach.
Kip Boyle: Got it.
Jake Bernstein: And by the way, I'm not sure we mentioned it, but this affected 75% of the population of that state. And what made this hard is that there was eventually a huge rash of ID theft and tax fraud, but it happened later, and so this case was dismissed. And you might be wondering, Kip, "Why? Why was it dismissed?"
Kip Boyle: Yeah. Well, yeah, I'm wondering for sure.
Jake Bernstein: And I will tell you that the legal reason was, quote, "lack of standing based on a failure to show Article III harm," end quote.
Kip Boyle: Okay. No, that...
Jake Bernstein: Does it-
Kip Boyle: That sounds like some serious lawyer jargon.
Jake Bernstein: Yeah, it is. And doesn't that just make it all clear for you? But basically, the Constitution, and I mean the federal Constitution, requires that parties bring actual cases or controversies to federal court. One essential component of having a real case is harm. And the judge felt that, because the plaintiffs in this case couldn't show that any of them had actually lost money or been victimized, the harm they were claiming was just too speculative.
Kip Boyle: And that's because they sued fast. Whereas, the ID theft happened later. Is that one of the things we can see in retrospect?
Jake Bernstein: That is part of it. I think it's also... I mean, remember, this was 2012. I think there was a failure to understand how criminals use information. Prior to this Department of Revenue breach, most data breaches were really after payment card data, and payment card data expires pretty fast: you can use it, but as soon as you start using it, the credit card industry's very sophisticated fraud detection systems will just cancel those cards. What people didn't understand was that when you grab a huge number of documents containing Social Security numbers, address information, everything that's in tax records, which is a lot of personal information, that stuff doesn't expire. You can use it years later to open new credit accounts, to basically commit identity theft on a broad scale. And that is what happened.
Kip Boyle: Right.
Jake Bernstein: [crosstalk]
Kip Boyle: That's what makes health records so valuable.
Jake Bernstein: That's what makes health records so valuable. The bad guys are smart. Again, fire doesn't innovate, but cyber criminals do. One of the ways they innovate is they say, "Okay, well, the credit card companies have made payment card data less useful, so we'll just go after something else. We're creative, and we'll find a new way to monetize it." And that's what happened here. But like I said, it happened years later. And of course, even if they had tried to bring the lawsuit again, how are you going to show that it wasn't some other data breach by then that was responsible?
Kip Boyle: Yeah. It's difficult to actually get evidence that's clear and compelling and draws a straight line from the data breach to the harm.
Jake Bernstein: Yeah. And I think we're getting better about this. One of the ways this is being solved is, for example, the California Consumer Privacy Act, which hasn't gone into effect yet but will next year. It bypasses this whole standing issue. One, by being a state law that you can bring in state court. And two, by basically presuming harm and giving you the right to collect statutory damages. Those are basically just a number that the legislature decides: "Okay, this is how much it hurts. If you can prove actual damages, be our guest. By all means, go ahead and do that. But if you can't, here is a minimum sum that we are going to say is your damage." And this is common. This is how CAN-SPAM works, and the California anti-spam law too. It's hard to prove harm from receiving some spam email, so there are statutory damages. And I just wanted to say-
Kip Boyle: Okay. The CCPA also redefined the definition of harm, too, didn't it? I mean, it did an end run around the whole issue of not being able to show harm.
Jake Bernstein: I wouldn't actually go that far, only because, after the South Carolina lawsuit, other courts have disagreed with that analysis. This is such a rapidly developing area of law that there's only limited value in examining the South Carolina case at this point. Though it was only seven years ago, that's essentially prehistoric times in data breach litigation. We'll probably revisit this on a future podcast.
Kip Boyle: But I think it's interesting that here we've dredged up something from seven years ago, which in internet years is forever, right? And we're still wrestling with some of the core concepts: what is harm, how do you prove harm, how do you govern this stuff, and how do you adjudicate for people who've been wronged? We're still trying to figure this stuff out, and I think that's why it's relevant to our listeners today.
Jake Bernstein: It is. And let's be honest, we're still dealing with the blame game and single point of failure obsession and failure to understand risk based management. It's absolutely relevant, even if some of the specific details are less so.
Kip Boyle: Yeah. Hey, listen, before we wrap up the episode, I want to talk about one thing that I think is really important to end on. It goes back to this single point of failure idea. I just have to call out the fact that people were saying, "Well, if only we had multifactor authentication," "if only we had encryption," and so on and so forth. But actually, if you go back and look at the details of exactly how these cyber attackers got in and did what they did, what you find is that a lot of the things people talked about downstream from the hack would've made no difference.
For example, if the data had been encrypted. Well, here's the thing. If the data's encrypted, then you can't easily steal it off of a hard drive. But if you can get into the application that uses the data, then you can absolutely get to it once it's been decrypted, because the application has to decrypt it in order to let authorized users have access to it. It's just stunning to me how some of these "we would've been okay if" conversations had no basis in reality.
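Kip's point can be shown with a toy sketch. Everything here is hypothetical: the service, the record IDs, the credentials, and the "encryption" (a base64 stand-in, not real cryptography). The point is structural: encryption at rest protects the raw store, but an attacker holding a phished user's valid credentials goes through the application and receives plaintext anyway.

```python
import base64


def toy_encrypt(plaintext: str) -> bytes:
    # Stand-in for real at-rest encryption (illustrative only; base64 is NOT encryption).
    return base64.b64encode(plaintext.encode())


def toy_decrypt(ciphertext: bytes) -> str:
    return base64.b64decode(ciphertext).decode()


class TaxRecordService:
    """Hypothetical records app: data is encrypted on disk, but the
    service decrypts it for any caller presenting valid credentials."""

    def __init__(self):
        self._store = {"record-001": toy_encrypt("taxpayer record 001")}
        self._passwords = {"employee": "hunter2"}  # the phished credential

    def fetch(self, user, password, record_id):
        if self._passwords.get(user) != password:
            raise PermissionError("bad credentials")
        # Authorized users must see plaintext, so the app decrypts here,
        # and so does anyone who stole an authorized user's password.
        return toy_decrypt(self._store[record_id])
```

Reading the raw store yields only ciphertext, so "encrypt the data" did its narrow job; but a login with the phished employee's password returns the full plaintext record, which is roughly the position the SCDOR attackers were in.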
Jake Bernstein: Yep. And that's common. Everyone wants to be the Monday morning quarterback on this kind of stuff. And when you get into that level of detail, it's never that simple.
Kip Boyle: Well, I think it's important for our risk managers listening to realize that there are some very common prescriptions that people talk about, like, "We should use encryption. We should turn on a multifactor authentication," but make sure... Or when you grab onto those things, or you hear people talk about them, that you actually do your homework to make sure that that is an appropriate medicine for what is ailing you, because it isn't always.
Jake Bernstein: Yep. Exactly.
Kip Boyle: Okay. Well, that wraps up this episode of the Cyber Risk Management Podcast. Today we talked about how the 2012 South Carolina Department of Revenue breach, and the blame game that followed, still has lessons for us today. Don't focus on single points of failure, and remember to manage your cyber risk holistically and as a dynamic thing. We'll see you next time.
Jake Bernstein: See you next time.
Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. Remember, that cyber risk management is a team sport and should incorporate management, your legal department, HR, and IT for full effectiveness. Management's goal should be to create an environment where practicing good cyber hygiene is supported and encouraged by every employee. If you want to manage your cyber risks and ensure that your company enjoys the benefits of good cyber hygiene, then please contact us and consider becoming a member of our cyber risk business strategy program. Find out more by visiting us at cyberriskopportunities.com and newmanlaw.com. Thanks for tuning in. See you next time.