
EP 57: The new “At a Minimum” FTC standard
About this episode
July 7, 2020
Kip Boyle, CEO of Cyber Risk Opportunities, and Jake Bernstein, JD and Cybersecurity Practice Lead at Focal Law Group, discuss the FTC's new "at a minimum" language in its cybersecurity decisions and what that means for cyber risk managers.
Episode Transcript
Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help you thrive as a cyber risk manager. On today's episode, your virtual chief information security officer is Kip Boyle and your virtual cybersecurity counsel is Jake Bernstein. Visit them at cyberriskopportunities.com and focallaw.com.
Kip Boyle: So, Jake, what are we going to talk about today?
Jake Bernstein: Hey, Kip. Today, we're going to talk about the FTC's latest and greatest decisions and the, quote, "at a minimum" cybersecurity standards now being used to enforce the reasonable cybersecurity standard that you and I have talked about many times before.
Kip Boyle: Okay. So ladies and gentlemen welcome to yet another legal episode. You knew this was going to happen sooner or later so don't be surprised and wipe the smirk off your face. All right. So this is good though because FTC has been pursuing reasonable cybersecurity for, what, 12 years now or 10 years now?
Jake Bernstein: I think closer to 20 actually.
Kip Boyle: Okay, 20. Because I first became aware of it in 2010. So about 20 years. So this seems like a big change. But have they changed anything or is this more of an evolution?
Jake Bernstein: I would say it's more of an evolution. I think that there are some commentators on LinkedIn and the like who have mourned the lack of "reasonableness" language in some of these current cybersecurity consent decrees and decisions. But it's not a sea change. It's more of a... I think it's actually a helpful evolution, and we'll kind of talk about that.
Kip Boyle: Okay. All right. So evolution it is. Let's get right to it. Bring us up to speed.
Jake Bernstein: Definitely. Okay. So first we're going to be looking mostly at the FTC's September 3rd 2019 decision in the LightYear Dealer Technologies matter. And for those who are following along at home, this decision can be found on the FTC site or by simply searching FTC LightYear decision in your search engine of choice. So there are a few interesting things to note about this decision as a preliminary matter.
First, LightYear Dealer Technologies, as you might expect, is a software company serving car dealerships. The actual product was called DealerBuilt and it was a dealer management system, basically a CMS for car dealers. But the FTC brought this case at least partially under the Gramm-Leach-Bliley Act's standards for safeguarding customer information, the Safeguards Rule. You might be thinking, GLBA, that's a financial institution law, isn't it? And you'd be correct.
Kip Boyle: I think I can see where this is going.
Jake Bernstein: Now, what's so interesting here is that this case is therefore also somewhat of an expansion of what financial institution means to the FTC, at least in terms of cybersecurity. The reason that the FTC decided to use GLBA here is they said that DealerBuilt was "significantly engaged in data processing for its customers, and those customers were auto dealerships that routinely extended credit to consumers."
So what I take away from this part of this case is be careful when you're processing data that's related to any kind of credit because if you are doing so, you very well may find yourself under the safeguards rule. And I'm not sure if we're going to have time this episode to talk about the safeguards rule, but it's something that we will probably bring up in future episodes. But it is another version of the FTC's standard that has been around for quite some time. So I just wanted to kind of get that out there so people are aware that this is an interesting case for that purpose.
Kip Boyle: Okay. So I got to say. When you first said LightYear Technologies, my brain went immediately to the first Toy Story movie when Buzz Lightyear showed up and Woody started picking on him and calling him Light Beer and Light Snack and really riffing on his name. So when you said this was a car dealer software, that just threw me for a total loop. All right, anyway. Just that's where I went in my head. So sorry anybody if you thought that was a weird pause. But on with the show.
Jake Bernstein: That was a weird pause. But to bring us back on track, I'm going to tell you about the breach because we always like to know about the breach, right?
Kip Boyle: Yes, yes, yes. But wait a minute. Before we do that, I just want to make another point, which is that for the longest time, even though I had bought cars, I did not understand that every car dealership in America, and probably in other parts of the world, is actually a little bank branch. I mean, they do the same stuff. Like you just said, they're credit granting. They do loans. They evaluate people's credit records. So they've got all this PI. So I'm really glad to hear that the FTC has classified them as a financial institution, because that's what they are in all but name. So this is good.
Jake Bernstein: And this ties in well because what do you need to check people's credit?
Kip Boyle: Among other things, you need my social security number.
Jake Bernstein: That's right and it goes well beyond that, doesn't it? It goes-
Kip Boyle: Oh, you need-
Jake Bernstein: ... name and address.
Kip Boyle: Yeah.
Jake Bernstein: A lot of times-
Kip Boyle: It's everything you need to steal me digitally.
Jake Bernstein: Exactly, right? So what we had here was a situation where there were 12.5 million consumers' records stored in 130 DealerBuilt customers. Or I should say stored in a database that contained the information of about 130 DealerBuilt customers, and guess what? The social security numbers were not encrypted. They were just in plain text.
Kip Boyle: Oh, ouch.
Jake Bernstein: So it's unclear how much data was really downloaded. I think somewhere around 70,000 customers had their information fully downloaded. And just to be clear, it was a lot more than just social security numbers as we were talking about. But that in short is the simple version of the breach story. And so-
Kip Boyle: Do we know who did the download?
Jake Bernstein: We do not. At least it's not in the information that I have seen or gone through recently.
Kip Boyle: Do we know how they got notified that there was a data breach?
Jake Bernstein: Yeah, I believe their chief technology officer actually noticed some strange happenings and investigated and went from there. The full story if people are interested is in the complaint, which can also be found on the FTC's website.
Kip Boyle: Okay. The reason why I asked this is because I'm always interested to know how did a data breach come to the light of day? Because it's almost never the case that the organization themselves figure it out first. It's almost always the case that an outsider figures it out first and then notifies the offending organization or the victimized organization. So how interesting.
Jake Bernstein: It is very interesting and it's... Well, if we go through the facts of the breach, I think you can see that this was a predictable event. Going through the complaint here, you can see that there were open connection ports that were not in any way blocked, particularly to their network attached storage device, their NAS device. The respondent, LightYear, did not perform even basic vulnerability scanning, pen testing, or any other diagnostics that could have allowed it to detect the open port, and they really just didn't take security very seriously.
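For readers who want a concrete sense of how basic the missing diagnostics were, here is a minimal sketch, in Python, of the kind of open-port check the complaint describes as absent. It is illustrative only and not taken from the complaint; the host address and port list are placeholders, and real vulnerability scanning would use a dedicated tool such as nmap.

```python
import socket

# Minimal sketch: probe a short list of commonly exposed TCP ports on one host.
# The host and port list below are placeholders for illustration only.
PORTS_TO_CHECK = [21, 22, 80, 443, 445, 873, 3389, 5000]

def open_ports(host, ports=PORTS_TO_CHECK, timeout=1.0):
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the TCP connection succeeds,
            # i.e., when the port is open and reachable.
            if sock.connect_ex((host, port)) == 0:
                found.append(port)
    return found

if __name__ == "__main__":
    # 192.0.2.10 is a documentation-only address (RFC 5737); replace it with
    # a host you are authorized to scan.
    print(open_ports("192.0.2.10"))
```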
Kip Boyle: This is a blast from the past. This sounds like 1998 to my ears.
Jake Bernstein: So actually I was partially correct. The chief technology officer was the person who first was notified of the breach. But guess what? It was actually a customer who called the chief technology officer and said, "Why in heck is all of this customer data publicly accessible on the internet?"
Kip Boyle: Okay. That's sounding more likely, a more common way to find out that you've been data breached. And it's a terrible way to find out.
Jake Bernstein: It gets worse.
Kip Boyle: Oh, because I'd rather have the FBI, the Secret Service come and tell me, to be honest with you, if I had a choice.
Jake Bernstein: The truly bad part is that it was only after a security reporter provided information to LightYear regarding the actual vulnerability that LightYear was able to figure out the source of the vulnerability which was ultimately this open port on the storage device.
Kip Boyle: Was it Brian Krebs? Was he the security reporter?
Jake Bernstein: He could have been. That information is not in the complaint.
Kip Boyle: Okay. That sounds very Krebs like.
Jake Bernstein: It does, doesn't it? This was back in November of 2016 when this happened.
Kip Boyle: Oh my god. That's a million years ago.
Jake Bernstein: It is, yes, but the actual litigation and everything that happened was much more recent than that.
Kip Boyle: Oh, yeah. The wheels of justice. [crosstalk]
Jake Bernstein: It's also unclear to me exactly when this was reported to the FTC and when they started their investigation. So regardless, the breach itself, as we just said, is fairly typical, right?
Kip Boyle: Yeah.
Jake Bernstein: Which I think in this case actually helps matters, because it's not one of those one-in-a-million hacks that's like, "Oh my gosh. I didn't see that coming." This was a run-of-the-mill...
Kip Boyle: Oh, yeah. Very vanilla.
Jake Bernstein: Open port, super vanilla, absolutely preventable data breach.
Kip Boyle: Yep. And as I said, this sounds like 1998 all over again. It just sounds like the security management practices were of another era.
Jake Bernstein: Or just non-existent.
Kip Boyle: In this case, that's the same.
Jake Bernstein: Yeah, it is. We know part of how this ends, right? There is a 20-year requirement, and the way that they've worded this is interesting. So the actual decision says that LightYear is prohibited from transferring, selling, sharing, collecting, maintaining, or storing personal information unless it establishes, implements, and thereafter maintains a comprehensive information security program that protects the security, confidentiality, and integrity of personal information.
Kip Boyle: That sounds pretty standard for FTC.
Jake Bernstein: Pretty standard, yep. Pretty standard. But here's what's changed...
Kip Boyle: Yeah. I was about to say, so far this is all sounding very vanilla, very common. When do we get to the new stuff?
Jake Bernstein: Right now. This is not the first order to use this language, but it is a good one, and we've sort of discussed already the reasons I've chosen this one to talk about. But here's what it says. Immediately after telling this company that they're going to have to establish this comprehensive information security program, it says, "To satisfy this requirement, respondent must, at a minimum," colon. This is different. This is the new, quote, "at a minimum" language, and I think it's really interesting because in the past, the FTC has laid out various consent decrees that have similar language related to what they think people should be doing for information security.
But this is the first time with using the at a minimum language where they actually say, "Okay, look. We're going to tell you roughly speaking how to satisfy this requirement." That's good. I think it's mostly good. I think there are some possible concerns about it. Maybe that's a different episode.
Kip Boyle: Well, they have never wanted to go into these waters before. I know that, right? From our previous conversations, they didn't really want to be prescriptive.
Jake Bernstein: That's right. And the reason is simple, right? If they prescribe certain treatments, then the bad guys are going to read this because these are public documents and they're going to say, "Great. We'll just do something slightly different in order to bypass these requirements."
Kip Boyle: I think that's part of it. But I think the other part of it too is, look at PCI DSS, right? They are extremely prescriptive, and I think one of the problems with that is that people get so locked into a checklist mentality. And checklists are stale the moment you make them, because cyber is a dynamic risk and partially for what you said about the adversary: once they know what your battle plan is, they're going to go around it, and you don't want that.
Now, you get weighed down because okay, here's the checklist and then you got to maintain that checklist. So then there's a whole bureaucracy around maintaining it, updating it. What should go on it, what shouldn't go on it? Should you ever let anything leave the checklist? This thing just turns into a hairy monster.
Jake Bernstein: It does. As a brief digression, I think it's worthwhile to quickly unpack this, our anti-checklist stance, at least with cybersecurity. Think about where checklists are deployed and used very effectively, and I think of two primary situations. One is pre-flight on a plane and another is in surgeries.
Kip Boyle: Right, right. And Atul Gawande had a wonderful article in the Atlantic magazine and then he published a book about the checklist and how wonderful it is.
Jake Bernstein: Exactly. And it's wonderful when you don't have a thinking adversary who is responding to your checklist. A mechanical device like a plane doesn't take action to screw up your checklist, and the procedures in a surgery are always the same. You want to make sure for example that you don't leave a surgical instrument inside someone's body.
Kip Boyle: Or a sponge.
Jake Bernstein: Or a sponge. All of these things have happened and checklists are amazing at preventing that kind of thing, right?
Kip Boyle: Definitely.
Jake Bernstein: They're really, really good at it.
Kip Boyle: Yep. Static threats. That's what checklists are good for.
Jake Bernstein: But they fail completely when it comes to dynamic threats, which of course is what you have with cyber. So let's go through this and let's decide at the end if the FTC has fallen victim to the dynamic threat checklist failure.
Kip Boyle: Pitfall.
Jake Bernstein: Pitfall, yes. You have to come up with some clever acronym to describe this.
Kip Boyle: The great debacle.
Jake Bernstein: Yes. It's a fallacy. There you go. The dynamic checklist fallacy or the dynamic threat checklist fallacy. I'll have to figure that out some more.
Kip Boyle: Okay. Watch out ladies and gentlemen. Jake is working without a net.
Jake Bernstein: Yeah. You've heard it first.
Kip Boyle: But he's working without a net so it could get ugly.
Jake Bernstein: All right. I'm going to go back to my net. So here we go. Just as an example, if you go back to the Fandango settlement from 2013, you're not going to find this type of language. It's obviously not completely different, because as we said very early in this episode, this is an evolution, not a revolution. So it's not like they completely changed things.
But I think the "at a minimum" language is great because it does actually help us answer perhaps one of the most common questions from clients, Kip, which is, "Okay, but what should we really do about cybersecurity?"
Kip Boyle: Right. Where's the checklist, man? I mean, I think that's part of what's going on here. There's a human appetite, a craving, for a checklist.
Jake Bernstein: There is, because it's not just that checklists are easy, it's that checklists allow non-experts to effectively engage in expert-type behavior, right? A checklist allows the smartest person on the planet to come up with the ideal system, and then everyone else can follow it. As we've talked about, that's amazing for static threats. The problem with cybersecurity, of course, is that it's not. Not static, that is.
Kip Boyle: Right.
Jake Bernstein: So let's just go through and kind of touch on the first few of...
Kip Boyle: Yeah, please.
Jake Bernstein: Because guess what? There is a list. In fact, it's A, B, C, D, E, all the way down to I.
Kip Boyle: Little I?
Jake Bernstein: Well, these happen to be capital letters.
Kip Boyle: Well, I know that in a brief, a capital I and a little i can be different.
Jake Bernstein: Oh, they definitely are. Did you know, randomly, that those little i's are actually... They're called Romanettes.
Kip Boyle: I did not know that.
Jake Bernstein: You have Roman numerals, which are the capitals, but the lowercase Roman numerals are called Romanettes.
Kip Boyle: Ladies and gentlemen...
Jake Bernstein: That is the kind of value that you get by listening to the Cyber Risk Management Podcast.
Kip Boyle: That's right. There's a value add, Romanettes. Okay.
Jake Bernstein: Romanettes, yep.
Kip Boyle: It sounds like the Rockettes or something like a dancer troupe.
Jake Bernstein: I know.
Kip Boyle: Okay.
Jake Bernstein: Okay. People are going to start fast forwarding us.
Kip Boyle: They already did that man.
Jake Bernstein: They probably did. Okay. So, A: document in writing the content, implementation, and maintenance of the information security program.
Kip Boyle: Okay. That's a written information security program, a WISP.
Jake Bernstein: It's a WISP. Sort of a checklist, right?
Kip Boyle: Yeah.
Jake Bernstein: I mean, I suppose you could check that off, but that is not a very helpful checklist in the scheme of things. In order to check this box, so to speak, you need an expert to help you write and implement and maintain your information security program. So I think so far the FTC is doing pretty well. They have not yet fallen for the dynamic threat checklist fallacy.
Kip Boyle: Keep going.
Jake Bernstein: DTFA. B: provide the written program, and any evaluations thereof or updates thereto, to the respondent's board of directors or governing body. If no such board exists, then you can give it to a senior officer, but you have to do it at least once every 12 months and every time there is a "covered incident." So let's pause on this one for a second. One, this is reasonably checkboxy insofar as it's write a report and give it to someone. But does anything about that strike you as really interesting?
Kip Boyle: Well, so far they're not telling you anything about what should be in the report.
Jake Bernstein: That's true.
Kip Boyle: So that's interesting.
Jake Bernstein: But let's focus for a second on what it means... Why would they say, give it to the board of directors?
Kip Boyle: Okay. So the board of directors is typically a pseudo-independent decision-making body from the management team. Am I getting warm? Am I getting warm?
Jake Bernstein: You are getting warm, yeah. At least under American corporate law, the buck stops with the board of directors. And I think what's really great about this is that it helps eliminate any argument that cybersecurity failures aren't a corporate management failure. Now, they are, right? I think you and I would agree with that.
Kip Boyle: When they happen.
Jake Bernstein: When they happen. One of the big problems, at least historically, has been that the boards don't always know what's going on. They don't always think to ask, although I think it'd be hard to do that now. But this provision here just eliminates that. It also prevents the board from being able to say, "Well, we didn't know." They must be told. So I think that's useful.
Kip Boyle: Right. They must be told and if they're not told, then that's a violation.
Jake Bernstein: Yep. Now, this one, C, actually truly is a checkbox-type item: designate a qualified employee or employees to coordinate and be responsible for the information security program.
Kip Boyle: Okay. What I like about this checklist so far is none of it strikes me as having a short shelf life as being too detailed or too ephemeral. So this is good so far.
Jake Bernstein: All of this is at a minimum. So this is the minimum floor of what you need to be doing.
Kip Boyle: And I'm sure car dealerships will do well in excess of the minimum.
Jake Bernstein: Oh, yes. Now, D gets interesting. So again, this is at a minimum: assess and document, at least once every 12 months and promptly following a covered incident, internal and external risks to the security, confidentiality, or integrity of personal information that could result in the unauthorized disclosure, misuse, loss, theft, alteration, destruction, or other compromise of such information.
Now, let's pause and bask in the glory that the FTC is essentially telling people to do what we've been telling people to do all along, which is manage your cyber risks: figure out what they are, know what they are, assess and document them. We don't usually toot our own horns on the podcast too much, obviously, but I think it's really valuable to point out that the work we do with our clients is an annual assessment and documentation of the internal and external risks to the security, confidentiality, or integrity of data. That's what we do. And to see the FTC come out and put it as the first real piece of guidance is, I think, very exciting and helpful.
Kip Boyle: Well, okay. So I've got to admit that so far I had some trepidation about this checklist for all the reasons we discussed, but so far I actually like it.
Jake Bernstein: Well, and the reason is that it's not really a checklist, right? It's not a checklist in the way that people want a checklist. This is just a revised and, I think, easier-to-digest version of the previously used reasonable cybersecurity standard. But when all you have is words, the way you organize them obviously matters.
Kip Boyle: And the other thing I like about this is that they're not getting into so many details. It's not too prescriptive so far. I mean, it is a little bit, but it's not telling you like what the length of the encryption key should be on your WPA2 setting.
Jake Bernstein: That's true, and they'll never do that. But ironically, the very next point here, E, is a bit more prescriptive. And let me just read some of this to give you... I'll summarize.
Kip Boyle: Okay.
Jake Bernstein: So basically: design, implement, maintain, and document safeguards. So now we're talking about controls, to include, again at a minimum: employee training at least once every 12 months on how to safeguard personal information; technical measures to monitor networks and all systems, so really we're talking about IDS, IPS; and data access controls for all databases that store personal information, including, at a minimum, restricting inbound connections to approved IP addresses. So what's that? That's an IP whitelist, right?
Kip Boyle: Mm-hmm (affirmative).
Jake Bernstein: Requiring authentication to access them and limiting employee access to what is necessary to perform that employee's job function. So these are all fairly standard security practices.
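As an illustration of the "approved IP addresses" and least-privilege ideas Jake just described, here is a minimal sketch in Python. The network ranges, roles, and permissions are invented for the example; the decision itself does not prescribe any particular implementation.

```python
from ipaddress import ip_address, ip_network

# Minimal sketch of two safeguards named in the order: an allowlist of approved
# source networks for inbound database connections, and role-based limits on
# what employees can access. All values below are invented for illustration.
APPROVED_NETWORKS = [
    ip_network("203.0.113.0/24"),   # e.g., the corporate office
    ip_network("198.51.100.0/24"),  # e.g., a managed VPN range
]

ROLE_PERMISSIONS = {
    "sales": {"read_customer_contact"},
    "finance": {"read_customer_contact", "read_credit_application"},
}

def connection_allowed(source_ip: str) -> bool:
    """Reject inbound connections from any address not on the allowlist."""
    addr = ip_address(source_ip)
    return any(addr in net for net in APPROVED_NETWORKS)

def access_allowed(role: str, permission: str) -> bool:
    """Least privilege: a role only gets the permissions its job requires."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(connection_allowed("203.0.113.25"))                  # True
print(connection_allowed("192.0.2.7"))                     # False
print(access_allowed("sales", "read_credit_application"))  # False
```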
Kip Boyle: Oh, I would say so. In fact, I would say that these things have been practiced by most organizations for years now, and when I reflect on this list with respect to this particular incident, I can see why the FTC feels the need to formalize it, because "at a minimum" is, I think, a very good moniker for this.
Jake Bernstein: It is. It is at a minimum, for sure.
Kip Boyle: Yeah. This is a minimal minimum.
Jake Bernstein: Yep. So number four here gets a little bit more prescriptive: encryption of all social security numbers and financial account information on the computer networks. But you'll note, it doesn't say how or what. It just says encrypt it somehow, which gives a lot of leeway. And then, policies and procedures to ensure that all devices on the respondent's network that can access personal information are securely installed and inventoried at least once every 12 months. So that's really just part of the identification of assets. That is the end of E.
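For the encryption requirement, a minimal sketch of field-level encryption might look like the following. This assumes the third-party Python cryptography package purely for illustration; as Jake notes, the decision does not say how to encrypt, and real deployments would manage keys through a secrets manager or KMS rather than generating them alongside the data.

```python
from cryptography.fernet import Fernet  # third-party "cryptography" package

# Minimal sketch of field-level encryption for a sensitive value such as a
# Social Security number before it is stored. Key handling here is deliberately
# simplified; in practice the key would come from a secrets manager or KMS.
key = Fernet.generate_key()
fernet = Fernet(key)

ssn_plaintext = "000-12-3456"  # fake example value
ssn_ciphertext = fernet.encrypt(ssn_plaintext.encode())

# Store only ssn_ciphertext; decrypt on demand when an authorized user needs it.
recovered = fernet.decrypt(ssn_ciphertext).decode()
assert recovered == ssn_plaintext
```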
F is: assess, at least once every 12 months and any time there's a covered incident, the sufficiency of those safeguards. So that's also standard NIST Cybersecurity Framework guidance. We have to test and monitor the effectiveness of the safeguards at least every 12 months. Maybe we have to modify the information security program based on those results. We have to select and retain service providers capable of safeguarding personal information, which I think is pretty interesting. In fact, this decision contains a contractual requirement for service providers to implement and maintain their own safeguards.
And then finally, I is to evaluate and adjust the information security program in light of changes to operations or business arrangements, et cetera. So basically keep it updated and maintain it as you normally would.
Kip Boyle: So would you say that this is a subset of the NIST Cybersecurity Framework? I mean, was there anything in here that we haven't seen in NIST CSF?
Jake Bernstein: I don't think it is a subset of the NIST CSF. I think what it's getting at is that there's a lot of different ways to phrase what's in the NIST CSF. And I think this is an FTC style restatement of it. We both know from the past that the FTC likes the NIST CSF. It clearly influences them. They haven't gone so far, as far as I know in a consent decree or decision to directly reference the CSF, but this gets pretty close because they're substantively on point.
Kip Boyle: Right. And the reason I'm asking you this question is I'm thinking about the customers that we serve and the way we serve them today, generally speaking. So I'm asking myself, do we need to change our approach? Do we need to change our advice? Do we need to change our tooling to keep up with this evolved "at a minimum" standard?
Jake Bernstein: I don't think we need to change it greatly. I think that it might be helpful to explain to people that, "Hey, look. The CSF is not the law, but here is what the FTC says is the law. And it tracks with the CSF very well." So I think that's what we continue to do.
Kip Boyle: Okay, okay, cool. Wow. All right. So this is great. I have to say, I think the FTC is going in the right direction.
Jake Bernstein: I think they are too. And what's really exciting about this one for me is that it's not finished yet. What we have talked about so far all falls under the heading of the mandated information security program, and this part, the new "at a minimum" language, is I think helpful. But this decision, and all the subsequent ones, and even the ones that are a bit earlier than this, follow this pattern.
After they tell you what you're going to do at a minimum, the FTC builds in required third-party assessments of your information security program. Part of this is what they did before, but you'll see how it's changed. What the FTC wants to see now is a qualified, objective, independent third party, whose name is provided to the FTC, to help determine compliance with the company's own information security program, and to identify new gaps and specific evidence of compliance on a biennial basis.
So that's really helpful in my mind, because what it does is add accountability to these consent decrees and decisions. And that's one of the things the FTC was really going for here: for quite some time, they had been putting in place these 20-year consent decrees, but there wasn't necessarily accountability for them. They've added that, and the way they've added it is by mandating these third-party assessments and mandating cooperation with the third-party security assessor.
And here is the new part, annual certification. This is pretty interesting. This requires a senior corporate officer to certify in writing that the company is compliant with the information security program requirements. And this is the really interesting part. The certification must be based on the personal knowledge of the officer or upon personal knowledge of subject matter experts that the officer can rely upon. So let's unpack this quickly.
What it is saying is it is trying to find a way to force companies who are subject to one of these decisions or these consent decrees to come to the FTC every year with a legal document stating that they're compliant with their own procedures and their own program.
Kip Boyle: Wow. This is starting to sound like a SOC 2. It's kind of going in the direction of a highly formal external audit.
Jake Bernstein: So it is, right? But what's interesting is that a SOC 2 is kind of a point-in-time third-party certification. Remember, the SOC 2 is the third party certifying that these controls are in place. It's pretty specific, right? This is an annual certification from within the company, from management-level individuals, that the company is completely following its own security program. And as we just talked about, these security programs aren't just controls, right? That's the big difference between a SOC 2 and one of these information security programs: a SOC 2 looks at controls, whereas the full overarching security program builds in the review, the lessons learned, the modification, the assessments, all that stuff.
Kip Boyle: Yeah. But it's making me wonder now if this is going to create a need or a demand for some kind of new specialized outside service provider or product to deliver this, because I just have this sense that the senior corporate officer isn't going to do it based on their personal knowledge; they don't have the time or expertise. So they're going to have to rely on an SME. And I'm just asking myself who they're going to turn to and rely on for this certification. Are they going to turn to their own chief information security officer internally? And will that person rely on their own judgment and their own team, or will either of these people say, "We should probably get an outsider to give us an XYZ-standard assessment, and then we'll rely on that"? I don't know. Just wondering.
Jake Bernstein: Yeah. It certainly could be, and I think what's so interesting about this is that on a SOC 2, it's the assessor who signs their name, right?
Kip Boyle: Yep.
Jake Bernstein: Here, it's the senior corporate officer who signs their name. That's a material difference.
Kip Boyle: Oh, it absolutely is. It's the difference between an external audit and a management action, right? They may look similar on the surface, but the whole tone and the experience on the ground is very different.
Jake Bernstein: It is indeed. There's no need to go through the rest of this, but the only other requirement worth noting is that if there is a covered incident, basically a security incident of some kind, you have to tell the FTC. And in fact, you have to notify the FTC within 10 days of notifying any other government agency regarding a covered incident.
Kip Boyle: So if I don't tell other government agencies, I don't have to tell FTC?
Jake Bernstein: No, that's not how it works. It's just a situation where-
Kip Boyle: Okay. That's what it sounded like.
Jake Bernstein: The respondent has to report within a reasonable time after the date of the respondent's discovery of a covered incident, but in any event no later than 10 days after reporting to any other entity. So basically, you can't discover something, tell your state AG's office, and then just not tell the FTC.
Kip Boyle: Okay, okay. All right. So one more question and then I think we need to wrap up this episode. All of these "at a minimums" that we've been talking about for the past few minutes, do they only apply to companies that are sanctioned by the FTC, or can we expect that everybody should start doing this stuff so that they don't have to be sanctioned?
Jake Bernstein: There are really two ways to answer that question. One, of course, is that from a sanctions legal perspective, the exact terms of these consent decrees only ever apply to the companies that have been sued, right? But your question is really different, which is, shouldn't we be taking this new "at a minimum" language and applying it to everyone? And the answer, of course, is yes. That's a lot of what the FTC is trying to do here: through individual decisions, consent decrees, and cases, help everybody else meet at least some minimum cybersecurity standard. So yeah, absolutely. If you do this stuff, then you're much less likely to be sued, because you're less likely to suffer a breach in the first place.
Kip Boyle: Right.
Jake Bernstein: And even if you do suffer a breach, it's worth remembering that the FTC isn't going to go through this whole process if you already have a demonstrable information security program in place. That's not a case they would necessarily win and it's not a case they want to spend their limited resources on. So following this is a very good, shall we say immunization or vaccine against future problems.
Kip Boyle: Right, okay. That makes sense. Well, I'm going to be studying this document quite a bit following the recording of this episode. Okay. Any last words?
Jake Bernstein: Not today. We'll probably have many more words about the Gramm-Leach-Bliley Act and its standards for safeguarding customer information. But not this time.
Kip Boyle: Okay, cool. So that's something that we can all look forward to, and that wraps up this episode of the Cyber Risk Management Podcast. Today we talked about the FTC's new "at a minimum" language in its cybersecurity decisions and what that means for you, dear listener, a cyber risk manager. We'll see you next time.
Jake Bernstein: See you next time.
Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. Remember that cyber risk management is a team sport, so include your senior decision-makers, legal department, HR, and IT for full effectiveness. If you want to manage cyber as the dynamic business risk it has become, we can help. Find out more by visiting us at cyberriskopportunities.com and focallaw.com. Thanks for tuning in. See you next time.
YOUR HOST:
Kip Boyle
Cyber Risk Opportunities
Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cyber-security consultant at Stanford Research Institute (SRI).
YOUR CO-HOST:
Jake Bernstein
K&L Gates LLC
Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and a litigator.