EP 159: FTC 2023 Privacy and Data Security Update


About this episode

June 6, 2024

What kinds of unfair trade practices does the FTC look for when it comes to privacy and data security? Let’s find out with your hosts Kip Boyle, CISO with Cyber Risk Opportunities, and Jake Bernstein, Partner with K&L Gates.


Episode Transcript

Speaker 1: Welcome to the Cyber Risk Management Podcast. Our mission is to help executives thrive as cyber risk managers. Your hosts are Kip Boyle, Virtual Chief Information Security Officer at Cyber Risk Opportunities, and Jake Bernstein, partner at the law firm of K&L Gates. Visit them at and

Jake Bernstein: So Kip, what are we going to talk about today in episode 159 of the Cyber Risk Management Podcast?

Kip Boyle: Hi, Jake. We are going to do something that we do every now and then on the show, which is we are going to turn our attention to privacy and data security updates, and specifically what we're going to do is run through a relatively recent government publication which you showed me and you assured me would not be boring.

Jake Bernstein: It's not too boring.

Kip Boyle: Okay, not too boring, but necessary, right?

Jake Bernstein: Very necessary.

Kip Boyle: Yeah. We all have to know this. As cyber risk managers we have to be aware of this. These are the things that are going to, or they continue to put guardrails, boundaries, fences, whatever you want to call them, around what is considered to be acceptable behavior. We've got to know this stuff. So in late March, the Federal Trade Commission published its 2023 privacy and data security update, and it covers the FTC's enforcement and regulatory activities between January of 2021 and December of 2023. So that's a two year review, right, Jake?

Jake Bernstein: Three. It's three full years.

Kip Boyle: It's three. All of '21, all of '22, and all of '23.

Jake Bernstein: All of '22, all of '23, yep.

Kip Boyle: Three years. And given the FTC's central role in US cybersecurity regulatory enforcement, I mean, they own the reasonable cybersecurity standard, right?

Jake Bernstein: They do.

Kip Boyle: Yeah, I agreed with Jake that this is worth an episode.

Jake Bernstein: And obviously Kip, yes, I agree completely, I did push this. The FTC has just been incredibly active the last few years. Depending on the outcome of the 2024 elections, it could remain so for the foreseeable future. And look, the importance of the FTC, even in light of the many, many state comprehensive privacy laws that are being passed, is that the FTC remains the primary, but not the only, federal regulator of cybersecurity and privacy failures, and it is mostly failures, for the general economy. Recall that banks and financial institutions are covered by the Gramm-Leach-Bliley Act, which involves a host of federal regulators, including the FTC, and that the healthcare industry, HIPAA, PHI, all that good stuff, is primarily regulated and enforced by the Office for Civil Rights within the Department of Health and Human Services.

Kip Boyle: And though we're not going to talk about it in this episode, didn't I recently hear some rumblings about the possibility of a federal data privacy law?

Jake Bernstein: Well, yes you did, and I would say it was a lot more than rumblings about a possibility. There was a gigantic roar of a discussion draft that was released. But you're right, we're not going to talk about it because I don't want to jinx it. But suffice it to say, if... I will name it, however, if the American Privacy Rights Act becomes law, it will be an absolutely historic sea change in how this country handles privacy and cybersecurity, and we will almost certainly do several episodes about it. But unlike with the American Data Privacy and Protection Act, which I believe we talked about for at least one whole episode, I'm not going to do that with APRA until it passes, because there's no point.

Kip Boyle: All right, all right. And I agree, I think that's the way to go. So we'll dig into this. Now, you promised me this is just... Sometimes you show up and you go, "I promise it's one episode," and then it turns into two. That's kind of your sneaky backdoor pattern here. So I just want to make sure this isn't going to happen again with this one, right? Because I'm going to put the hammer down.

Jake Bernstein: I think you should, and my yawn indicates already that it's not our normal recording time, so I'm glad everyone got to hear that.

Kip Boyle: And you're so concerned about me putting the hammer down that you yawned. I am so unthreatening. It's just-

Jake Bernstein: You are unthreatening there.

Kip Boyle: It's embarrassing.

Jake Bernstein: But no, it's not another DBIR miniseries in the making. Speaking of which, is the [inaudible] out?

Kip Boyle: But we do need to... It is out.

Jake Bernstein: We need to get there.

Kip Boyle: We do.

Jake Bernstein: Okay, well, let's dig into this thing, being this FTC cybersecurity and privacy report. Now, allegedly they've been doing this for some time. I don't remember seeing it, maybe because I didn't look in the past. Regardless, whether it's new or not... If we had an easy feedback system, someone would probably inform me that it is not new, but-

Kip Boyle: They can all find us on LinkedIn. People will flame us without any hesitation. So flame on, everybody.

Jake Bernstein: So this report, which as you said, covers three full years, is a great resource, and it really is worth starting, as one does, at the beginning, because in the preface, which is written by the director of the Bureau of Consumer Protection-

Kip Boyle: Well that harkens back to your days in the state, right?

Jake Bernstein: Oh, yes, yes. I mean the FTC, I go way back with the FTC, before I even learned about data security and privacy issues. I guess it was related, but yes, I do. And consumer protection is perhaps the natural location for all of this work because it isn't antitrust, and unless something changes, those are the two main divisions of the FTC: the Bureau of Consumer Protection and the Bureau of Competition, which handles antitrust. So what do we get in the preface here? We get four bullet points that really highlight what the FTC sees as its major achievements, and those are actions related to artificial intelligence, as demonstrated by the Rite Aid, Ring, and Amazon orders and a few others. Number two is a focus on protecting children and teens, demonstrated by the Epic Games and Edmodo cases. And again, a few others.

They focused on protecting "sensitive data," which at least the FTC has defined as health-related and location-related data. And for these ones we have BetterHelp, GoodRx, Premom, Flo Health, Rite Aid again, and Kochava, which is worth talking about. And then last, but by no means least, they highlight the actions that they directed at the market as a whole. So this is things like rulemakings, amending old rules, and they did quite a bit of that. We're probably not going to go into detail on that because otherwise, Kip, it will become a multi-part episode. We're going to instead focus on the cases, because I think you can learn the most from cases. I mean, that is the essence of the American legal system, so it ought to be how we treat this report.

Kip Boyle: And MBA students everywhere also agree that cases are the way to go. Because, yeah, this sounds like a lot. And I want to also add, for everybody listening, that unlike what I thought, the FTC isn't just a bunch of government lawyers and suits. They have a lot of other people who work there too. They have economists, investigators, technologists, and all kinds of other specialists, and okay, they probably all wear suits, but they're not all lawyers, so they're not operating in a complete vacuum here. So I think it's worth reminding people about this, just so nobody will assume that the FTC will forget about the technical stuff, because they don't.

Jake Bernstein: Yeah, it's really worth making that clear. I mean, they do hire... They have red teams and purple teams, and they know what they're doing, they know what they're looking for. And I think forgetting that the FTC will look at source code, for example, is a good way to make a major mistake. So let's start with some overall statistics. I think this is helpful. Now the FTC has been doing this for about 20 years. Actually, since 1999, which is almost exactly, unfortunately, 25 years. I say unfortunately because that's when I graduated high school, which means my 25th high school reunion is in the near future, which is great.

Kip Boyle: Wait to out yourself. Keep going.

Jake Bernstein: Yeah, it's all right. It's pretty obvious. I think that's probably on LinkedIn. But they've brought 97 privacy cases, which does not include the 169 Telemarketing Sales Rule and CAN-SPAM cases. So whether or not you think that's a lot, it's a lot to me. When I think about the FTC, I think about one or two cases a year that really make the news. They've had more recently, but 97 cases since '99 means they've done about four major privacy cases per year. So I think that's a good reminder. I also bet you, Kip, I have not done this analysis, but I bet you those are way skewed to the last 10 years.

Kip Boyle: Yeah, probably. I mean, as the use of computers and databases and everything... Data's the new oil, so of course, all eyes are on it, and it's more intense than it used to be.

Jake Bernstein: It is indeed. Okay, so why don't you go ahead and dig into it maybe a little bit like the DBIR, but not totally.

Kip Boyle: Yeah, sure. The report's only about 39, sorry, 38 pages long, but it's dense, it's densely packed. So let's look at artificial intelligence. And I've got to say, I didn't expect to see anything about artificial intelligence in here, or at least not under that label. So that was surprising. Even though it's been in the news a lot, I'm just used to regulators being far behind the curve, but certainly that doesn't seem to be the case here. And of course artificial intelligence puts all kinds of technology into play. Now the showpiece case in this category is the Rite Aid facial recognition matter, and that got in right under the wire because it was filed on December 19th, 2023, which is just a few days before the close of the period. And the case is about Rite Aid's alleged failure to take reasonable steps to ensure that the AI facial recognition system it used in its retail stores did not erroneously flag consumers as shoplifters or wrongdoers.

In other words, Rite Aid bought and implemented a system that could impact people's legal rights without testing it to make sure that it actually worked. Presumably they just relied on vendor assurances. Now, the FTC alleges that the technology was used unfairly, and that Rite Aid failed to consider and address the risk of misidentification of people of color, failed to assess the accuracy of the technology, used low-quality images, and failed to train or oversee the employees who were using it. And even worse, they didn't monitor the rate of false positives. If this is all true, then I think this is a good example of a complete policy failure at the management level, because that's where all of the remedies or the preventatives should have come from. So if the FTC wins this case, Rite Aid will be prohibited from using facial recognition technology for security or surveillance purposes for five years, which almost doesn't seem like that bad of a deal, because I know that the FTC can set consent decrees that go up to 20 years in cases of information security failures.
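One of the alleged failures, not monitoring the rate of false positives, is easy to make concrete. Here's a minimal sketch (the data shape is hypothetical, not drawn from the actual Rite Aid record) of computing a false positive rate from a human-audited sample of match decisions:

```python
def false_positive_rate(matches: list[tuple[bool, bool]]) -> float:
    """matches holds (system_flagged, actually_a_wrongdoer) pairs from a
    human-audited sample. FPR = innocent people flagged / total innocent."""
    false_positives = sum(1 for flagged, truth in matches if flagged and not truth)
    innocents = sum(1 for _, truth in matches if not truth)
    return false_positives / innocents if innocents else 0.0

# Three innocent shoppers, one flagged in error: FPR is 1/3.
audit = [(True, False), (False, False), (True, True), (False, False)]
```

Tracking a number like this over time, broken out by store and demographic group, is roughly the kind of monitoring the complaint says was missing.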

Jake Bernstein: And they are going to be required to... They're not allowed to use a technology for security purposes. Think about that. I guess the way the FTC would think about this is they've so blown the trust that they had that we're not going to trust them with facial recognition technology for at least five years. I don't know. Something about that just seems a little off. I think that-

Kip Boyle: What do you think is more reasonable?

Jake Bernstein: I'm not sure I would've banned them from using it at all. I think I would've required that they put it through some major privacy review and compliance processes, just because... I mean, this system didn't work. Now, maybe it's the case that we don't want AI-powered facial recognition systems anywhere, but that to me seems more like a legislative decision, not something that a regulator should just decide on its own. But that's my opinion. That's my opinion.

Kip Boyle: Well, I don't struggle with it in the way that you described. I actually think that it's... I stack it next to the city of San Francisco, for example, which outright banned the use of facial recognition on city streets by its police force because of similar failings. And I'm aware of other uses in policing, where police departments use systems like this one that was described, and they recognized somebody, so-called recognized them, as a wanted fugitive. It was the wrong person. And this person ended up in jail for many days and had their reputation destroyed, and it turned out that it wasn't that guy. So to me, it's not surprising. "Oh, here's yet another failed attempt to do some sort of asset protection in the real world with one of these kludgy systems."

Jake Bernstein: And I think that's the key, is that you can't use something that doesn't work just because you say it's for security purposes. And I think that... Let's say there's a hypothetical imaginary facial recognition system that works perfectly. I don't know. I guess that's what I kind of think by... That becomes a legislative question. If we have one that works perfectly, that seems like it's a decision for lawmakers. One that's busted, which maybe all of them are right now, I think that this clearly caused enough damage, enough harm that the FTC thought it was worthwhile to bring this case. It is actually a lawsuit. It's not an administrative action. So we will see where this goes over the course of the next couple of years.

Kip Boyle: They must have less of a problem with the technology and more of a problem with the way management went about doing it?

Jake Bernstein: That is what I would expect. Let's move on to one that was finished. This is the federal court order against Ring, and just to get people in the right time period... I mean, obviously we know it's the last three years. This was another complaint that was filed in May of 2023, so not quite under the wire, but still more recent. And this one had to do with compromising customers' privacy. Ring makes all these connected home security cameras, and the FTC alleged that they illegally surveyed customers in private spaces of their homes, and most importantly, failed to take reasonable steps to prevent hackers from gaining access to the customer accounts, which therefore meant access to live streams and stored videos.

So this is a more traditional cybersecurity, data security type of consent order here, which they did agree to, on June 16th, 2023. I mean, that was 14, 15 days later, so they didn't really fight it. And the refunds are now starting to go back out. So this was a case where Ring had to pay $5.8 million for consumer redress and notify customers about the FTC action. Oh, by the way, this is technically in the AI section. Why is it in the AI section? Because Ring was using a lot of that data to create models and algorithms derived from it, and that was the unlawful review. So when I say that they had illegally surveyed customers, it isn't... Or surveilled? Surveilled.

Kip Boyle: Surveilled.

Jake Bernstein: It isn't that there was a human watching people walk around their house. It was that they were training AI models and algorithms using this video

Kip Boyle: To recognize when humans were walking around their house.

Jake Bernstein: Various reasons. So the order then required them to implement a privacy and security program with novel safeguards about human review of videos, as well as a bunch of other stringent security controls, including implementing multi-factor authentication for both employee and customer accounts.

Kip Boyle: What's a novel safeguard in this context? What does that mean, novel?

Jake Bernstein: Well, I mean, I don't think it... It just means something new. I mean, in this case, ask yourself how many companies are there with huge stores of streaming video footage from people's houses? And the answer is not that many, which means [inaudible]-

Kip Boyle: There's no such thing as the best practice.

Jake Bernstein: Which means there really wasn't necessarily such a thing to deal with this, so the FTC had to create one.
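Since the Ring order specifically required multi-factor authentication for employee and customer accounts, it's worth seeing what the most common flavor, TOTP, looks like under the hood. This is a minimal sketch of RFC 6238 (HMAC-SHA1 over a 30-second time counter); a real deployment should use a vetted library rather than hand-rolled code:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6, at=None) -> str:
    """RFC 6238 TOTP: truncate an HMAC-SHA1 of the current time-step counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

This reproduces the RFC 6238 test vectors: the ASCII secret "12345678901234567890" at time 59 yields 94287082 with eight digits.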

Kip Boyle: Okay, that's good. I get it. I find this whole product space, Ring and just a whole bunch of other stuff, just perplexing to me why people would willingly put this technology in their homes. It's creepy as heck, just like Alexa is creepy.

Jake Bernstein: In your opinion. In your opinion, man.

Kip Boyle: Of course it's all in my opinion, of course it's all in my opinion. This is my podcast with you. This is where we talk about what we think of this stuff.

Jake Bernstein: I have a Ring doorbell camera. It actually strikes me as rather different. I think this tended to be more about the inside-the-home security cameras, but I'm not sure. I think it probably encompassed everything, but I don't see quite the same level of harm for an external-facing camera.

Kip Boyle: I think for you personally probably not, if all you have is an outward facing video camera. However, this also does remind me of... Not in this report, but in other reports, about how Ring has been supplying that surveillance footage, the video footage to local police departments without warrants.

Jake Bernstein: And they have stopped that. And I can't remember who was more upset, the law enforcement community or even some people... I mean, that's a tough one. That's another one of those things where it's like... Well, it's not immediately clear what is right, what is wrong, which means it's probably a legislative type of issue to deal with. By the way, most of the time, I believe that was in relation to the external ones. Ring wasn't sharing in-home footage with police.

Kip Boyle: As far as we know.

Jake Bernstein: As far as we know. I'm pretty sure that... If I recall these articles, it was really about, "Hey, something happened across the street, but since you had a camera pointing in that direction, maybe it could help us solve this case."

Kip Boyle: But rather than coming to me, because it's my camera, they just went to Ring and said, "We want the video footage from the camera mounted on this house," and Ring went, "Okay, here you go."

Jake Bernstein: Yeah, and I can't remember now with the facts on that if the homeowner had to approve it or if there was an option. Maybe you're right, though. Maybe it was they just-

Kip Boyle: My best understanding was the homeowner wasn't even involved. Ring just acted like it was its own wholly owned surveillance system with which they could do whatever they-

Jake Bernstein: You might call it a distributed surveillance system.

Kip Boyle: Yeah, you might. Okay, well anyway, so just sorry, but I just had to point out that this stuff's creepy.

Jake Bernstein: It definitely has... There's a lot of... I mean, yes. When people at scale begin putting cameras and microphones inside their own homes, I do think one thing is very clear, and I'm glad the FTC pays attention to this stuff because no matter what vendor it is, if you're putting a camera and a microphone in your house, you probably care... You should care about the privacy implications of that.

Kip Boyle: Yeah, you should. And if you think that I'm overboard on this, and you may well, I would challenge you to read the book 1984 and then come back and tell me if you still think that I'm over the top on this, that it's not as creepy as I'm making it sound like.

Jake Bernstein: This is one of those public-private distinctions, and nobody would want the government to put cameras and microphones in their home. We tell ourselves it's okay if we do it by choice when they're still a private company. But a lot has changed since the... What does the CCTV stand for? Closed circuit television. That's a very different thing. Old school security cameras.

Kip Boyle: No internet connectivity.

Jake Bernstein: Yes. Closed circuit literally means it's only viewable within a certain place. Now, there are major pluses and minuses to both of these systems. I like being able to see who shows up at my door, even if I'm nowhere close to home because I have a cloud-enabled Ring system. But there are issues.

Kip Boyle: Well, I mean, for example, you'd better have that thing on a dedicated... I would say VLAN, except at your residence, you probably just want it on a guest Wi-Fi network.

Jake Bernstein: I should probably...

Kip Boyle: Okay, think about that for a while.

Jake Bernstein: Yeah, I'll think about that. Okay, let's move on to the next... Kip, I clearly was wrong. I cannot believe I even insinuated that this might be a short episode.

Kip Boyle: We're not capable of short episodes, that's why we use scripts and timers.

Jake Bernstein: Why don't you go ahead and... There's more.

Kip Boyle: There is more.

Jake Bernstein: Just letting everyone know there's more under the AI heading, but we're going to skip the rest of it and move into health privacy and security. And there's too much to cover here. In fact, Kip, there's too much to cover even though, as you said, it's "only" a 38-page document, because it is incredibly dense. And the reason it's so dense is that virtually every paragraph has a hyperlink to entire press releases and legal documents. So there's a lot going on in this thing. But go ahead and tell us about...

Kip Boyle: It's like version two of the NIST Cybersecurity Framework. They released this svelte PDF, but the links will take you off and you'll lose an entire afternoon reading all the stuff that they point you at. Okay, so let's move ahead to the health privacy and security section. And the first matter in there is related to BetterHelp, which is those two words run together. It's an online counseling service, and they have to pay $7.8 million in consumer refunds, and the reason why they have to do that is because they shared sensitive health data for advertising purposes with, guess who, Meta, who is, guess who, Facebook's parent, right?

Jake Bernstein: That is correct.

Kip Boyle: Now we may have talked about this in a previous episode, but it's worth repeating. Make sure that your marketing department is on the same page as whoever writes the privacy policy, because ultimately what got BetterHelp in trouble here was there was a misunderstanding of how data was being used. It's a great cautionary tale for all of our listeners. They just didn't disclose something that was going on, and they got hurt.

Jake Bernstein: It was almost worse. It was almost worse. They explicitly said that they weren't going to share or sell your personal information in their privacy policy. And the problem is that everybody involved on the high level decision making side was like, "Oh yeah, that sounds right. That's true." But when it came down to some nitty gritty details of what was actually happening... And this should shock nobody given all of what's been going on with the Meta pixel, and... I don't mean to pick on Facebook. In this case, it isn't really Facebook's fault. They offer a service that people can plug into and it offers useful things for a business, which BetterHelp is a business, but what companies using the Meta or X or even LinkedIn marketing technologies need to understand is that there is data passing between them and whatever advertising partner they're working with, and simply being a customer of BetterHelp tells you something about your health.

So again, we are oversimplifying these cases. Our goal is not to go into extreme detail on all of them. That would just take too long. But rather to let you know about this great resource, because if you're in any of these industries in particular, one, obviously feel free to call us for help. We don't say that often enough, but that is one of the reasons we do this podcast is to let people know about these things and that there are places they can go to help with the understanding of them. But in this case, this report is a great resource. The FTC put it out.
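To make the mechanics concrete: a marketing pixel is usually just an HTTP request that bundles an event name, the current page URL, and a user identifier, and fires it at the ad partner's servers. The sketch below is purely illustrative, with an invented endpoint and parameter names rather than any vendor's real API, but it shows why the page URL alone can leak health context:

```python
from urllib.parse import urlencode

def build_pixel_url(base: str, event: str, page: str, user_id: str) -> str:
    """Assemble the kind of GET request a third-party tracking pixel fires."""
    return base + "?" + urlencode({"ev": event, "dl": page, "uid": user_id})

# The event is an innocuous "PageView", but the page URL itself reveals that
# the visitor is signing up for anxiety counseling.
url = build_pixel_url(
    "https://tracker.example/collect",
    event="PageView",
    page="https://counseling.example/signup/anxiety",
    user_id="hashed-user-123",
)
```

This is the sense in which "simply being a customer tells you something about your health": the sensitive fact rides along in an otherwise routine analytics call.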

Kip Boyle: And they're telling us, "Hey, this is what we're looking at. This is what we're paying attention to."

Jake Bernstein: That is the point. As I've said before, the FTC Act is very, very short. The operative language for all of the stuff we're talking about is essentially that unfair or deceptive acts or practices in or affecting commerce are hereby declared unlawful. That's it.

Kip Boyle: That's a broad mandate.

Jake Bernstein: It's a really broad mandate, and there are all kinds of great reasons why that's the mandate. But hyper-specific regulations, which all of us in technology can agree would be utterly pointless, would be outdated and useless by the time they got finished. So in the absence of that kind of stuff, you have to look at what the enforcement agency is doing. And here's an entire PDF that consolidates three years of enforcement behavior, packages it up, and says, "You want to know what we care about? Here's what we care about." So in a similar vein to BetterHelp is the GoodRx case, which I know we talked about in a previous episode, so I'm not going to go into detail. But this is the very first case in which the FTC enforced its Health Breach Notification Rule.

And if you recall whatever episode it was, and I'm sure Kip will find it and put it in the show notes, you may recall that we ranted and raved to some degree about how there was no actual data breach, and there wasn't. But what there was, was the sharing of health-related information, and I think this was another one where it was against the very terms of the privacy notice that GoodRx put out. And regardless, it was against the Health Breach Notification Rule. So it was very similar: it involved sharing data with some large social media companies for advertising purposes.

And this one was in order to target ads to people who were buying certain medications or obtaining certain treatments. And I guess I don't have a lot of sympathy in this one, but... I guess that's why the FTC is going after surveillance capitalism, and we're not going to get into judgments about that kind of stuff, Kip, because this is a podcast about cyber risk management and not surveillance capitalism. But I will tell you that GoodRx, for its trouble, was hit with a $1.5 million civil penalty and can't do this anymore. So you can draw your own conclusions about whether we should or shouldn't do this kind of stuff, but right now the law is: don't do it.

Kip Boyle: Right. Okay, moving on to another case, which we may have never mentioned this one before.

Jake Bernstein: I'm pretty sure we've never talked about this one.

Kip Boyle: Okay. Well, the FTC settled with a company called Easy Healthcare Corporation, and they publish a menstruation and ovulation tracking application called Premom. Interesting. I'd never heard of that. Here-

Jake Bernstein: By the way, it's an app, Kip, because this runs on smartphones, [inaudible] called an application, apparently.

Kip Boyle: It's an app. We got an app for that.

Jake Bernstein: It is. This is a classic app for that situation.

Kip Boyle: Okay, great. So you had talked about how the FTC has all kinds of people working for them. Well, here's a great example of where they showed off their technical know-how. They alleged that Premom shared app event data containing sensitive personal health information with third-party advertisers like Google, and the company paid a $100,000 civil penalty, and this was another health breach notification rule case, by the way. And Jake, didn't they figure this out by actually sniffing the traffic?

Jake Bernstein: Yeah, well, this is one where you can look at the code, I believe. So the app events... So on a smartphone, there's not cookies. Things work a little different than they do on just the straight up web in a browser. But app event data is essentially similar to website type movement, telemetry-

Kip Boyle: Trackers, yeah.

Jake Bernstein: Cookie tracker type stuff. It's the equivalent. So the FTC... You think about what it would've taken for the FTC to figure this out. It's not obvious, and I tend to doubt that somebody complained because it isn't obvious.

Kip Boyle: Yeah, how would you know?

Jake Bernstein: I think this is just a really good example of they can get the source code, and if they can't get the... I mean, they can ask you for it and you'll have to give it to them. But they don't necessarily need the source code, because you're right, Kip. Just like you do when you're doing cookie analysis, you can-

Kip Boyle: Sniff the traffic.

Jake Bernstein: Sniff the traffic, exactly.

Kip Boyle: Yep. Assuming that it's not encrypted, or that you can grab it before it gets encrypted, or what have you. But however they did it, I think this makes the point that the FTC has smart cookies working for them. They understand technology. You're not going to just be dealing with a bunch of lawyers who don't understand code and traffic sniffing and that sort of business. So I think it's good for us as cyber risk managers to know that. And it turns out the FTC also settled with a company called Flo Health, which is a Premom competitor, for essentially the same thing as what Premom did. They shared users' sensitive health data with third-party analytics providers, which again reinforces the idea that data is the new oil and everybody wants it.
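The traffic-sniffing approach described here can be approximated with everyday tools: run the app through an intercepting proxy (mitmproxy, Charles, and the like), export the capture as a HAR file, and check where requests are going. A minimal sketch, with the caveat that the tracker domain list below is illustrative, not an authoritative blocklist:

```python
import json
from urllib.parse import urlparse

# Illustrative analytics/advertising hosts; a real audit would use a
# curated blocklist maintained by tracker-research projects.
TRACKER_DOMAINS = {"app-measurement.com", "graph.facebook.com", "appsflyer.com"}

def flag_tracker_requests(har_text: str) -> list[str]:
    """Return request URLs in a HAR capture whose host is a known tracker."""
    entries = json.loads(har_text).get("log", {}).get("entries", [])
    flagged = []
    for entry in entries:
        url = entry.get("request", {}).get("url", "")
        host = urlparse(url).hostname or ""
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            flagged.append(url)
    return flagged
```

Seeing sensitive event names in requests bound for third-party hosts is exactly the kind of evidence an investigator could collect without ever reading the app's source code.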

Jake Bernstein: And that was only two of seven bullets underneath the health information heading, so where are we... Okay. So the next one is super fascinating to me. This one is about location tracking. Now you talk about creepiness factor, Kip. You talked about all of us who willingly install microphones and cameras in our homes. Well, what about everybody who carries around a GPS location tracker in their pocket, or these days on their wrist, or in their car? And the point that I'm making here is that precise location data, which certain state laws define as roughly 1,800 feet of accuracy, which is what, a third of a mile, I believe, something like that. So pretty close. It's not read-the-license-plate level, but it's very close. Certainly close enough to make a lot of assumptions about where you go and what you do.

But it's everywhere. We have this data. We are creating so much data. It's honestly overwhelmingly ridiculous how much data we create these days, with all of these smart devices connected to the internet and all these computers. I don't know why it just occurred to me, maybe because I just saw a headline about Boeing launching its Space Dragon... No, there's a capsule going up today; I can't remember the name of it.

Kip Boyle: Oh yeah. I can't remember the name either, but I know what you're talking about.

Jake Bernstein: They're launching astronauts. And it just made me realize, appreciate the fact, that the computing power in one person's iPhone is magnitudes, many, many magnitudes bigger than all the computing power probably on the entire planet Earth when we went to the moon in the late '60s, '70s. It's just unimaginably more powerful. So we have all this data, and I think it's probably good that the FTC... this is one of the things they're looking at. And there are a number of cases here. I think the one that's most interesting is this company called Kochava, and I don't know that I'd heard of Kochava before this case.

Kip Boyle: Well, that's kind of par for the course. These are all data aggregators. I mean, they don't advertise.

Jake Bernstein: No. So Kochava was this data aggregator that compiles and sells consumers' precise geolocation data gathered from their cell phones. Now, let's just stop right there. I don't know that people are aware that this is happening, and I suspect that most people would not be okay with it.

Kip Boyle: If they understood it fully, yeah.

Jake Bernstein: If they understood it. So the FTC alleges that Kochava sells this data in a format that makes it easy to track consumers to sensitive locations, like medical facilities, places of worship, and homeless and domestic violence shelters. That last one is particularly dangerous.

Kip Boyle: Yeah, it is.

Jake Bernstein: Because if I'm a... I mean, I don't know that I need to explain why it's so dangerous, but let's just say-

Kip Boyle: Well, I have all the vocabulary, it's really simple, because I served on the board of directors for five years at a domestic abuse agency in the county that I live in, and it's really simple because we saw these things. So you're a domestic abuse victim/survivor. You're trying to get away from your abuser. So the abuser uses all this consumer tech to surveil you and specifically GPS tracking, and there are magnetized, fairly small sized GPS tracking devices, and we would find them in the wheel wells or in the engine compartments of these vehicles that these abuse survivors were driving around in and they were coming to shelter, they were going to get legal aid, restraining order and that sort of thing, and they were being tracked by their abusers the whole time. So super, super dangerous situation.

Jake Bernstein: And in this situation, the FTC also alleged that Kochava didn't have any technical controls to protect the privacy of the people whose data it had. And this is in their lawsuit. Again, a lot of these are true enforcement actions. I think I've bored people in the past about administrative law versus a filed lawsuit, but this is a full-on court case, federal district court in Idaho, and the court denied a motion to dismiss the complaint from Kochava, saying that the FTC stated a legally and factually plausible claim that Kochava's practice of selling vast amounts of data about mobile device users at least may violate Section 5(a) of the FTC Act by depriving consumers of their privacy and exposing them to significant risks of secondary harms. Now, this is a new case, and it is in active litigation, so it'll be one to watch.

Now, on the other side of this litigation-versus-administrative coin, we have an administrative settlement with a company called Support King, LLC, which formerly did business as, get this, Kip, SpyFone, spelled with an F instead of a P-H. And this was a case where, straight up, the FTC says that there was the licensing, marketing, and selling of stalkerware apps that allowed purchasers to surreptitiously monitor photos, text messages, web histories, GPS locations, and other personal information from the phone-

Kip Boyle: Domestic abuse.

Jake Bernstein: This is a situation where if you share your... I don't want to make people paranoid, but maybe people should be more paranoid. One of the reasons that you shouldn't share your phone password with anybody, or just password sharing in general, is that without having shared the password, it's very difficult to do this kind of stuff. But when someone has shared a password, it's trivial. So gosh, that's a really complicated thing to talk about, isn't it?

Kip Boyle: Yeah, it is.

Jake Bernstein: It is.

Kip Boyle: We have no visual aids.

Jake Bernstein: Aids. No, we have no visual aids. It's very hard. Suffice to say that Kip and I are both looking very thoughtful and somewhat distressed as we think about this. But it's a really important and very scary thing. I mean, Kip, I think I did know that you served on that board, but thank you for reminding me. This one in particular is just... I mean, I think the FTC is absolutely justified in going after a company like this, because basically... I mean, here's what they say. The FTC basically alleged that this company unfairly sold stalkerware apps without taking reasonable steps to ensure that the purchasers of these apps would use the apps only for "legitimate and lawful purposes." I'm not sure what those are.

Kip Boyle: I don't know either. But it reminds me of the Apple... What are those things called, those little disc trackers?

Jake Bernstein: AirTags.

Kip Boyle: AirTags. It reminds me of AirTags. I mean, in its most benign incarnation, what are we using those things for? Well, the first one I ever used for was for luggage tracking. I put it in check-

Jake Bernstein: I've got one on my dog. It works great.

Kip Boyle: Every freaking set of keys in my house has one on for the cars, primary and backup. My daughter has a violin that we rented, and guess what? There's a tracker in that violin case. So there's plenty of legitimate... I actually have one in my backpack that is associated with my wife's iPhone so that when I go on a trip, if something happens to me, she can figure out where I'm at. Where's Kip's body for insurance purposes. So I have lots of great reasons-

Jake Bernstein: That's morbid, Kip.

Kip Boyle: Okay, it is. But guess what? Crap happens, and I want to be thoughtful about the debris left behind. If something untimely should happen to me, what will my family do? But maybe I'm just hurt or in trouble. Anyway, there's lots of legitimate purposes for this, but how can they possibly know... This is where I'm going to push back on the regulator. What is a reasonable precaution? I mean, nobody asked me what I was going to do with these AirTags.

Jake Bernstein: Well, actually, it's not you the customer, necessarily, it's... Well, all the companies dealing in this space have realized the potential issues. You may have noticed how much the user experience with AirTags, for example, has changed.

Kip Boyle: Oh, I've noticed.

Jake Bernstein: Now you get these warnings that there might be someone who's tracking your location, like, "Hey, this tag seems to be following you around." So that's an example where the hardware and software maker did take reasonable steps. I mean, they had to figure it out.

Kip Boyle: Yeah, it was a post-sale, post-market introduction action that they took, which is fine, but I still am complaining about this because that one AirTag that I carry that's associated with my wife's phone, so that they'll know if I'm in trouble, that they'll know where I'm at. Oh my God, I-

Jake Bernstein: It tells you you're being tracked.

Kip Boyle: Constantly telling me that I'm getting stalked, and I'm like, I want this to stalk me. Is there a button in here that I can press that says, "Yes, it's okay."

Jake Bernstein: I actually think there is now.

Kip Boyle: There is now, but there hadn't been for quite some time. So anyway, talk about living on the bleeding edge of technology. Anyway, speaking of that, we're at 45 minutes, Jake.

Jake Bernstein: I know. I can't believe I said this would be a short episode. Well, look... Oh my God, I'm actually tempted to make it a two-parter, Kip.

Kip Boyle: Oh, you.

Jake Bernstein: What are you going to do?

Kip Boyle: No, no, no, listen, listen. This is what I'm going to do. I'm going to tell everybody that there are other sections in here of this report. For example, there's a children's privacy section with all kinds of really great examples of how the FTC has been focused on the privacy of children's data, which I like because I've got three kids at home, in fifth grade and third grade, and the amount of surveillance capitalism aimed at them is unbelievable. And then there's just a whole other section on data security, which we all love data security, and I would encourage you to get in there and read that stuff. You will recognize almost all of the names. There's credit reporting and financial privacy.

Jake Bernstein: Yeah, well, I can say that the data security thing only takes us to page 14 of 38. I mean, just to give people the idea here, as you said, there's a whole credit reporting financial privacy section that talks about all sorts of different cases.

Kip Boyle: Spam calls, spam emails.

Jake Bernstein: Spam calls and emails, and then toward the bottom of it, again, we're not going to talk about it, but it is interesting, is a whole section on what they're doing in the AI space. Not cases, but workshops and blog posts. Those are all interesting. And then at the very, very end, or I should say the very bottom of page 28, is policy statements and other actions. This is all the different things, like obviously policy statements, but also warning letters and a notice of penalty offense. I'm not even sure what that is, Kip.

Kip Boyle: Wow. Biometrics.

Jake Bernstein: Biometrics. Here's a policy statement on enforcement related to gig work. There's all kinds of stuff. This thing is just very, very rich, and I think we have accomplished our primary goal for this episode, which is to alert people to its existence.

Kip Boyle: And that you should open it up, use the find feature, search for your industry, search for the names of... Your name, search for the names of your competitors. Just get some keywords going here. You don't have to read this thing page by page, line by line, unless you like doing those kinds of things. I know somebody who likes doing those kinds of things. He's hosting a podcast with me right now. But if you're not that guy, just do the keyword search and figure out what's going on in here, and if any of it applies to you. I mean, how many times do the regulators tell you exactly what they're up to? Here is one of those times. Can I wrap up the episode?

Jake Bernstein: Let's wrap up the episode, Kip, thank you.

Kip Boyle: All right, that wraps up this episode of the Cyber Risk Management Podcast, and today we discussed the incredible fecundity... Did you really write that word? Did I even say that right?

Jake Bernstein: I didn't write that. No, I didn't write that word. I actually don't think I finished this script. That might be from the previous script.

Kip Boyle: No. Okay, well, anyway, what did we talk about today? It wasn't fecundity. It was the FTC's 2023 privacy and data security report.

Jake Bernstein: I guess the FTC has been fecund in terms of creating cases and case law, so it kind of works. Let's go with it.

Kip Boyle: It does work, but it was unexpected. All right, everybody, thanks for being here.

Jake Bernstein: Yep. We'll see you next time.

Kip Boyle: See you next time.

Speaker 1: Thanks for joining us today on the Cyber Risk Management Podcast. If you need to overcome a cyber security hurdle that's keeping you from growing your business profitably, then please visit us at Thanks for tuning in. See you next time.

YOUR HOST:

Kip Boyle
Cyber Risk Opportunities

Kip Boyle is a 20-year information security expert and is the founder and CEO of Cyber Risk Opportunities. He is a former Chief Information Security Officer for both technology and financial services companies and was a cyber-security consultant at Stanford Research Institute (SRI).


Jake Bernstein
K&L Gates LLP

Jake Bernstein is an attorney and Certified Information Systems Security Professional (CISSP) who practices extensively in cybersecurity and privacy as both a counselor and a litigator.