Let's Converge Podcast

Ep. 12: Why It Pays to Take Data Privacy Seriously

Jun 12, 2023 | 20 min 32 sec

ISACA’s Safia Kazi discusses the wave of fines, headlines, and reputational damage coming to brands that don’t prioritize the privacy of their customers.

Summary

Consumers are more aware of—and concerned about—data privacy than ever before. And they’re starting to ask questions: Why is this brand tracking my internet usage, or selling my personal data, or mapping my face without my permission? In this episode, we explore a new ISACA survey of nearly 2,000 privacy and cybersecurity professionals to find out who’s prioritizing data privacy, who’s not, and how enterprises can practice better “privacy by design.”

HOST: Doug Thompson, director of technical solutions engineering and chief education architect, Tanium
GUEST: Safia Kazi, privacy professional practice adviser, ISACA

Show notes

Check out ISACA’s survey, plus these articles in Focal Point, Tanium’s new online cyber news magazine, and our exclusive webinar on how the NIST Privacy Framework can help public and private organizations protect sensitive data.

Transcript

The following interview has been edited for clarity.

Safia Kazi: I live in the state of Illinois, which has a very strong biometric privacy act. And the other day I got a check from Facebook for something they did—I don’t know what, but I’ll take it. I think what this means though is that the average person who doesn’t care at all about privacy is now starting to care.

Doug Thompson: $650 million. That’s how much Facebook’s parent company, Meta, agreed to pay last year to settle claims that it illegally collected biometric data from the people using its website. And just this winter, the Illinois Supreme Court ruled that fast-food giant White Castle would have to face similar claims for allegedly scanning workers’ fingerprints without their consent. The burger chain said the ruling may cost them more than $17 billion. That’s a lot of sliders.

Hi, I’m Doug Thompson, and today on Let’s Converge, we’re talking data privacy and the public, painful consequences of not taking it seriously. Strong data privacy laws have been in place in Europe for several years, and now various U.S. states are debating similar legislation. And enterprises are scrambling to keep up.

Joining us today is Safia Kazi, a privacy professional practice adviser at ISACA, the Information Systems Audit and Control Association. Safia is here to discuss a new ISACA survey called Privacy in Practice 2023, which polled nearly 2,000 privacy and security professionals to find out who’s prioritizing data privacy, who’s not, and how enterprises can practice better privacy by design. How’re you doing today, Safia?

Kazi: I’m great. Thanks for having me.

Thompson: You know, we talk a lot about security breaches on this podcast, but what a lot of people don’t realize is what results from a breach: a loss of privacy. And that’s where you come in and what your organization is all about.

Kazi: Yeah, absolutely. So you’re right. A lot of people think about security, but they may not necessarily think about the privacy implications of a security breach. I sometimes have people ask me, what’s the difference between security and privacy? Aren’t they synonymous? To which I say, you absolutely cannot have privacy without security, but you could have fantastic security measures in place and do a really bad job of privacy. So is privacy a subset of security? Kind of, but not really. The two, of course, go hand in hand. And I’m sure as we talk about our survey findings today, you’ll see that privacy professionals and security professionals work really closely to best protect the privacy of the people handing over their valuable information.

Thompson: Well, let’s start with that. I mean, every month there’s a new headline because some company is paying fines. There are big dollars associated with this loss of privacy.

Kazi: It’s interesting. Here in the United States, we have a very strange patchwork of privacy laws where, depending on the state you live in, you may have more protections than somebody living just a few hours away from you. This of course also makes it really hard for organizations, which are left chasing a moving target of what compliance is and what they need to do. Just recently, Iowa passed a privacy law.

I think overall boards are starting to understand that privacy is something they need to think about just because we’ve all seen the negative headlines from organizations that don’t respect privacy and the massive fines they have to pay. And that’s saying nothing of the reputational damage that affects their bottom line. So ISACA did a state-of-privacy survey and in this we found that, compared to last year, boards seem to be stepping it up. It looks like funding and resources for privacy are getting a lot better. We’re seeing larger privacy staff sizes. People who work in privacy are less likely to say they feel that they’re understaffed. They’re less likely to say that their privacy budget is underfunded. And overall, it seems like more people believe that their boards are adequately prioritizing privacy compared to just last year.

Thompson: With some of these smaller companies, are they more challenged than some of these larger, well-resourced companies?

Kazi: I think so. A lot of times they don’t have a dedicated privacy team. They have a security person who is then tasked with also doing privacy in addition to all of their security work. And I’m sure as our listeners know, security teams are already feeling the crunch of staffing shortages and just having to do a lot with very few resources. So when you throw privacy tasks on top of that, that makes it even more complex. So I would absolutely say that smaller organizations are definitely struggling with this.

Thompson: Now, when you talk about privacy by design—and I used to work for Microsoft, where we talked about security by design when we were building the operating system, and it really changed the game—but when you’re talking about privacy, it’s more than the technical controls that go along with it. So explain the design piece of that. How does that look?

Kazi: Right. So privacy by design is thinking about the user. That’s one of the key things. It’s of course the systems that you’re building, but it’s also the users who are going to be using what you’re building. There are seven key tenets of privacy by design, and the first one is proactive, not reactive; preventative, not remedial. Basically, privacy’s not a bolt-on. You’re thinking about it before you even start building things. It’s also about privacy being the default setting. People shouldn’t need to do anything to have their settings be ones that protect privacy. It should be embedded into the design. This is where we get more of that technical piece of privacy. It’s also about full functionality—positive sum, not zero sum. We shouldn’t need to trade off security for privacy or usability for privacy. Realistically, we can have privacy and achieve other objectives.

Kazi: It’s also about end-to-end security and full life cycle protection. This is so crucial when we’re thinking about the data that we have. How long are we holding onto it? How are we protecting it? Even when we’re done using it, do we need to hold onto it? It’s about visibility and transparency, so communicating clearly and ensuring that everybody in the organization, and the people giving their data, know what’s being done with it. And then the final principle (and one of my favorites) is the idea of respecting user privacy. Keep it user-centric. If you think about the user, you’re going to be developing systems that are easy to navigate from a privacy perspective. And if you are breached, which unfortunately is probably going to happen at some point, you can really limit the damage, because the default is that you’re protecting privacy and you’re not collecting excessive amounts of information. That can really help the data subject who is compromised.

Thompson: So, in reading the survey, I found that at least 60% of enterprises are starting to practice this privacy by design at least frequently, you know. Maybe not always, but compared to other statistics, it seems like we’re heading in the right direction. But what about those folks in the “rarely” or “never” category, what’s their barrier? Is it a barrier to entry, or is it just, you know, are they hiding under a rock, or what?

Kazi: I’m sure there are some people who are hiding under a rock, but I honestly think a lot of it is just a lack of resources. I think a lot of people know that they should be doing privacy by design as often as possible, but they just can’t.

Thompson: In security and business continuity, you know, you have a backup or a disaster recovery plan, but if it’s not tested, it’s sort of worthless. So in the case of privacy, is there a similar type of thing, where we throw in a scenario, maybe do a tabletop drill or something? Hey, we’ve had this leak of data. Is there some way to do that?

Kazi: Potentially. I have heard some people say that their organization tries to do privacy tabletop exercises. I personally think there’s a bigger challenge. We asked our survey respondents, have you experienced a material privacy breach? And almost 20% said, we don’t know. That, to me, shows you don’t know your data. I think it means that they know, OK, we had a security breach, but what data was it? Was it personal information? What was taken? What was viewed? So I think as far as understanding how to respond in a breach, the first thing you can do is understand your data so that if a system is compromised, you know what data is compromised and what the impact might be to your data subjects.

Thompson: And that goes back to the design piece of it. If you know what silos and what areas your data’s in, and what type of data is there, then it’s a little bit easier to triangulate, OK, what did we lose from that silo? Are there organizations you can hire to come in and help you, like a consultancy or something?

Kazi: Absolutely. A lot of our survey respondents actually said they’re relying on outside consultants or contractors to help them with privacy-related tasks. I think this can be helpful. Something else that organizations are doing is looking internally to see if they have people who are maybe experts on certain technologies or certain systems but aren’t experts in privacy, and whether they can get on-the-job training. That can be a really great way to not have to worry about posting a job description and potentially spending six months trying to find the right person for a role. But yes, there are absolutely consulting agencies that can help people with their privacy-related tasks.

Thompson: So you’re sort of reminding me: I was looking in the survey, and you talk about who’s accountable for privacy, and it was of course the chief privacy officer. What I didn’t see is that it’s sort of like security. That’s really everybody’s job, isn’t it?

Kazi: I would totally agree. I mean, some of it is gonna depend on the way your organization is structured, who reports to whom. But I absolutely agree. I think privacy’s kind of everybody’s job. You know, we were talking about how some smaller organizations may struggle, so what can they do? Provide privacy awareness training to everybody in the organization. It may seem small, but just once a year, if you tell people, let’s verify who’s calling if they’re trying to get personal information… Something that small can actually go a long way in protecting your data subjects’ privacy.

Thompson: What surprised you in the survey, if anything?

Kazi: The one thing that really surprised me was that more people didn’t say they planned to use AI for privacy-related tasks. So 11% of our respondents said they did, and 20% said they plan to in the next 12 months, which, interestingly enough, is exactly what 20% said last year. I don’t know if it’s the same people. But 38% said they had no plans to use AI for this function. That surprised me, because maybe when people think AI, they think of the cheesy stock images you get when you search for AI, like a robot walking around. But are you really not using AI in your systems to detect personal information or sensitive information like birth dates? Especially considering the staffing challenges some organizations are facing and the increased number of data subject access requests, I was really surprised that more people didn’t say they’re going to rely on AI or are already relying on it.

Thompson: Well, I know there’s the Skynet dystopian-type thing, and we don’t want that here, and on the other hand, like you said, there are the little robots. So let’s go the other way: What was encouraging about what you found?

Kazi: I think the one thing that was just really encouraging was seeing that people working in privacy didn’t feel quite as overwhelmed as they did this time last year. And we saw that funding is better, staffing is better; all of that seems to be a little bit better. The one thing I would like to see some improvement in is, you know, we were talking about privacy-awareness training. We asked, how do you evaluate how good your awareness training is? And less than a quarter of our respondents said they do pre- and post-testing. Most people just treat it as a check-the-box exercise, [logging] how many people did it. I don’t think that’s a great metric. Is it a good training? Did anybody learn anything?

People are also looking at the number of complaints they receive about privacy and the number of privacy incidents that happen, which to me is far too reactive at that point. The damage has been done; you’ve waited until something bad happens to improve your privacy-awareness training. So I would say that’s one area where I’d like to see us maturing a little bit, just being a little more proactive in how we’re looking at the effectiveness of privacy-awareness training.

Thompson: Yeah, finding the rubric with which to measure. That’s always somewhat art and science, and it varies per organization or vertical. Somebody at a state and local government is gonna have a little bit different thing than somebody at a commercial place. Say I’m at a smaller organization and privacy is something I want to sort of go take over—I know we’re not doing a good job of it, but I need to get some air cover. What are some good resources, or things I could do as a leader, to sort of take that mantle and charge that hill?

Kazi: I mentioned a good privacy-awareness training, but I think it goes a step further: Instead of just saying personal information should be protected, maybe make it a little bit more real. We can see the real impact on people who’ve been harmed, potentially by being stalked or just by having too much information out there. This can make the training a little bit more interesting, but it also helps people see why privacy is so important. It’s not just a privacy notice on a webpage; it’s actually about data that represents humans and their lives and their families. And I think when we start to view privacy from that perspective, even people who don’t work in privacy can start to see that it’s something they should care about. Because for some people it actually might mean the difference between life and death.

As dramatic as that may sound, with all of the information that’s out there about anybody, available from a very quick Google search or maybe even a quick ChatGPT search, it’s incredibly important to understand there’s a lot of information out there about us. And so as organizations, if we can act in ways that limit the harm that can come to data subjects, we’re going to have a significant competitive advantage. Especially for an organization that’s struggling to find its foothold or dealing with a lot of competition, if you are one that protects privacy and really prioritizes your users, that’s ultimately going to be a huge competitive advantage.

Thompson: That’s a buying criterion for my wife and me. We look at which services we’ll use, and there are some services we don’t use… because they haven’t been taking good care of data in the past.

Kazi: I live in the state of Illinois, which has a very strong biometric privacy act, and just the other day I got a check from Facebook for something they did—I don’t know what, but I’ll take it. I think what this means, though, is that the average person who doesn’t care at all about privacy is now starting to care. I have friends who’ve said, I got a check from Facebook, what was that about? And although they may be OK with getting the check, they’re starting to think about privacy and understand, well, wait, what does it mean if a social media site has a mapping of my face and didn’t ask me about it, or didn’t tell me about it? So I think the average person is going to start caring more and more.

Thompson: You brought up a point I hadn’t thought of before, from a generational perspective. There are those of us who grew up before the internet, or who built it along the way and put it together, and we have a different perspective on privacy than my children or grandchildren do. How does that impact what regulations are getting put in place? Is this something that’s down the road? Are we gonna have a bigger problem with it, or what?

Kazi: Potentially. I feel like, in general, children just aren’t protected as adequately as they should be. There’s nothing stopping an 8-year-old from saying they’re over the age of 18 and creating a social media account, right? There really isn’t verification. I think there’s also a broader question, because now there’s a generation that’s grown up with their entire life, potentially pre-birth, online—like, right from the moment their parents found out they were expecting, they posted on Facebook, and then when they were born, and all of their major milestones. And I honestly don’t know what the implications of that are. Just recently there was, I believe, a kid from a YouTube family who wanted to seek damages because their life as a child was broadcast online. That said, there aren’t really laws or regulations in place, to the best of my knowledge, that protect people whose lives are put online in this way when they are children and when their parents are the ones making these decisions for them. So that’s something that I think should probably change, because we don’t even know what the impact of that is going to be in just a few years.

Thompson: Well, and talking about change and getting back to the original question, I think these are some of the things that are probably coming, maybe not immediately or as fast as we would like. But what are some of the things you see that may be bigger hurdles [coming] sooner?

Kazi: I think one other thing that we’re starting to see more of is cracking down on privacy dark patterns. Those are things that trick you into acting in a way you otherwise wouldn’t—like those confusing cookie notices where [you have to] accept, or you can click to learn more, and then you have to click a thousand times to say don’t track me. That’s an example of a dark pattern. If you make it easier to click Accept than Decline, enough people will just click Accept. A lot of privacy laws and regulations don’t specifically talk about this. However, we saw last year that a few states’ attorneys general sued Google over dark patterns in Google Maps and not being completely transparent about what tracking was taking place.

While I think a lot of privacy laws and regulations have done a good job of trying to get people’s consent, some organizations have turned to tricking people into giving it. And I think we’re probably going to see more of an emphasis on privacy dark patterns and on ensuring that when people give their consent for tracking, they know what they’re signing up for and organizations are communicating clearly.

Thompson: I’m glad you put a name to something that’s been a bane of my existence for a while, with all the clicking, just sort of like the end-user license agreements that nobody can read because they’re 8,000 pages long, so we just click through and we don’t know. We need to simplify these things and make them very clear. Like, I would go to them and say, explain it to me like I’m a fifth grader; let me understand what this is, right? I would be all for that. Let’s help make that stuff change, because there are just some things that I don’t think they need. But I’m of the generation that still had a passbook for my savings account, you know.

Kazi: Well, you know, there was a bill last year—I don’t know what happened with it—but there was a bill to basically create nutrition-label–style privacy information. Like, here’s what we’re tracking, just a really quick, easy-to-understand thing. I think that could have been great. Because, you’re right, to actually sit and read the terms of service of everything, we’d all have to take two weeks off of work. I mean, it’s just not realistic.

Thompson: All right. Well, thanks for sharing your experience and a little more insight into this survey. I think it’s great to get a better understanding of what’s going on, and I appreciate your time and I appreciate what you and your organization are doing.

Kazi: Thanks so much for having me on.

Thompson: I’ve been talking with Safia Kazi, a privacy professional practice adviser at ISACA.

If you’d like to learn more about data privacy, check out Focal Point, Tanium’s online cyber news magazine—we’ve got links to several articles in the show notes—or visit tanium.com. To hear more conversations with today’s top business leaders and security experts, make sure to subscribe to Let’s Converge on your favorite podcast app. That really helps with visibility. And if you like this episode, please give us a five-star rating.

Thanks for listening. We look forward to sharing more cyber insights on the next episode of Let’s Converge.

Hosts & Guests

Doug Thompson

Doug Thompson is Tanium’s Chief Education Architect. A conference speaker, podcast host, and storyteller, he architects solutions that keep our schools’ sensitive data secure.

Safia Kazi

Safia Kazi, CIPT, is the privacy professional practices principal at ISACA. In this role, Kazi focuses on the development of ISACA’s privacy-related resources, including books, white papers, and review manuals. In 2021, she was a recipient of the AM&P Network’s Emerging Leader award, which recognizes innovative association publishing professionals under the age of 35.