Shining a Light on Dark Patterns and Personal Data Collection

ISACA adviser Safia Kazi on the ways companies trick you into sharing personal data, the need to protect that data, and why we need more women in tech.


If you haven’t heard of “dark patterns,” just google “Google.” In January, four state attorneys general sued the tech giant, claiming it used the practice to mislead consumers by tracking their locations even after they had turned off that function.

“Dark patterns” are deceptive design choices that companies deploy in apps and on websites to manipulate users into choosing options contrary or detrimental to their intent.

As data-privacy regulations spread (often in a messy sprawl), many organizations are trying their best to track, monitor, and secure the sensitive data they collect and store. Unfortunately, says Safia Kazi, privacy professional practices principal at ISACA, the international IT governance association, organizations often fail.

Kazi recently spoke with Endpoint about dark patterns and their cousin, “dark data,” as well as what companies should consider when collecting information and, in advance of International Women’s Day on March 8, how data privacy could benefit from more women entering the field.

(The following interview has been condensed and edited for clarity.)

Let’s start with the basics. How and why do organizations use “dark patterns” and “dark data”?

Dark patterns are ways that enterprises trick you into giving consent for the use of your information. One of the things I sometimes see is: I’ll go to a website. They’ll have that cookie notice, and the “accept all” button is a bright blue color and the “decline” button is grayed out. You think you can’t click on it. But if you hover over it, you actually can. Yes, I’ve given them consent for the cookies, but it’s only because I thought I didn’t have a choice. It’s sneaky, and it’s frustrating, too.

Other times, a website makes you give your email address just to be able to look at the products they’re selling. Well, they don’t need that information. They just want it so that they can send you promotional email. It’s wrong to get people’s information when they aren’t aware that they have a choice and they don’t actually have to provide it. I’m hoping we see more of a crackdown on these dark patterns.

Personal data collection is a big concern these days because companies are storing a huge amount of it. Why is that a problem and how might that sensitive data be compromised?

Let’s say I submit a résumé for a job. The company says, “We’ll keep your résumé on file.” Well, that résumé has my name, address, phone number, email address, work history, and maybe job references. Then, because cloud storage is so cheap, companies keep information they’ll never use, and this “dark data” piles up. Now all that sensitive data about me could end up in the wrong hands and do me a lot of harm. Someone could sell that data, or even tell my current boss, “Hey, she applied for [another] job. You probably shouldn’t promote her.”

That’s why it’s so important for organizations to identify the sensitive data they have and protect it. Or ask themselves: Why are we hanging onto this? Should we be? Do we have good controls in place to keep it secure and private? If you don’t know what you have, you can’t protect it. You don’t understand the sensitivity of it.

How can organizations inventory, monitor, and secure sensitive data?

First, they have to know what data they’re collecting. What is human resources collecting? What is marketing collecting? What’s accounting collecting? They have to ask: What do we have? Why do we have it? Where is it stored? How often do our data-privacy and human resources professionals actually sit down and talk about what data they’re collecting? Or ask: “Is this data we still need?”
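The inventory questions Kazi lists can be captured in a simple structure. A minimal sketch in Python; the field names and entries here are hypothetical illustrations, not a standard schema:

```python
# Each record answers the questions above: what do we have, why do we
# have it, and where is it stored? Field names are illustrative only.
inventory = [
    {
        "department": "human resources",
        "data": "job applications (name, address, work history)",
        "purpose": "hiring decisions",
        "location": "cloud HR system",
        "still_needed": False,
    },
    {
        "department": "marketing",
        "data": "newsletter email addresses",
        "purpose": "promotional mailings",
        "location": "email platform",
        "still_needed": True,
    },
]

# Flag entries the privacy team should review for possible deletion.
review_queue = [entry for entry in inventory if not entry["still_needed"]]
```

Even a flat list like this gives the privacy and HR teams a shared artifact to sit down and talk over.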

The other thing I recommend is that organizations have an understanding of how information they collect could potentially be used to cause harm. For certain data, maybe you do need to have it on hand for a certain amount of time. You develop policies around how long you keep certain information. If you say, “Every two years, we will purge job applications we didn’t pursue,” having a policy like that in place makes it much more manageable moving forward.

That way you’re aware of the data you have and you have the policy in place, and that can be really helpful in combating dark data. But it’s a hard task. I would anticipate that most organizations don’t really have great policies. That said, with the rapidly changing privacy, legal, and regulatory landscape, that might be changing.
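The retention policy Kazi describes (“every two years, we will purge job applications we didn’t pursue”) is straightforward to automate. A minimal Python sketch; the record fields and two-year window are assumptions for illustration, not a real system:

```python
from datetime import datetime, timedelta

# Hypothetical two-year retention window for applications not pursued.
RETENTION = timedelta(days=2 * 365)

def purge_stale_applications(applications, now=None):
    """Return only the applications still within the retention window.

    Each application is assumed to be a dict with a 'received' datetime
    and a 'pursued' flag -- illustrative fields, not a real schema.
    """
    now = now or datetime.now()
    return [
        app for app in applications
        if app["pursued"] or (now - app["received"]) <= RETENTION
    ]

applications = [
    {"name": "A", "received": datetime(2019, 1, 15), "pursued": False},
    {"name": "B", "received": datetime(2021, 6, 1), "pursued": False},
    {"name": "C", "received": datetime(2018, 3, 2), "pursued": True},
]
kept = purge_stale_applications(applications, now=datetime(2022, 3, 1))
# Application A falls outside the two-year window and is purged.
```

The point is less the code than the policy behind it: once the window is written down, the purge can run on a schedule instead of relying on someone remembering.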

Do we need legislation to persuade companies to develop strong data-privacy policies, or will bad publicity from data breaches spur them into action?

It depends on the organization. You know, a company like Facebook could easily absorb a large fine or even some bad press, whereas for a smaller organization, those could be much more harmful. Having the legal element in place can be helpful because it does push organizations toward the right thing.

Right now, we have states taking the data-privacy initiative because we don’t have anything at the federal level. So California is saying, we want to protect the data of our residents, and it’s doing that with the California Consumer Privacy Act. Other states have followed. That said, I can only imagine how hard the job of a privacy professional is going to be if there are 50 different state-level regulations to comply with. Realistically, it would be far better to have something at the federal level, but I don’t know when that would happen.

How has the pandemic affected data privacy and the way companies handle data security?

People are working remotely now, so we’re giving them laptops. Or they’re using their own laptops. If they’re using their own laptops, what kind of protection do we have around the enterprise’s data? What if I had a roommate who worked for a rival of ISACA and I just left some files sitting around? Or what if I left my laptop unlocked while I had lunch?

[Read also: 3 ways to help employees combat pandemic stress]

On the flip side, I’m now exposing much more of my personal space. If we had this conversation in 2019, I would be in the office and I wouldn’t care what’s around me. Whereas, now, anybody I’m video-chatting with is going to be in my home.

Let’s talk about women in tech. You’ve said we need to bring more women into the data-privacy and security fields, which could have an outsize impact on the industry. Can you explain that a bit?

Stolen or misused personal data disproportionately affects women. Consider that nearly one in six women has experienced stalking, compared with one in 17 men. And stalking is increasingly starting or taking place online and through digital records.

Women have an opportunity to help build technology that’s much friendlier to women, to anybody who might be discriminated against or harmed.

What are the challenges for women interested in the data-privacy and cybersecurity fields?

The tech culture is a huge issue. You see this sometimes on Reddit forums, where a woman will say something and then all these people will go in and dox her, and that’s directly a privacy issue.

We still have this perception that people who work in cyber are, you know, wearing hoodies and drinking Gatorade. And that’s not really the reality of it. You know, cyber is for anyone. Privacy is for anyone.

[Read also: International Women’s Day 2022—breaking the bias in tech]

We need to change that hoodie-and-Gatorade perception to make the field more inclusive and welcoming to women. We also need to address the wage gap, which is one way to help keep women in these careers. We have a culture where we don’t talk about money, but that’s what allows pay inequality to persist. With COVID, we saw a lot of women leaving the workforce because their spouse earned more.

What are your concerns for the future of sensitive data, privacy, and security?

I’m very curious about what the implications are for kids who are growing up with social media and sharing every aspect of their life online. What are the privacy implications of that? What’s it going to be like when you are looking for a job in your 20s, or in your 30s, but the cringey memories from eighth grade are immortalized on the web?

People tend to overshare online and through apps, and they don’t necessarily see the ethical impact of that. I have a lot of friends who are having kids now and just sharing far too much about their children. I mean, Facebook existed when I was in high school, but it was very different than it is now.

Bruce Rule

Bruce Rule is a veteran editor, reporter and public-speaking coach with more than 30 years of experience. He worked for more than 19 years as a business editor for Bloomberg, where he covered a wide range of topics of interest to Wall Street, including technology, company events, market news, regulations and policymaking.
