The {Closed} Session

Privacy & Data Security Law with Alysa Hutnik, Chief Privacy and Data Security Architect @ Ketch

Episode Summary

We are delighted to share our new episode of the {Closed} Session podcast with guest Alysa Hutnik. Alysa looms large in the privacy world, and she’s been thinking deeply about the intersections of data, technology and the law for nearly two decades. She’s also the Chief Privacy and Data Security Architect at Ketch, a super{set} company, as well as a lawyer. Hope you enjoy the episode!

Episode Notes

We are delighted to share our new episode of the {Closed} Session podcast with guest Alysa Hutnik. Alysa looms large in the privacy world, and she’s been thinking deeply about the intersections of data, technology and the law for nearly two decades. She’s also the Chief Privacy and Data Security Architect at Ketch, a super{set} company, as well as a lawyer.

Listen to the episode and read the transcript at superset.com

***

The {Closed} Session - Season 4, Episode 2

Guest: Alysa Hutnik, Chief Privacy and Data Security Architect at Ketch.

LinkedIn: https://www.linkedin.com/in/alysahutnik

Ketch: https://www.linkedin.com/company/ketchdigital/, TW: https://twitter.com/Ketch_Digital

Super{set} Twitter: @supersetstudio, @ClosedSeshPod

LinkedIn: https://www.linkedin.com/company/superset-studio/

Twitter: @tommychavez, @vsvaidya

Episode Transcription

Speaker 1:

Welcome to the Closed Session, How To Get Paid In Silicon Valley with your host, Tom Chavez and Vivek Vaidya.

Vivek Vaidya:

Hello and welcome to the Closed Session podcast. I'm your host Vivek Vaidya, and with me I have ...

Tom Chavez:

Tom Chavez in the house.

Vivek Vaidya:

He just made it. You just made it.

Tom Chavez:

I literally just ambled in here three minutes ago.

Vivek Vaidya:

Yeah.

Tom Chavez:

It's karma. It's meant to be.

Vivek Vaidya:

It is meant to be. So today, we have a special guest on our podcast. We have Alysa Hutnik, who chairs Kelley Drye's privacy and information security practice. And Tom and I have the privilege and honor of working with Alysa as the chief privacy and data security guru at Ketch. She delivers comprehensive expertise in all areas of privacy, data security, and advertising law. This includes compliance with federal and state laws throughout the US, including CCPA and CPRA compliance in California, and internationally, GDPR. That was bureaucratic, right?

Tom Chavez:

Very, very vicious.

Vivek Vaidya:

I did it though. I did it. I was told to do that, so I did it.

Tom Chavez:

Well, and I just saw Alysa last week in person at the IAPP conference, which is the, let me see if I can, it's the International Association of Privacy Professionals where Alysa looms large. There's a number of conferences and sessions and so on. And I saw Alysa at one of our breakouts and I said, I can go to all of these different events or I could just sit down for 10 minutes and talk to you and know everything that's going on in privacy. Bang. So we got 35 minutes of Alysa here.

If you're listening, everything you need to know about privacy, you got dummies like me and Vivek asking open-ended questions. Listen closely because Alysa looms large in privacy. Hi Alysa.

Alysa Hutnik:

Hi. That's a tall bill to live up to, but I'll do my best.

Tom Chavez:

We're your hype man.

Vivek Vaidya:

We are your hype man. Exactly. And Alysa is also on the board of the Ethical Tech Project, Tom.

Tom Chavez:

Ethical Tech Project. Let's do a quick mention. Ethical Tech Project is a convening of bright minds from legal and technology and business and regulations focused on the application of data judiciously and ethically to create cooler, more nourishing human experiences. We really mean it. It's that broad. It's that ambitious.

We understand that it's a collaborative thing. You can't have just techies, for example, tackling problems of ethical data use. You can't have just legal folks. So we've convened the Ethical Tech Project and Alysa sits on that board with us as well. We like to think of it as a think and do tank. So there's thinking and policy and advocacy, and then we also show up with our engineering hats on to suggest open standards and reference architectures for how actual system builders could embed privacy into their systems. So it's pretty cool to have Alysa with us on that.

Vivek Vaidya:

Awesome. And Alysa, you got into privacy really, really early, right? And now, you must be sitting back and saying, ah, the planets and stars finally started to align. I predicted it God knows way back when. So what drew you into privacy in the first place?

Alysa Hutnik:

It's been an interesting journey. As I think back to it now, it's been over 20 years or so. And I think originally I got into it, we're consumers and I think all of us really care about our own privacy and dignity and probably read too many books growing up and worried about Orwellian scenarios. But as a lawyer, I had just the greatest luck and started working with a group that did a lot of FTC, Federal Trade Commission investigations. And I just happened to be there at the right time where the agency, they'd always done advertising related work.

But they started pivoting to also including privacy and data security as part of what the agency was focusing on. This was early 2000s, so think really boring webpages that did not do a whole lot. And it's been a wild ride since then.

Tom Chavez:

You're such an OG. And to build on Vivek's question, I mean, first there weren't any of these model clauses in the EU at that time, which was the precursor to GDPR. It was entirely a state of mind, mostly in the US. You were operating and advising a number of ad tech companies as I recall, right? So, were there particular insertion points or things going on? We're trying to go back in a discussion like this, especially how the hell did we get here? What were the early seedlings in terms of market concern for issues relating to privacy in 2000?

Alysa Hutnik:

Yeah, sure. So, it always starts with what's the relationship with the consumer? You're a retailer, you're a technology company, have you, you've got a relationship with an end user and you're trying to market to them. You're trying to find more customers. And so, I'd be advising on marketing laws, advertising laws, that was part of my ambit. And a lot of that is based on permissions. Can you send an email to somebody? Can you call them? Can you text them? And there are rules around that.

And the rules started getting longer and longer. And the way that we were seeing governments enforce, or lawsuits being filed, at the end of the day, at the core it was just more this complicated web of how companies used information about their customers. And the way my mind works is I always try to make sense of the chaos, and consumer protection law, it's a lot of dots. And you've got to find where the patterns are in that, and there's just a lot of interesting patterns that kept building on each other.

I did some work in the health space, which had its own HIPAA privacy law. That was early days, so that was its own set. But there's a lot of themes that were in HIPAA that we would see and apply in other contexts. There was data security enforcement early 2000s. And there's a lot of themes there that you would similarly apply to data uses. And at the end of the day it's coming up with what makes sense as a program if I work with companies. And they don't want to focus on the law, they want to focus on the business.

So what is essentially rules of the road do's and don'ts that can allow them to use their data in a way that they're not likely to get in trouble with the law.

Tom Chavez:

Yeah. Well, so building on that, because Vivek and I run a startup studio that is focused on building data-driven companies, we've been working in data management for 20 plus years. So just like you're coming up and doing this thing that now is at the center of the planet, we also kind of feel similarly that, wow, things kind of really suddenly ... We did it because it was cool and we're geeks, but then suddenly everybody's talking about data AIML. So that's what we do with our jobs.

Now, data is either a geeky word or frequently a dirty word in this context. People here, especially in the context of privacy, it gets everybody's hackles up. And so, I'm wondering if you could talk for a minute about how, and maybe it's too grand a question, but let's take a stab, how do you make it less geeky? How do you make it more relatable and accessible and not so charged? And are you seeing any changes that way?

We just went to IAPP and all of these sessions and so on about AI and data. So everyone's doing the talkie talk, but it still feels kind of elusive and geeky and weird. What's your thoughts?

Alysa Hutnik:

Well, two things. One, when we're talking about data, you can lose the fact that there's a person involved. There's a human with feelings and behaviors who may not like certain things or may really like certain things. So I think if you humanize that, it's more than just flat data, that's really important, because it's really about relationships and trust. But I think the other part too is, in the US, privacy wasn't super regulated. That's changed.

But for many, many, many years it really was more of best practices. And so, I don't see too many companies, I don't see any companies that would say and really invite regulators to come in and check everything out. Because I don't think companies necessarily feel super proud that they have everything, the bells and whistles, on their privacy program because they didn't have to. I think they didn't see where that was a revenue generating opportunity, it was more in the compliance side.

So nothing you'd want to put a big spotlight on. And, well, I don't think we're there yet, that there's that invitation. But I think as companies have put the human first and really created something where they're proud of it, you see a lot of advertising claims out now on privacy-safe and privacy-supported and so on, because it is a draw. Now that claim also has to be supported. But I think that it's indicative of a trend where it's an area of investment, where businesses see the business benefit of it as opposed to just a legal compliance related obligation.

Vivek Vaidya:

Yeah. And so, that's interesting. So you're looking at it from the outside in the consumer's point of view and treating the consumer as an individual with feelings and expectations and all of that. And as companies try to adopt these practices and change in mindset, I'm sure you must have run into these challenges that companies face where they want to do the right thing, but for whatever reason, they have these internal issues, let's just call them those, that are getting in the way of them wanting to do the right thing. So what are some examples of those that you've seen?

Alysa Hutnik:

So, I think a lot of companies have just collected data over the years. Maybe they acquired other companies and inherited all sorts of systems that might not speak well to each other. So they don't even know what data they have. And bringing order to just that massive amount of data is a lot. It's difficult. It's easier to start from the ground up, but to apply new requirements that really require you to understand what data you have and why you have it and why you can use it, that is work and it is hard.

And so I think retrofitting privacy obligations to previously collected data is hard. I think relationships are really hard. You might have the best intentions of having a robust privacy program, but if your business partners are not on the same wavelength in terms of whether personal data needs to be opt-in or opt-out, the permissions, so that you can pass down the same level of protection. It's hard to jump in that pool alone. You really need others to be at the same level of what the expectations are.

And so it's fluid. I think the third aspect here is just knowledge. This has been moving so quickly. It is complicated. And I think we can get so lost in the weeds on what the rules are that I have to remind myself a lot that you've got to zoom out: what is this all about? You can focus on the particular details, but really good judgment, and trying to think of that consumer and what's reasonable, what they would expect, can go a long way.

Tom Chavez:

Man, I really appreciate that point about remembering there are people involved that there's somebody on the other side of the screen that's generating the data you care about and that person has motivations and affinities and beliefs and expectations. And before we deface and defile them with terms like consumer, I'm going to do this quick. I've always had a beef with the word consumer. It's such an impersonal word. You're like, we're all little faceless conscripts out there consuming on behalf of a company.

At our last company, we actually took to calling it, we wouldn't refer to it as consumer data, we called it people data. It's sort of an oxymoron like jumbo shrimp because data is the scientific engineering thing and a person is an actual person. I had a board member tell me, that's a terrible idea. Don't call it people data because nobody's going to know what it means. We're trying to change the conversation. What a difference a decade makes, right?

I'm glad to hear you say, Alysa, that companies now are starting to recognize that it's not just a chore, but it's a responsibility. They have to be good stewards of that people data, and they can't just keep on voraciously devouring as much of it as they can without concern or constraints. Or, separate but related, like big tech has gone hog wild with too much data without enough good reasons for holding all of that data.

Alysa Hutnik:

Right. And maybe just to tack onto that, I think the industry changes, with Apple's move to ATT opt-in and the future deprecation of third-party cookies by Chrome, have really motivated companies around their first-party data supply, how you get your customers. We don't have the all-you-can-eat buffet of online advertising in the way that it used to be. So you have to be so much more intentional in your customer acquisition. How do you retain those people and really build that trust, because that data is all the more valuable.

Tom Chavez:

Totally. Well, so I mentioned we were at this DC thing, and it's really just the second one that we've attended. Ketch is still a young company, so it's the second one we've been to. And man, what a difference a year makes. So I wanted to see if I could get your view, and maybe I'm seeing what I want to see, but I was excited, because last year felt a little cheery, honestly. This year felt like people are on the move, and there were much fewer defensive conversations tied to compliance. And at least several more conversations I noticed in and around our booth tied to opportunity. How do we go on offense? Sort of along the lines of what you're saying, Alysa, but what's your sense of the zeitgeist at IAPP this last year?

Alysa Hutnik:

I also felt the energy. I don't know if it's a matter of post pandemic or folks have gone through the different stages of grief. They've now accepted it and are willing to move on and build. But I think in some ways, it's human nature. When you know that the dynamic has shifted, now it's moving forward. How do you move forward? You have to. That resonated for me in terms of the conversations that I had with folks. I think AI, every conversation was AI in one respect or another and privacy considerations.

So I think there was a lot of energy about that and just curiosity. Amazingly, so many of these sessions had reached capacity. There was one that had the California regulators in there, and there was a line of, I would say, maybe 200 more people who wanted to get in before they cut it off. And so, when it comes to demand for content, demand for direction on privacy, that was pretty telling about both the enthusiasm and attention that the conference really had captured this year.

Vivek Vaidya:

Yeah. I wasn't there, but my offline kind of takeaway was that regulation was not a four letter word and people are actually talking about it in ways that could make businesses better. Especially to your point, as AI starts to take more and more front and center stage, the intersection of AI and privacy is open territory right now. So what's your view on where all this kind of goes? How should a new company or even existing companies think about the interplay between AI and privacy? What can they do with their products, for example, to take advantage of all of this?

Alysa Hutnik:

Yeah, sure. So, I'm going to go back to that good judgment: read the terms, read the fine print. So, for example, OpenAI's terms. They've got a lot of restrictions in there, and disclosures. And so, if you think about any platform, would your use comply with those terms, or be consistent with the way that ChatGPT, for example, is powered? There are privacy concerns in terms of the type of data that's ingested and what it's really based upon. And would you be contributing your customer data?

Is that consistent with what you've said in your privacy policy? So thinking about just the privacy implications. I think the other side to this is, really, how do you describe the AI tool and do that accurately when you're not really sure how it's powered or the emphasis of what data and the accuracy of the outputs. And so, just even the description of it could be a legal claim, false advertising type of claim that the FTC or just private litigants are very likely to scrutinize.

There's a ton of other legal issues, but I think with any shiny new technology object, really understanding what its capabilities are and what its limitations are. And so that you match your use for it to really account for those types of considerations.

Vivek Vaidya:

Yeah. And it's fascinating what you just said. It segues into one thing I want to talk to you about. And we mentioned it, Tom mentioned it at the beginning with the Ethical Tech Project. The very nature of what you just described. In order for you to do that successfully, it's an interdisciplinary kind of approach. It can't just be the remit of lawyers or legal people, product people have to be involved and engineers have to be involved, IT people have to be involved, et cetera. So, what do you see? In companies, are these cross-functional task forces being assembled to tackle the privacy problems, so to speak?

Alysa Hutnik:

Well, when I see that they are assembled, that's where you see, really, just more competent, successful programs. If you have legal running this, then it's going to be compliance and somewhat defensive risk management. And that's not how most businesses are run. So I think if there's a good translation layer, which comes from all the different stakeholders trusting each other and really trying to problem-solve as opposed to speaking in silos, you just get better outputs from that.

But that's investing in relationships too. And I think that's a culture thing for the company, and I think it really contributes to the success of a whole range of things.

Tom Chavez:

So now I'm going to get a little sporty, but I remember early on in my career when I was talking to lawyers, and I remember just exiting those conversations depleted because I just got 45 minutes of all the reasons you're going to die, all the reasons you can't. And I mean, the legal training creates a lot of that dynamic for people. And I'm newer to the privacy law elements here, but I do remember, Alysa, early on, before we were fortunate to meet you, some of these conversations were just punishing, because you have officious, very zealous privacy lawyers kind of just throwing the gauntlet at you.

Again, all the reasons why you can't and all the things you can't do, and no problem solving, to your point. I don't want to turn this into a therapy session, but it feels like if we're going to succeed here, the privacy lawyers need to get a lot more interested in solving problems and partnering tightly with techies like us and business people to try to thread this needle.

As you've educated us, there's a lot of murk still in these. There's a lot of squish in the way these things can be interpreted and put into action. Again, I want to see what I wanted to see, but I felt like this last year at IAPP, it was more practical the way people like us show up. Okay, what are we going to do now? Are you seeing people in your, and you're training people at Kelley Drye, I presume, to take that posture and get in there and don't just throw the book at them, figure out ways to solve these problems. Is that a broader sea change, do you think? Are we seeing that happen now?

Alysa Hutnik:

I'm hopeful. I mean, it's back to the culture of the lawyer and the law firm in some ways. I think, for us, it's client service standards. And so, that's all about how do you support the business objective? It's not, the law doesn't exist for the sake of the law. We're trying to support a company and its mission. And I think you probably do have a lot of lawyers just like you have any other professionals that may have limitations. So I think in some ways, finding your right privacy support and that could depend on the company and really what its needs are.

But what I was excited about at IAPP: yes, you have lawyers, and you also have all these non-lawyer privacy professionals. So that is pretty exciting in terms of the support that companies can get. That, I think, also could be a good translation layer, and an economical one, so you're not paying lawyer-type rates for all of the implementation work that needs to happen.

Tom Chavez:

Right. Hey listen, can I interject with a totally unpaid for promotion?

Vivek Vaidya:

Oh sure.

Tom Chavez:

Yeah.

Vivek Vaidya:

You have one?

Tom Chavez:

I do have one.

Vivek Vaidya:

Okay.

Tom Chavez:

And you don't even know what it is. Sometimes if you're listening to these podcasts, we've got a little something up our sleeves. This time, I'm just throwing it out there.

Vivek Vaidya:

Drop it.

Tom Chavez:

So you mentioned, I was just barely on time, and I've got to stop and give a big shout out to a Luxor cab driver named Scott. So when I land at the airport, a lot of people like to do their Ubers. I love to do my old school, just walk out and get a cab ...

Vivek Vaidya:

Oh wow. Look at that.

Tom Chavez:

I know. I'm like, I'm a man of ...

Vivek Vaidya:

You're old school.

Tom Chavez:

I'm old school. So I go out to the taxi line and I hop into this cab. It was a delightful ride and a nice little walk back down because this guy is a veteran, born and raised in San Francisco. I tell him where I live, it's a quirky street. He knows exactly ...

Vivek Vaidya:

How to get there.

Tom Chavez:

And by the way, yes, it's like three secret routes. Because you live in San, when you live in San Francisco, it's a quirky city. So you have all these weird little routes that only the veterans know of and the amateurs take obvious streets. So we have a delightful conversation about how to get to my place. And in that context, he takes me back to old stories of San Francisco, which was again lovely. Also, I found this interesting. I asked him, how long have you been driving? Oh, I've been driving a cab for 40 years. Never had another job. Those are the last of the Mohicans. I mean, there's like five of those guys in San Francisco.

But talking about how he had basically divorced friends of his who were cab drivers and then went to what he calls the dark side. He won't even say the word Uber.

Vivek Vaidya:

Wow.

Tom Chavez:

Yeah, he won't even say Uber. But you know what?

Vivek Vaidya:

Did you get his consent though, Tom, before using his name though?

Tom Chavez:

I did not, but there are a lot of Scotts.

Vivek Vaidya:

See, Alysa, what do you think about that?

Tom Chavez:

I didn't say his last name. And it's his ...

Vivek Vaidya:

Luxor cab. Scott. There's a lot of information being disclosed over there, right?

Tom Chavez:

You got me.

Alysa Hutnik:

I love that you're considering that. I feel like, with Scott, we're probably still in a somewhat anonymous state there. It wouldn't give you too much heartburn.

Tom Chavez:

When you got a name like Scott or Tom, you could be anybody. Vivek, in contrast, that's more ...

Vivek Vaidya:

I don't know about that.

Tom Chavez:

That's a more discriminatory power.

Vivek Vaidya:

I don't know about that.

Tom Chavez:

We pretty much know who you are. Anyway, shout out to Scott. Totally unpaid for promotion. If you're in San Francisco and you see a Luxor cab and a nice older gentleman is driving, he's probably Scott, give him a hug for me.

Vivek Vaidya:

There we go. Scott, I'll be on the lookout for you.

Tom Chavez:

There you go.

Vivek Vaidya:

Should we get back to privacy?

Tom Chavez:

Okay, let's do it.

Vivek Vaidya:

All right. All right. All right. So, just building on what we were talking about earlier, before we heard this magnificent story about Scott, what do you think, Alysa, is working or not working when it comes to data privacy, especially when it's applied? So the practice of data privacy in the enterprise or the industry out there.

Alysa Hutnik:

So I think where we're used to legal support is, if you're lucky, you have somebody who specializes in certain areas and they work in the company. And privacy would be in their domain and they're responsible for it. I think that is fine when there's not a whole lot of obligations and it's really more ad hoc advice here and there. The problem is that we now have a whole slew of requirements that go to the fabric of a company. It's not a legal issue. It's how you design your systems. How marketing is even thinking about their strategy, their digital strategy, their offline strategy. How they're setting up clean rooms, which is like the big shiny new thing.

All of that raises privacy issues, the risk assessment issues. And you can't have one or two people be responsible for that for the company. I mean, it's a leadership issue. It's a stakeholder issue. And it has to be, you know the phrase privacy by design, but really it's like everybody needs to be aware and at least trained within their role on what they should be issue-spotting for and incorporating. So until we're there, I think we're still working uphill.

Vivek Vaidya:

We're optimists, right? So, optimistically speaking, what are some things that you're excited about as you climb this hill? What are some milestones along the way where you're like, oh, wow, that was cool. That's exciting and that's hopeful.

Alysa Hutnik:

Well, I am an optimist, and as you know, I totally nerd out over privacy. So, I get really excited about all of this. And I'm excited for the reasons Tom said that people are talking about it and people are talking about it who are not lawyers. It's in the consciousness. We are seeing more leadership really speak about it to their companies within the company and be open to change that actually accounts for privacy.

So, I do see a tide turning. And I think just even the innovation, and you don't need to use so much personal information for certain business functions, and that has been a hard concept to really swallow for a lot of companies. But it is a mindset and it's cool to see the different kinds of innovation, like clean rooms, like privacy enhancing technologies. I just see a lot of promise in where we're going there. Even in the advertising context, contextual advertising has gotten a lot better than it used to be.

Tom Chavez:

Well, let's go deeper on a particular point here. So California has a privacy regulation called CCPA and then CPRA is coming out. Now, I wanted to ask you to educate our listeners for a minute as to what, and maybe a little case study on CPRA, what does it call for? What are the main elements of that? We can geek out, but give them the main thrust. But along those lines now, see, everybody looks at California. When I grew up in Albuquerque, New Mexico, my dad would playfully refer to California as the land of fruit and nuts, because all those crazy Californians, they're always coming up with nutty stuff and who knows.

But it's not just California, it's Virginia, it's Colorado, it's Connecticut, it's Utah. So, let's also make sure people understand it's not just crazy Californians doing the privacy thing. Other states are doing it. So, the first part is, give us a little tutorial on CPRA. But then, go forward a little bit and tell us what we should make of this patchwork of state regulations. At Ketch, we refer to it as whack-a-mole, right? We see companies who are frustrated trying to figure out how to navigate this complex array of different state regulations and they're not the same, right? So what's ahead in that regard? But first, CPRA.

Alysa Hutnik:

Yeah, so CCPA, the version 1.0, was the first version, and it was enforced, so that made it real really quickly for companies. It obligated companies to make specific disclosures about their data practices, which also had to be accurate. So it forced companies, really, to look at what they are doing and get pretty granular in terms of the kinds of third parties they're sharing personal information with, and their advertising practices, and the different uses.

And I think that was a fairly intensive obligation for a number of companies, and just getting those notices right, not very sexy, but that was one of the obligations. The other one: contract terms. You had to have magic-language privacy terms in your contracts with partners, with service providers, and have a good sense of who actually was a service provider that you shared personal information with. And I think many companies just assumed, look, we've got contract terms. They're our partner, they're our vendor.

And California made a really big distinction on is it actually just using the personal data, your customer data for you? Or was it powering other business commercial endeavors by that partner? And if that's the case, then there are certain rights for that consumer on whether they want their data to be shared in that way or not and the choice to choose not to have that. And so that was a big change that companies did not have to do before.

The other thing: they had to go through their websites and really understand what tags they were using on their websites and what tags were firing, advertising and analytics tags, and have some process around that. So when a consumer said, I don't want that, the consumer could actually opt out of that, and have the right notices explain those rights. And then, the rights. So I was talking about the opt-out, but consumers also could ask a company, give me a copy of all the data you have about me. Or, I want you to delete all the data that you have about me, and have that done within a pretty reasonably short timeframe.
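
The consumer rights Alysa walks through here, opt-out of sharing, access, and deletion, map to a fairly simple request-handling flow. A minimal hypothetical sketch follows; every name in it is illustrative rather than taken from any real compliance system, and the 45-day window reflects CCPA's standard response deadline (extendable once):

```python
# Hypothetical sketch of handling the CCPA consumer rights described above:
# opt-out of sale/sharing, access ("give me a copy"), and deletion.
# All identifiers here are illustrative, not from any real system.
from dataclasses import dataclass
from datetime import date, timedelta

CCPA_RESPONSE_DAYS = 45  # CCPA standard response window (extendable once)

@dataclass
class RightsRequest:
    consumer_id: str
    kind: str          # "opt_out" | "access" | "delete"
    received: date

    def due_date(self) -> date:
        # Deadline by which the business must respond.
        return self.received + timedelta(days=CCPA_RESPONSE_DAYS)

def handle(request: RightsRequest, store: dict) -> str:
    """Dispatch a rights request against a toy in-memory data store."""
    if request.kind == "opt_out":
        # Record the opt-out so downstream sharing with third parties stops.
        store.setdefault("opt_outs", set()).add(request.consumer_id)
        return "stopped sharing with third parties"
    if request.kind == "access":
        # Return a copy of everything held about this consumer.
        return str(store.get(request.consumer_id, {}))
    if request.kind == "delete":
        # Remove the consumer's records entirely.
        store.pop(request.consumer_id, None)
        return "deleted"
    raise ValueError(f"unknown request kind: {request.kind}")
```

In a real program the store would be every system that holds personal data, which is exactly why Alysa stresses knowing what data you have and where it lives.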

Tom Chavez:

Yeah. Well, on that point about service providers, Vivek and I have had big feelings about that for well over a decade. The first product we built at the last company, called Krux, was a thing called Data Sentry. And this is before, well before CPRA, before we got the fine print of a privacy regulation. From our perspective, way back when, you saw all of that third-party script on people's websites, and we were trying to, with modest success, I would say, we had some companies who felt strongly about it: no, no, no, no, that's your data.

In fact, we had stickers on our laptops: That's your data. It's not their data. Why are these other weirdos coming in and extracting data under the dark of night? And by the way, is that any good for the consumer on the other side of the screen? Did they even know? Of course there was no opt-in; it was, again, in the dark of night. So it was crazy town in 2010. I'm glad that there are actually regulations now that attend to that problem. It's still a biggie.

Alysa Hutnik:

There are. They're just not the same.

Tom Chavez:

Yeah.

Alysa Hutnik:

You mentioned the other states besides California. So we have six states. And depending on the type of personal information, if it's sensitive, for example, it might need an opt-in instead of an opt-out. And so I think for a lot of companies, it's just really thinking through the data they have, classifying it, and having the right set of permissions. And do they want to do something different for California because it's the most specific on certain things? Or do they want to do it for the whole United States, even if some of the states, many of the states, don't yet have specific laws?

To the whack-a-mole point, it's just really having a strategy so that you're not constantly telling your engineers and your marketing team one thing and then going back to them a month or two later and saying, actually, there's a new law and now we have to do something slightly different. That's a hard program to roll out. So it's really anticipating around the corner and having something that's more durable, in my view.

Vivek Vaidya:

So let me just ask a somewhat controversial question, I guess. Do you think we'll see a federal regulation on privacy in our lifetime?

Alysa Hutnik:

In our lifetime?

Vivek Vaidya:

Yeah.

Alysa Hutnik:

That was a very good caveat. So I had mentioned earlier that I am an optimist, and I am. I am not optimistic, though, that we will see a comprehensive federal privacy law. I think we might see an update on children's data or teens' data. It's just been really difficult to get bipartisan compromise on this. There are strong feelings on whether there should be only one federal privacy law, and states should not be allowed to have their own laws.

And I think that is really just a gating issue. And particularly when you have laws like California's that not only are out, but have regulations and have been enforced, and companies have designed programs around them, to suddenly shift course seems less and less likely the more time that goes by.

Vivek Vaidya:

So the whack-a-mole is going to continue.

Tom Chavez:

On this podcast, we don't shill for our companies, but it's just nutty and irresponsible not to mention that this is why Ketch exists, right? Because we've seen these problems. There is this whack-a-mole, this patchwork of state regulations, and so many of the privacy promises that have been made were just kind of a state of mind. They weren't actually being enforced and enacted. We looked at that and said, okay, that's a beautiful place to put the machines to work. You can't hire enough privacy program managers and lawyers. You can't throw enough bodies at the problem to manage it that way.

So it's a beautiful opportunity for software. And that's the essence of Ketch. I mean, there are many nooks and crannies and lots of complexity, but the essence of it is taking privacy programs run with lawyers and program managers and making them programmatic, making them software-enabled, so you can go to sleep. If the privacy demons go bump in the night, the software's doing what it's supposed to do.

Vivek Vaidya:

Yeah. So ...

Tom Chavez:

That was an unpaid-for promotion as well, I guess ...

Vivek Vaidya:

For whom though? For Ketch?

Tom Chavez:

For Ketch.

Vivek Vaidya:

There we go. Okay. Yeah, yeah. Good, well done.

Tom Chavez:

At least I'm owning it. At least I'm owning it.

Vivek Vaidya:

At least you're owning it. That's right. That's right. So as we close this out, Alysa, just a great conversation. What advice do you have for our listeners who are kind of grappling with these issues? If there are software companies like Ketch or large companies who are trying to deal with these privacy regulations, what advice do you have?

Alysa Hutnik:

Have a data strategy, really. Think about why data is important to you and your future. And I think you should really build your data strategy and figure out how privacy supports that data strategy, because clarity of mission, I just think, is so critical. And if you're suffering death by a thousand cuts on privacy compliance along the way, it just cuts against the ability of a business to move forward. And this is happening, right? It's not changing.

And so, the more sophisticated you are and the more you really understand, back to the good judgment: how can I have a durable strategy for my data, and where does privacy fit into that? And if you're not getting practical advice from the lawyers you're talking to, to solve the problems, then talk to some other lawyers. Because if you have those relevant stakeholders at the table and really think big picture before you dive into the weeds, you'll just be so much more efficient, and you'll have clarity of mission so that the privacy compliance really goes along and helps support that.

Vivek Vaidya:

No, I think that's spot on. I wish we had done that at Krux way back when, right? Because we actually did have some of these issues, and it was hard for us to retrofit everything, retrofit our technology to comply with GDPR and things like that. We were able to do it. It was painful. But yeah, I sure wish we had thought about a lot of these issues way back in 2010. That would've simplified our lives quite a lot, actually.

Tom Chavez:

You're speaking our language, Alysa. And how interesting is it to have a privacy lawyer on this podcast who is advocating for a data strategy: figure out the data strategy, and then attend to the privacy considerations. But to your point, Vivek, and we've seen this firsthand, you can't staple on these solutions later. You've got to get it right at design time.

Well, Alysa, what a pleasure having you with us. And it's so cool to get to work with you in all these different contexts and to introduce you to our listeners today on this important topic of privacy.

Alysa Hutnik:

Well, thank you so much for having me. I've been an active listener, so it's super exciting to be on the podcast. Thank you.

Vivek Vaidya:

Well, thank you, Alysa. And yeah, thank you listeners, and we'll see you soon.

Tom Chavez:

Thanks everybody.