You Can Find Love (And Privacy) In a Subaru

You gotta remember, your car is a computer. And it's a computer that's less secure than a Waffle House.

Photo by Jakob Rosen / Unsplash

In America, few relationships are more intimate than the one we have with our cars. They're loud, bad for the environment, and people can use them to kill others, whether intentionally or accidentally.

And yet we just can't get enough of these two-ton space hogs.

Unlike most of the rest of the world, we also don't have a great nationwide mass-transit network. You have Amtrak, which I personally love, and then you've got a few bus companies. That's about it.

So, depending on where you live in America, if you don't have a car, you're kind of screwed.

For that reason, we're going to revisit the intersection between cars and privacy throughout this series. Our original car episode, which aired back in 2022, was re-aired as part of this series as Episode 4, which you can listen to here.

This week, in Episode 11, we have the brand-new 2025 updated edition.

This is also the next-to-last "All New Episode" until 2026. I wanted the chance to preview the new style and format for the show, and the response has been really good. So after next week we'll go back to the original series — with updates and new interviews, so don't snooze on them! — and now you know what to expect when we come back with over twenty brand-new episodes in February/March.

I'm feeling really good about what's coming your way next year. We're going to increase this newsletter to twice a week (Tuesday and Friday), we've got new episodes coming, the free book — all you gotta do is become a subscriber to this newsletter to get the .mp3 and .pdf — and maybe even some video.

So I hope you'll stick around and tell your friends. We're just getting started.

— BJ

Show Notes

Stupid Sexy Privacy Show Notes For Season 1, Episode 11

Guest: Lauren Hendry Parsons, Director of Communications at the Mozilla Foundation.

Episode Summary: Lauren Hendry Parsons, from the Mozilla Foundation, explains how cars have become data‑hungry surveillance devices that undermine user control. Our host, Rosie Tran, offers some solutions to this problem, but you gotta remember: your car is a computer. And it's a computer that's less secure than a Waffle House.

This Week's Privacy Tip

  • Avoid connecting your primary smartphone to your car and use a secondary, low‑info phone (or USB drive) for books, music, and podcasts to prevent manufacturers from harvesting your personal data.
  • If you can, buy older, pre‑2015 used vehicles or privacy‑forward brands (Subaru applies the California Consumer Privacy Act to all U.S. customers) so you can opt out of data sharing and invoke the right to be forgotten.
  • Data from connected cars enables profiling and real‑world harms (insurance price hikes, security risks), and protections can be costly. But you're not a Privacy Karen: there are legitimate reasons to be concerned here, and this data collection in your car is costing you money.

Highlights From Our Interview With Lauren Hendry Parsons, Director of Communications at the Mozilla Foundation

  • Cars have become pervasive surveillance devices—collecting location, voice, contacts, and behavioral signals—that affect both drivers and passengers and enable profiling, insurance/employment impacts, and manipulation.
  • Mozilla’s 2023 review of 25 automakers found widespread over‑collection, poor transparency, and weak security; European brands (e.g., Renault, Dacia) performed relatively better, while Nissan, Kia, and Tesla ranked among the worst offenders. (No surprise there with Tesla, which is why you should join the Tesla Takedown.)
  • The key problems facing drivers are coerced/forced consent (buying or updating a car effectively opts users into future data abuses) and lack of meaningful data agency—no easy opt‑outs, deletions, or visibility into who sees data.
  • Options for drivers to fight back include buying privacy‑forward brands, buying older/non‑connected vehicles, avoiding Bluetooth/app connections, performing periodic “digital hygiene,” and remembering to occasionally check the car app settings.
  • How do we fix this situation? Support systemic change by backing regulation like the New York State Privacy Act — still trapped in committee — and community projects like Mozilla's Common Voice that create open datasets and push industry standards toward people‑centered, transparent technology.

Our Sponsor: DuckDuckGo (Recommended Browser and VPN)

Recommended Products

Get Your Privacy Notebook: Get your Leuchtturm1917 notebook here.

- Bitwarden.com (Password Manager: easier to use, costs money)

- KeePassXC (Password Manager: free, harder to use, but more secure)

- Slnt Privacy Stickers for Phones and Laptops

- Mic-Lock Microphone Blockers

- Mic-Lock Camera Finder Pro

- BitDefender (Anti-Virus)

- Stop using SMS and WhatsApp, start using Signal.

- Use Element instead of Slack for group coordination

Get In Touch: You can contact us here

Want the full transcript for this week's episode? Easy. All you gotta do is sign up for our free newsletter. If you do, you'll also get a .mp3 and .pdf of our new book, "How to Protect Yourself From Fascists & Weirdos," as soon as it's ready.

Episode #11: You Can Find Love (and Privacy) in a Subaru

-DuckDuckGo Ad #1

Announcer: Here's three reasons why you should switch from Chrome to the free DuckDuckGo browser.

One: It's designed for data protection, not data collection.

If you use Google Search or Chrome, your personal info is probably exposed. Your searches, email, location, even financial or medical data, the list goes on and on. The free DuckDuckGo browser helps you protect your personal info from hackers, scammers, and data-hungry companies.

Two: The built-in search engine is like Google, but it never tracks your searches.

And it has ad tracker and cookie blocking protection. Search and browse with ease, with fewer annoying ads and pop-ups.

Three: The DuckDuckGo browser is free.

We make money from privacy-respecting ads, not by exploiting your data. Download the free DuckDuckGo browser today and see for yourself why it has thousands of five-star reviews. Visit DuckDuckGo.com or wherever you get your apps.

Stupid Sexy Privacy Intro

Rosie: Welcome to another edition of Stupid Sexy Privacy. 

Andrew: A podcast miniseries sponsored by our friends at DuckDuckGo. 

Rosie: I’m your host, Rosie Tran. 

You may have seen me on ChimeTV’s A Brand New Yay!

Or on Season 2 of Peacock’s Comedy InvAsian.

Andrew: And I’m your co-producer, Andrew VanVoorhis. With us, as always, is Bonzo the Snow Monkey.

Bonzo: Monkey sound!

Rosie: I’m pretty sure that’s not what a Japanese Macaque sounds like.

Andrew: Oh it’s not. Not even close.

Rosie: Let’s hope there aren’t any zoologists listening.

Bonzo: Monkey sound again!

Rosie: Ok. I’m ALSO pretty sure that’s not what a Snow Monkey sounds like.

*Clears her throat*

Rosie: Over the course of this miniseries, we’re going to offer you short, actionable tips to protect your data, your privacy, and yourself from fascists and weirdos.

These tips were sourced by our fearless leader — he really hates when we call him that — BJ Mendelson. 

Episodes 1 through 24 were written a couple of years ago. 

But since a lot of that advice is still relevant, we thought it would be worth sharing again for those who missed it.

Andrew: And if you have heard these episodes before, you should know we’ve gone back and updated a bunch of them.

Even adding some brand new interviews and privacy tips along the way.

Rosie: That’s right. So before we get into today’s episode, make sure you visit StupidSexyPrivacy.com and subscribe to our newsletter.

Andrew: This way you can get updates on the show, and be the first to know when new episodes are released in 2026.

Rosie: And if you sign up for the newsletter, you’ll also get a free .pdf and .mp3 copy of BJ and Amanda King’s new book, “How to Protect Yourself From Fascists & Weirdos.” All you have to do is visit StupidSexyPrivacy.com

Andrew: StupidSexyPrivacy.com

Rosie: That’s what I just said. StupidSexyPrivacy.com.

Andrew: I know, but repetition is the key to success. You know what else is?

Rosie: What?

Bonzo: You'll have to listen to hear this. ;-)

Rosie: I’m really glad this show isn’t on YouTube, because they’d pull it down like, immediately.

Andrew: I know. Google sucks.

Rosie: And on that note, let’s get to today’s privacy tip!

This Week's Privacy Tip

Rosie: This week, we enlisted the help of Lauren Hendry Parsons. Lauren is the Director of Communications at the Mozilla Foundation. 

You see, back in Episode 4, we aired our original episode on privacy as it relates to your car. 

Better known as that thing in your driveway that goes vroom vroom.

Our advice from that segment, which first aired in 2022, still holds. 

Whenever possible — if you want to listen to your audiobooks, music, or podcasts — get an inexpensive secondary phone with a headphone jack. 

You can then use an auxiliary cable to connect your second phone to your car.

This will keep your information out of the hands of the car manufacturers and data brokers.

Alternatively, you can also use a USB Flash Drive with your audio files stored on it.

But basically, our advice is this: 

#1. Don’t connect your smartphone to your car. 

#2. Don’t put the app that comes with your car on your smartphone. 

Simple enough, right?

But, let’s say you have to use the car’s app  …

In that instance, you have another reason to get that second phone. 

The trick is making sure the second phone never leaves your vehicle, and doesn’t have any of your personal information on it.

In future episodes, we’ll show you how to do this.

But. There’s a bigger problem here; and that’s why we wanted to talk to our friends at the Mozilla Foundation.

The problem is this: Your car isn’t a car anymore. It’s a computer. 

And it’s not even a good computer. It’s a bad one with less security than a Waffle House.

So, not only did we let the tech bros ruin the joy, freedom, and independence that comes from owning a vehicle; when you buy one, you’ve also signed up to give all your data away to the manufacturer: Where you go. How often you brake. And how bad your singing voice is.

SFX: Monkey Sound!

Rosie: In America, depending on what state you live in, there’s very little you can do about this.

Now you’re in a situation where, let’s say you own a car that’s spying on you, which is pretty much all of them since 2015. 

That data can easily fall into the wrong hands. The kind that can find your garage code within that data, and enter your house. 

Unfortunately, none of the solutions to this specific problem are going to be cheap.

We recommend purchasing a used vehicle made before 2015, for example.

But. If that’s not something you’re interested in, then your best bet is to buy a new Subaru.

They are usually the top rated vehicles by Consumer Reports. 

Why Subaru?

Because they do something the other car manufacturers don’t. 

As of this recording in November of 2025, Subaru is the only manufacturer that applies the California Consumer Privacy Act to all American citizens. 

That means, regardless of whether or not you live in California, you can opt out of Subaru's data collection and distribution through their online privacy portal. 

On that page, you'll also see each part of the CCPA that you can cite in your requests. Two important ones being: the right to opt out of any data sharing, and the right to be forgotten.

The right to opt out will immediately save you money, because what's happening is that the manufacturers are selling your information to car insurance companies, who then adjust your premiums based on the data they're getting.

The right to be forgotten means Subaru has to delete almost all of your data, and tell the other people they’ve shared that data with to do the same. 

For that reason, if you’re going to buy a new car, we recommend you purchase one from Subaru.

And yes, you should always buy a car over leasing it, unless you have an accountant that tells you otherwise.

Now let’s get to our interview with Lauren, who’s going to share even more tips on protecting yourself from fascists and weirdos, when it comes to your car.

Interview With Lauren Hendry Parsons, Director of Communications at Mozilla

(Note that the following transcript has been lightly edited for brevity and clarity.)

BJ Mendelson, co-producer of Stupid Sexy Privacy: All right. So I usually have everyone start out by introducing themselves and just telling us a little bit about who they are and who they're with. So I'll have you begin there.

Lauren Hendry Parsons, Director of Communications at the Mozilla Foundation: My name is Lauren Hendry Parsons and I'm the Director of Communications at Mozilla Foundation. My background is in digital rights advocacy, and the reason why I'm with the Mozilla Foundation is because we're trying to build a better technology future which is powered by people and open by design and fueled by imagination.

I think the thing that really sets Mozilla Foundation apart is that we come at everything with this defined optimism that things can be better, but it takes all of us coming together, believing that things can be better, dreaming up the alternatives to the status quo that's out there right now, and then going ahead and building it. The future of technology can be good if we build it together and we need to defy the defaults and step up together and make sure that technology serves people, not corporate interests or other systems.

BJ Mendelson: I've been thinking a lot about Star Trek lately — Just because I'm catching up on Stranger Worlds — and they have a very sophisticated AI that they use when they have questions and problems and we can have that world, right? Like you don't have to worry that some company ... You can have an information exchange with a computer system is what I'm saying, without that fear of where's this information going, right?

They've solved that problem in that show and we could solve that problem here.

Lauren Hendry Parsons: I think Star Trek is such a great example of the way that you can dream towards a different society and a different way of doing things, not just in the technology and approach to AI, but in so many different facets of that show was imagining different ways that society could be. From money to identity to the ways that you engage with the world.

And I think one of the things that I really love about Star Trek, it's funny you pressed on this one, is the fact that they don't work for money, they work for purpose. And at one point I remember somebody who was new to the culture saying, why does anybody work then? And they said, well because we want to, we want to contribute, we want to be connected to community, we want to be part of something that has meaning and a higher purpose.

And I think so much of that is about being human, what we bring to the table, what we're motivated by. And one of the things that we're saying at Mozilla Foundation is that the way that technology is being introduced, the way that AI is being deployed, the way that data is being handled, these things are not centering humanity, connection, purpose, people. And that's when you get into extractive practices or into situations where technology is not serving the people who are using it.

BJ: For people who aren't familiar with Mozilla, they might associate the name with Firefox and Thunderbird. So is that typically the entry point for people to the Mozilla Foundation?

Lauren: It absolutely is. For over two decades, Mozilla has been synonymous with fighting for an open web, with being proudly independent and nonprofit, and of course, with being a great, open-source browser in Firefox. Today, the Mozilla Foundation is looking to inspire and mobilize people around the world to build a better technology future that serves and is shaped by the people who use it.

We want to build the digital world that people hope for and deserve by fueling technology that's open, trustworthy, and prioritizes the public good. So at the foundation, we do things like advocacy, education, funding, and innovation. And really it's about helping people first imagine and then build alternatives. Because if you can't dream it up, if you can't even begin to imagine that things could be different, how can we build it?

BJ: I love that answer because we... By the time our listeners get to this interview, we've already done two episodes on creativity and storytelling.

Lauren: That's great. I think all of our work is really grounded in integrity and imagination, the belief that the better is not only possible, it's ours to create. And I think that if we can reposition ourselves within our own mental frameworks, not as consumers, but as co-creators, that immediately changes the dynamic in any system or with any technology that we're engaging with.

BJ: I think that's a good segue to talk about what's going on with the vehicles because these days your car is basically a computer.

Lauren: You're absolutely right. A car started as a mechanical technology where you're alone in a bubble, it epitomized freedom. You got on the open roads, you drove, you sang, you screamed, you had conversations, you explored, and there was a sense of like untethering from the world.

And I think that's so big in the stories that we tell ourselves and in the tradition of the road trip, and the idea of what the car can be, and the way that it turns up in pop culture. And I think as technology has been integrated throughout different parts of the automotive industry, the reality of that shift hasn't been fully understood or clocked from this bubble of freedom and independence to a place where you're effectively surveilled and you've got data collected at every point.

BJ: Yeah, you know, it's fascinating to me. So I'm a history nerd. And so I've been doing a deep dive into the history of Los Angeles, which is our home city. And for us, you know, it's a city that was built for the car. It was built, for better or worse, I mean, there's definitely some downsides to that too, but it was built with this idea of adventure. You know, you're driving along the coast, come on out here, it's accessible, you can get anywhere with your car. And now it feels like getting into the car is a chore. And it's almost as if you're working for the tech companies, with all the data that you're generating for them.

Lauren: So interesting when I think about how the car started and where it is now, just to take a little tangential walk with you about this. It started as something where you were free and the roads were open and as cities clustered, as mega highways happened, you had this sense that being in the car was a chore already in terms of engaging with the external world. But at least the bubble inside the car was your moment. You might be stuck in traffic, but you can sing along, you can have your quiet time, and even that's being eroded.

It's an interesting parallel to think about, the loss of joy, of the act of driving the car through certain landscapes, as well as the loss of joy or freedom while you're inside the car.

BJ: Right. I know ... It'll sound like a cognitive leap for people who are just coming to listen to this episode, but something we've talked about on previous episodes is the loss of joy, the loss of imagination is what fuels fascism.

And the whole undercurrent of our show is how to fight fascists and weirdos. And so that's why we thought it was so important. Even though we already did a car episode, we aired our original car episode about a month ago.

And we said, we probably have to go back to this, for that reason, and for all of the things that are going on.

So let me ask you little bit more specifically. So I know when we first put the car episode together back in 2023, we relied on a lot of the information that Mozilla had put out, considering the privacy policies.

But for people who maybe missed that episode or just are coming into this new. Do I have it right that the Mozilla Foundation went through all the privacy policies for the different car manufacturers?

Lauren: That's right. We went and looked at 25 different brands. We read their privacy policies. We went through how they handle data. We looked into all of the legal hoops that you need to jump through when you buy the car. Of the 25 brands we reviewed in 2023 when we did the study, every single automotive brand collected more personal data than was necessary, and none of them met our minimum security standards.

And it's really interesting because unlike apps or devices, where opt-outs are increasingly standard, car companies are defaulting to over-collection. And what made cars uniquely terrible was the scale and sensitivity of the data being collected. There were deeply personal details combined with fake policies and nearly no transparency or controls for users.

BJ: Was there any one that stood out as, I mean, they're all bad, right? In this context, but I'm wondering, was there any car manufacturer where you're like, this is the least worst of the bunch?

Lauren: The least worst ... were probably the cars that came out of Europe because they were being made and operated within a framework where already there's more of a mind towards being privacy preserving. So we saw Renault and Dacia, which had clearer privacy statements and more limited data collection claims. And that was likely due to the stricter data laws in Europe, like the GDPR. So those are the two brands where when we did the study, you were able to write in and request that your data be deleted.

So you had the right to get that back. But the worst offenders were companies like Nissan and Kia and Tesla. We flagged them all for either explicitly invasive policies or egregious data sharing and across the board, just really poor transparency. All of these privacy policies are dozens and dozens of pages long. Some of them as long as a small book.

And the reality is, when you're at that point of buying a car, you're likely not going to pause and read through these reams and reams of paperwork. You're there, you want the car, you're not going to spend time reading something that's over 35,000 words long. I mean, that's more than the US Constitution.

BJ: Yeah. And you know, it's funny because I went through ... Before we went to pre-production to bring the show back, I went back to all the privacy books, about 30 of them. And everyone consistently says in each book that these terms of service and privacy policies are illegible. But they're designed in such a way that, unless someone like the Mozilla Foundation sits down and goes through it, it's a safe bet that no one is looking at it. It's like they're just not aware.

Lauren: Even if you do take the time to read them, the policies are often vague and overly broad or just unreadable. And if you dig into it, many policies were giving the companies permission to sell or share the data and they don't say clearly with whom or why or under what circumstances. And I think overall with these policies, even if you go in and look at your single car policy, there's a question of like, what can I do? And I think the fact that we took the time to look at the entire industry and see that this wasn't just one-off issues where it's between the consumer and the car company, but instead this is an industry-wide issue. It really highlights a gap and a need for regulation in order to make sure that people are protected. And if you're buying a car, you're not going to go and read 25 different privacy policies. Maybe you'll get through part of your own, but this is really where researchers need to step in and do the hard yards, or do the long hours, of reading just to pick up where those trends were; and to make it clear that this is a pervasive industry wide issue.

BJ: Yeah. And so I think one of the things we'll get to at the end of the interview is we don't think federal policy is coming in the United States. Not just because of who's currently in charge, but because the tech companies spend a lot of money lobbying. For example, I talk a lot about how California has the California Consumer Privacy Act, which has come up a bunch in the show. And then usually we get the question of, all right, well, what about big states like New York?

New York has no comprehensive privacy law on the books specifically because the tech companies don't want it. And this is a Democrat trifecta. They run the assembly, the Senate, and the governor's mansion. So it's not just a Republican and Democrat thing. It's just the tech companies here have a lot of influence.

Let me ask you ... What some of those politicians will hide behind is this. I said to my dad, your car was built before 2015, but you have Starlink, which means Elon Musk, everyone's least favorite billionaire, is collecting all sorts of information on you. And his response was, "All my information is out there, why should I care?"

We hear this a lot. It's something that we spend a lot of time addressing. The other one is people saying they don't want to be a privacy Karen. That's a new one that we've heard. They don't want to be a Karen for complaining about privacy ...

Lauren: ... There's nothing like trying to shame people in order to get them to stop speaking up about something that actually matters. Isn't that fascinating?

BJ: Yeah. That one kind of threw me for a loop, right? Because usually what I've heard is what my dad is saying, which is all my stuff is out there, why should I care? But I don't want to be a privacy Karen was definitely a new one.

What is typically your response when people raise that kind of answer to you?

Lauren: It's a totally valid feeling. A lot of people are overwhelmed. There's so many different vectors, so many different devices, so many different pop-ups in your day demanding access to your info and maybe a sense of like, I haven't done anything wrong. I've got nothing to hide, but look, here's the thing. It's never too late to draw the line.

And this isn't just about ads following you. It's about insurance companies raising your rates, companies profiling your behavior, or even future employers making decisions based on your data. That's real-world impact. And a really interesting thing with cars is that the risk is really unique. You're not going online. You're just driving to work or driving to the shops, and your car is logging your movements. Maybe your mood, your contacts. That level of surveillance used to be unthinkable. And I think people get into the trap of having black and white thinking, all or nothing. Caring now doesn't have to mean deleting everything. It can mean pushing for protections or making certain choices, choosing tools and systems that put you in control. And I think that it doesn't have to be overwhelming.

And the one other thing I'd say is, beyond the car, it's not just about where you go, it's about who you are. The more that the car companies, and the data brokers they sell the information to, learn about you, the more companies can infer and predict and ultimately manipulate. And the kind of data they're collecting, which is your location, your voice, your contacts, it's intimate. It paints a picture of your routines, your relationships, your habits in a way that's really hard to imagine until it's too late. You know, they know when you go shopping, they know who you're meeting up with. They know what your habits and choices are. And so the real risk isn't just the collection, it's the unknown. You don't know who sees your data, you don't know how long they keep it, you don't know at what point they might infer something or when a landscape might change, and suddenly the information that was inconsequential before becomes consequential.

Most recently, really, there's been a lot of conversation about insurance rates. So that's one way. You don't know if it's being used to set your insurance rate or target your ads or flag you for something you never agreed to. And on top of that, many of these systems are either leaky through lack of engineering and care or potentially leaky by design. They're built with weak security, vague rules, no clear way to opt out. And once it's out there, you can't pull it back. So you don't need to be black and white, but it is worth caring about.

If we do stand together and say, this is something that we care about, that matters, that we see how it impacts us, our community, we hope that the makers, the regulators will begin to listen. And we've seen in Nebraska, for example, that there is action being taken. Even if there won't necessarily be federal laws, there can be people who want to stand up and say, this doesn't sit right with us.

Ad Break For Our Cool New Book

BJ Mendelson: Hello Everyone, this is BJ Mendelson, and I am the writer and co-producer of Stupid Sexy Privacy.

When I’m not working on the show, I’m usually yelling at my television because of the New York Mets.

I want to take a moment to tell you about a book I co-authored with Amanda King.

It’s called “How to Protect Yourself From Fascists & Weirdos,” and the title tells you everything you need to know about what’s inside.

Thanks to our friends at DuckDuckGo, Amanda and I are releasing this book, for free, in early 2026.

If you want a DRM free .pdf copy? You can have one.

If you want a DRM free .mp3 of the audiobook? You can have that too.

All you need to do is visit StupidSexyPrivacy.com and subscribe to our newsletter.

That website again is StupidSexyPrivacy.com, and we’ll send you both the PDF and the MP3, as soon as they’re ready.

Now, I gotta get out of here before Bonzo shows up. 

He’s been trying to sell me tickets to see the White Sox play the Rockies.

And I don’t have time to explain to him how Interleague Baseball is a sin against God.

I’ve got a book to finish.

Interview Resumed


BJ: I think a lot about China. I've been doing a lot of reading about how the American tech companies have essentially powered the authoritarian state that's over there. And a lot of the people that live there will tell you ... just knowing that someone can access your data changes your behavior, right? Like there's an observer effect that happens. And so I don't think we ... I think we're starting as Americans at least to understand a bit more. But we're not quite where people in China might be where they understand now the system that they live in. And even if you say, I don't care. All my data is out there. You start to realize, wait, this data can be used to shape your behavior. And just by feeling like someone is watching you fundamentally changes and shapes your behavior.

Lauren: And I think this also points to something else, which is forced consent and data agency. If you're being surveilled in this way or your data has been collected or even could be collected and used in future, even if it's perhaps not actively being collected, it takes away your choice about how your data is used. There's a sense of, you're told you agreed to this, even though you had no real choice. You bought the car. You've agreed to this. You've connected your phone at some point to the radio. You agreed to this. You probably clicked accept to drive the car or use the app. That's not consent. That's coercion. And so when we think about not just privacy, but data agency, the idea that you should be in control of what people are collecting about you, the information that they have, how they handle it, whether or not they delete it or keep it, who they sell it onto or who they pass it onto. Data agency is the opposite of that covert collecting. It's about giving people meaningful control over what data is collected and why and by who. And it means that you can say no. It means you can change your mind. It means you can see what's being collected about you. And in these situations where it's unclear, where you're not actively opting in, where things can change or be controlled from a central system, because you've got a computer hardware update in your car, you're not able to actually make a choice. Buying the car equals opting into whatever changes they make in the future, whatever decisions are made.

And I think it goes further. For those of you out there who are thinking, but I'm a passenger princess, I'm not a driver, this doesn't matter to me. I only take Ubers or I only get driven by friends. Passengers are caught in this space as well. This isn't just about drivers. It's about anybody in the car where they're picking up fragments of voice or where you've been, or if your phone automatically connects to the Bluetooth or has ever connected to the Bluetooth. And I think this is a real issue because a driver is also having to make decisions willingly or unwillingly for anybody who then gets in the car.

BJ: Yeah. It's very easy to feel like it only affects the driver, not realizing, hey, wait a second, some cars are creating basically a digital clone of everything on your phone. So it's not just the driver. It's anyone the driver's been in contact with, including someone else that might be in the vehicle.

So we tend to approach things in terms of tiers, right? So for us, like, tier one, for example, is getting people off of Chrome and into DuckDuckGo. Just so that they can experience, you don't have to use Google.

And then for most people in most cases, we're like, okay, DuckDuckGo's got the VPN, it's got data removal, so off you go, right? Like that's sort of taken care of.

But then we still tell people, okay, but there's more tiers, right? There's tier two, which is Firefox. And tier two is using Firefox with ProtonVPN so that you've got this additional layer of security, depending on what you might do for a living. So for me as a journalist ... I have DuckDuckGo on the phone, but my primary browser is Firefox. We've talked about this before where I've got like a very ... tricked out Firefox setting for maximum security.

It's true too with the cars, right? Like there's gonna be tiers of what works and what doesn't. So, what we've started to do with the cars is say, in the United States, if you can't import a vehicle and we've got tariffs and ... at the time that this interview is being recorded, who knows what's gonna happen with that. So ... It might be very difficult to purchase a vehicle from the EU. And it also might be difficult to get a Subaru Forester, for that matter, come to think of it.

But what we told people is ... if you're tier one and you're just sort of like, I don't wanna think about this, I just want the most basic privacy protection, we tell them: get a Subaru Forester, buy it in cash. Make sure you purchase a second head unit display so that you can swap out the one that has all of your data before you sell it. Also, what I like about Subaru, the reason why we're talking about Subaru, is that even though the CCPA only applies to California residents, Subaru recently has set up essentially a privacy opt-out for everybody. So under the CCPA guidelines that allow you to request your information ... It doesn't matter where you are. You can go to this website, you can opt out, and then you're good to go.

So that's sort of how we think about it. I'm curious about what are some of the things you recommend for people that are sort of like at this tier one level?

Lauren: First of all, I want to say I love an industry leader. I love to hear about a brand who says, here's a standard being set, and whether or not it's being enforced across all of my jurisdictions, this is better for the consumer, I want to give them that choice, and we're going to go out there and do that. So, just kudos to Subaru for stepping up and taking that step. It makes such a difference when industry players lead the way, because it gives consumers a real option to choose that. They can vote with their dollars, they can vote with their feet, and the kinds of workarounds or approaches you're talking about are so powerful. I hope that sends a real message to the rest of the industry that there is demand. That people are prepared to vote with their money and to invest in a brand that is taking these steps to protect them and to give them options on that level.

I understand, though, that that's not possible for everybody, and so I say very lightly: buy an older vehicle, one that is not computerized. In fact, I have an old-school computerized vehicle. I would think as much as possible about not connecting my personal device or my work device to the car via Bluetooth. As much as possible, I would use my key fob rather than using my phone to unlock the car.

Where I could keep my devices that go with me everywhere from communicating with my car as much as possible I would. If I had a choice about whether or not I used my phone with an app as the insurance kind of like risk monitor. Sometimes with certain kinds of insurance, they ask you to have an app running in your phone at all times. If there was a way instead to get a separate device, a box that only has one purpose and that isn't connected to the rest of your digital life, I'd go for that as well.

And I'd be really careful about checking my settings if there was an app associated with my car use, where I had the option to make sure that my health data wasn't integrated, or that other apps weren't speaking to my car app, as much as possible. I would look at that. And maybe this is a privacy Karen thing to do, but it's a real good time sometimes, doing a deep dive through your settings and just seeing what there is: turn things on and off, get curious, play around with it, and just stay up to date, knowing that this thing might be pushed to you more often than not.

And these things sound really simple, and to a certain extent they are, but it's almost like hygiene. Like, can you set yourself up so that, in the same way you change your oil, you check your app at the same time? Is there something where, when you're doing something physically with your car, you make sure that you do a kind of digital hygiene check-in as well, in case any settings have been changed in the meantime?

And of course, if you are able to choose a brand like Renault or like Subaru, where they let you remove your data, delete your data, have more control over your data as much as possible, do that. And if you can bring yourself to read the privacy policy, here's what I always say: skim it and look for how they handle your data. Do they sell it to a third party? Do they handle it securely? Have they got open-ended promises? These are the kind of red flags that you can look for, because sometimes just not knowing is part of the damage. It might be that you decide, it's enough for me to know that it's happening, so I'm aware and it's not happening to me unexpectedly. And I want this car and it works for me and I'm going to live this way. But be informed. Know where your information is going, know how your data is being handled. I'll also say it's a personal choice for everybody. Sometimes the convenience or the prestige, or wanting that particular engine, means the trade-off might feel worth it to you. It's a decision everybody needs to make, which is why I go back to the fact that we need basic standards across the industry, so you don't have to choose between a powerful engine or a particular look of a car and being able to have power over your data.

BJ: If people think that this is tricky, wait until we start telling them to set up a trust and then have a friend go and buy the vehicle in cash. I promise you it's gonna get trickier as we go along. Is there anything else that you would recommend? So for example, we tell people to take a zero day every 90 days. And what that is is essentially you go out, you buy yourself lunch. And you use that time to go through your bank statements for the last 90 days. You go through the apps on your phone. Maybe you should also go through your car settings. But is there anything that usually you don't get asked about or suggestions that don't usually come up in these conversations that you would want to highlight?

Lauren: This might sound repetitious, but I really do think that information is power. It's about knowing what's happening with your data. It's about knowing what trade-offs you're making and making them actively. I think so often there's a sense of like, I won't bother reading it, I'm not going to try. Like where are the places where you can tolerate friction or you can tolerate inconvenience in order to better inform yourself or to be able to work some practices into your life that will incrementally improve things? And I always think about privacy, data agency, digital rights, very much in the kind of cybersecurity Swiss cheese model. Like there are so many slices, there are so many different gates that you're going through all the time when it comes to engaging with technology, that if you can just sort out three or four of the slices, you don't need to do the whole hundred slices, but every slice of activity or consideration that you can give to the way that your online experience is working and you can tweak it and optimize it in ways that are comfortable for you and that are the right choices or trade-offs for you, you overall will be in a better place.

And just when people are thinking about data, I think we already collectively understand that our attention, where we put our eyeballs, equals money. Therefore it has a value, therefore it's worth manipulating. I really want people to think about the fact that data is a currency of sorts. Data is power. And right now too many people don't have a say in how this is used. In the same way that you want to have control of your finances or, you know, control of the other parts of your life, this is something where it's worth understanding that it is valuable. It will only get more valuable. You are adding to that treasure trove every day and you are not the one benefiting from it. And that's kind of why, at the Mozilla Foundation, we do what we do. We think people should have a say in how their data is being used. We're investing in a world where data agency is the norm, not the exception. That means pushing for better laws, supporting tools that put you in control, and, you know, really encouraging people to work out what's under the hood of their favorite app or, in this case, their favorite brand of car.

BJ: So besides telling people to download Firefox and Thunderbird, which is something we recommend, and we're going to be talking a lot about Thunderbird throughout this series, how can people support the Mozilla Foundation?

Lauren: I'd like to tell people about two different things that are going on. The first one is a really exciting project called Common Voice. One of the things that we know is that data has power and data sets, training sets decide how technology is made and who it serves. So Common Voice is a program, a platform, a technology which collects AI data sets of languages, underrepresented languages, in order to create open source, underrepresented language data sets that allow technologies to be developed for communities who are not usually served by those technologies.

It's put together with the power of community. We've had 750,000 donations to this where people come in and they read something or they use their voice. They're literally donating their words, their voice in these languages. And those data sets are available for technologists to download and to use, so that their technologies can engage in a multitude of languages, which makes technology and the information that's contained in it much more accessible for everybody.

So, I would say if that's something that you're interested in, if you're a technologist, hey, there are hundreds of incredible language data sets right there, ready to be used. And if you're somebody who has an interest in contributing in this way, you can jump over to Common Voice, you can read something, you can write something, you can help translate. And all of these things go towards making more equitable technology access for everybody around the world.

And of course, I'll also say from a Mozilla Foundation perspective, head over to our website, check out the campaigns and the stories that we're telling, because we are trying to fund, support, fuel a better technology future and the more people who come and join us and be a part of that the more we can change the world and the more we can shape the future that we want, that serves us.

Live Read #6: Evil Data Brokers!

Rosie: When your friends at Stupid Sexy Privacy are out in the world, we often hear two things that we want to address:

The first is, “I don’t want to sound like a Karen just because I’m concerned about my privacy.” 

You’re not a Karen if you’re concerned about privacy. 

Privacy is a fundamental human right.

So if you care about privacy, you care about other people. 

That makes you the complete opposite of a Karen.

The second thing we hear is, “All my stuff is out there already and there’s nothing I can do about it, so why bother?” 

And we totally get that feeling.

Most of us are burnt out, working multiple jobs, and taking care of kids or elderly parents. Or both.

Managing our information can feel like just one more thing to add to that list. 

The good news is, managing your privacy is not an all or nothing thing.

You can do a little at a time, and still take back control of your privacy, security, and anonymity. 

The easiest place to start is by cleaning up the data that’s out there.

This is especially important so that it can’t be used to deny your health insurance claims or make you pay more for rent.

Here’s how that works:

Data brokers collect and sell personal data, which they aggregate from public records, Internet trackers, and other sources. 

Many of these sites will display some amount of your personal information, for free.

And if a human can see this data? 

So can the bots and scrapers, many of which use that data to render adverse financial decisions against you, without you ever knowing.

That, my friends, is why you should always appeal a health insurance denial when you get one. But we’ll talk about that in a future episode.

For now, our friends at DuckDuckGo offer a solution that we want you to know about.

As part of the DuckDuckGo subscription plan, they offer Personal Information Removal services.

The kind that can help find and remove your personal information, such as your name and address, from data broker sites that store and sell it.

This helps to combat identity theft and spam.

Access to Personal Information Removal comes with the DuckDuckGo subscription plan.

And here’s the best part …

There are a lot of data removal services available. The thing is, you often have to give them personal information, including your driver’s license, which they then store on their servers.

DuckDuckGo doesn’t do that.

All of the information you provide them is stored locally on your device. This includes the monitoring and processing of removal requests.

It then scans these sites on a regular schedule to minimize the risk of your information reappearing. 

After the initial scan, you can track the progress of ongoing removals, keep tabs on the total number of records that have been removed, and see the site-scanning schedule on your personal dashboard in the DuckDuckGo browser.

While it doesn’t cover all of the data broker websites out there, DuckDuckGo monitors over 50 of those websites, with more being considered.

You can sign up for the subscription via the Settings menu in the DuckDuckGo browser, available on iOS, Android, Mac, and Windows.

Or via the DuckDuckGo subscription website: duckduckgo.com/subscriptions

This service is currently only available in the United States and on desktop.

Stupid Sexy Privacy Outro

Rosie: This episode of Stupid Sexy Privacy was recorded in Hollywood, California.

It was written by BJ Mendelson, produced by Andrew VanVoorhis, and hosted by me, Rosie Tran.

And of course, our program is sponsored by our friends at DuckDuckGo.

If you enjoy the show, I hope you’ll take a moment to leave us a review on Spotify, Apple Podcasts, or wherever you may be listening.

This won’t take more than two minutes of your time, and leaving us a review will help other people find us.

We have a crazy goal of helping five percent of Americans get 1% better at protecting themselves from Fascists and Weirdos.

Your reviews can help us reach that goal, since leaving one makes our show easier to find.

So, please take a moment to leave us a review, and I’ll see you right back here next Thursday at midnight. 

Right after you watch my episode of Comedy InvAsian on Peacock, right?


Want to get in touch? You can contact us here