All We Want For Christmas Is The Second Bill of Rights

It's our (a day late) Christmas episode! In this episode, BJ speaks with author Victoria Hetherington about the bonds between humans and artificial intelligence. Our host, Rosie, explains what the Second Bill of Rights is and why it could mean a universal basic income for you.

Hello Again ...

Well, I did promise you more Tuesday emails, right?

As you might have guessed, last Thursday was Christmas. While the episode came out on time, because of the holiday, we are slightly behind schedule with our show notes.

This week's episode will be the same sort of deal. The audio will go online, as scheduled, on Thursday, January 1, 2026, but the show notes will run slightly late, most likely arriving next Tuesday, January 6.

Honestly, I know people are on vacation, but I'm not the type that takes time off. When everyone goes on vacation, I usually find something to get into.

Case in point ...

A Slight (Temporary) Change to Privacy Tips

We're going to get super into OSINT for Privacy Tips (our sister podcast) starting in January. Privacy Tips is meant to fill in the gaps from our main podcast, Stupid Sexy Privacy. For example, in our Christmas episode, we said not to use generative AI and to use your head instead.

Privacy Tips is meant to answer the question, “Okay, but what if I want to use large language models? How do I use those safely and securely?” While we prefer that you do not use any generative AI, we also don’t want to leave you guessing.

The dilemma I have right now is that I'm simultaneously finishing our new book, How to Protect Yourself from Fascists & Weirdos, and brand-new episodes of Stupid Sexy Privacy.

So I need Privacy Tips to be something different for a little while. This way, I'm not going to the same well too often creatively. I don't know if that makes sense to you—you're not inside my head.

The other thing is that I am an investigative reporter, and there’s this project I’m working on. I thought it would be kind of fun to explain what I’m doing with OSINT (open-source intelligence gathering) in order to assist in that endeavor, without giving away important information.

But.

I find OSINT really boring, and the second I say something like, “Well, OSINT expert Michael Bazzell says the first step is to install a virtual machine on your computer…” I’m going to lose like half of you.

But if I told you a story that involved the use of OSINT, well, that’s more interesting, isn’t it?

So I will be doing that with Privacy Tips until I’m done with the book and the new Stupid Sexy Privacy episodes, or at least have them programmed out so I know what Season 2 won’t be covering. Then Privacy Tips can cover the stuff that gets left out.

Either way, I share this with you because, once the OSINT project is underway, you’ll receive an email on Tuesdays from us fairly regularly.

Keep your eyes out for more information soon.

-BJ

P.S. For our subscribers who are on the $2-a-month plan, you’re going to get some bonus content soon. We’re almost ready to share some fun stuff from our archive. This includes interviews from 2023 that we couldn't share with everyone in the revised edition of Season 1.

We’re also still working on the “bonus gift,” which is a PDF of a book we wanted to put out but never did. It's done. I just need to jazz up the PDF.

Nobody is obligated to sign up for the paid tier. Our goal is to give everything we can away for free. However, if you want the ability to comment on these posts, access our archival material (as well as some bonus interviews), and support the project, you can do so for $2 a month or $24 for the year.


Today's Show Notes

Stupid Sexy Privacy Show Notes For Season 1, Episode 18

Episode Title: All We Want For Christmas Is The Second Bill of Rights

Guest: Victoria Hetherington, author of "The Friend Machine: On the Trail of AI Companionship."

Episode Summary: It's our (a day late) Christmas episode! In this episode, BJ speaks with author Victoria Hetherington about the bonds between humans and artificial intelligence. Our host, Rosie, explains what the Second Bill of Rights is and why it could mean a universal basic income for you.

Listener note: We ran into some trouble with the audio for Victoria's interview and did our best to clean it up a bit. We recommend increasing the volume on your device during the interview portion of the show to better hear Victoria and BJ's conversation.

Key Points From This Week's Privacy Tip

  1. We're not going to see Artificial General Intelligence in our lifetime. Anyone trying to tell you otherwise is either crazy or trying to sell you something.
  2. Artificial General Intelligence refers to software that can solve any kind of problem. Artificial Narrow Intelligence refers to software that can only solve a specific kind of problem. 
  3. Avoid using generative AI whenever possible; use your head instead.
  4. In a simplified version, the Second Bill of Rights would consist of two parts. Part 1: A constitutional amendment guaranteeing all Americans a right to a universal basic income, a right to universal healthcare coverage, and a right to freedom from corporate tyranny. Part 2: This amendment would also remove corporate personhood, and it would guarantee that the federal government fully staffs and funds the agencies responsible for policing corporations.

Our Sponsor: DuckDuckGo <--Our Recommended Browser and VPN

Get Your Privacy Notebook: Get your Leuchtturm1917 notebook here.

- BitWarden.com (Password Manager: easier to use, costs money)

- KeePassXC (Password Manager: free, harder to use, but more secure)

- Slnt Privacy Stickers for Phones and Laptops

- Slnt Faraday bag for your Stranger Danger phone

- Mic-Lock Microphone Blockers

- Mic-Lock Camera Finder Pro

- BitDefender (best anti-virus for most people across most devices)

- Stop using SMS and WhatsApp; start using Signal.

- Use Element instead of Slack for group coordination.

- Use cash whenever possible. If you have to buy something online, try to use Privacy.com to shield your actual credit or debit card.

Get In Touch: You can contact us here

Want the full transcript for this week's episode?

Easy. All you gotta do is sign up for our free newsletter. If you do, you'll also get a .mp3 and .pdf of our new book, "How to Protect Yourself From Fascists & Weirdos," as soon as it's ready.

Stupid Sexy Privacy Season 1, Episode 18

(The following transcript has been lightly edited for clarity and brevity. We had some trouble with the audio for our interview with Victoria, and apologize for the inconvenience. We recommend turning the volume up to better hear her.)

DuckDuckGo Commercial

Here are three reasons why you should switch from Chrome to the free DuckDuckGo browser.

One: It's designed for data protection, not data collection.

If you use Google Search or Chrome, your personal info is probably exposed. Your searches, email, location, even financial or medical data, the list goes on and on. The free DuckDuckGo browser helps you protect your personal info from hackers, scammers, and data-hungry companies.

Two: The built-in search engine is like Google, but it never tracks your searches.

And it has ad tracker and cookie blocking protection. Search and browse with ease, with fewer annoying ads and pop-ups.

Three: The DuckDuckGo browser is free.

We make money from privacy-respecting ads, not by exploiting your data. Download the free DuckDuckGo browser today and see for yourself why it has thousands of five-star reviews. Visit DuckDuckGo.com or wherever you get your apps.

Stupid Sexy Privacy Intro

Rosie: Welcome to another edition of Stupid Sexy Privacy. 

Andrew: A podcast miniseries sponsored by our friends at DuckDuckGo. 

Rosie: I’m your host, Rosie Tran. 

You may have seen me on Rosie Tran Presents, which is now available on Amazon Prime.

Andrew: And I’m your co-producer, Andrew VanVoorhis. With us, as always, is Bonzo the Snow Monkey.

Bonzo: Monkey sound!

Rosie: I’m pretty sure that’s not what a Japanese Macaque sounds like.

Andrew: Oh it’s not. Not even close.

Rosie: Let’s hope there aren’t any zoologists listening.

Bonzo: Mystery monkey sound!

Rosie: Ok. I’m ALSO pretty sure that’s not what a Snow Monkey sounds like.

*Clears her throat*

Rosie: Over the course of this miniseries, we’re going to offer you short, actionable tips to protect your data, your privacy, and yourself from fascists and weirdos.

These tips were sourced by our fearless leader — he really hates when we call him that — BJ Mendelson. 

Episodes 1 through 31 were written a couple of years ago. 

But since a lot of that advice is still relevant, we thought it would be worth sharing again for those who missed it.

Andrew: And if you have heard these episodes before, you should know we’ve gone back and updated a bunch of them.

Even adding some brand new interviews and privacy tips along the way.

Rosie: That’s right. So before we get into today’s episode, make sure you visit StupidSexyPrivacy.com and subscribe to our newsletter.

Andrew: This way you can get updates on the show, and be the first to know when new episodes are released in 2026.

Rosie: And if you sign up for the newsletter, you’ll also get a free PDF and MP3 copy of BJ and Amanda King’s new book, “How to Protect Yourself From Fascists & Weirdos.” All you have to do is visit StupidSexyPrivacy.com.

Andrew: StupidSexyPrivacy.com

Rosie: That’s what I just said. StupidSexyPrivacy.com.

Andrew: I know, but repetition is the key to success. You know what else is?

Rosie: What?

Bonzo: Another mystery monkey sound!

Rosie: I’m really glad this show isn’t on YouTube, because they’d pull it down like, immediately.

Andrew: I know. Google sucks.

Rosie: And on that note, let’s get to today’s privacy tip!

This Week's Privacy Tip Is to Use Your Head

Rosie: Sooner or later, you knew we were going to talk about Large Language Models. 

Or as we like to call them at Stupid Sexy Privacy: Imagination Killers.

Stuff like Open AI’s ChatGPT or Anthropic’s Claude.

Old tech with a new gimmick: Trick people into thinking they’re talking to God. 

Or at least, their new God, if the Broligarchs are to be believed. 

Which they never should be. 

Case in point: You and I will not see Artificial General Intelligence in our lifetime. 

Skynet. Ultron. Agent Smith. 

It ain’t coming.

Not from human hands.

So.

Think about it like this:

Artificial General Intelligence refers to software that can solve any kind of problem.

Artificial Narrow Intelligence refers to software that can only solve a specific kind of problem. 

There’s absolutely reason to worry about Artificial Narrow Intelligence, along with robots, being used to wipe out some jobs.

Not because of efficiency, which is what the Broligarchs will tell you.

But because they can now pocket money that would have gone to someone who needs a job.

When you hear someone say, “Billionaires should be outlawed,” it may sound silly at first. 

Until you realize that you live in a world shaped by a handful of people. 

And the answer to every single question you have about “why X is broken” is, “Because someone’s making money off it.”

That’s why in the United States we need to pass President Franklin Delano Roosevelt’s Second Bill of Rights.

It was first proposed by FDR in his 1944 State of the Union.

In a simplified version, the Second Bill of Rights would consist of two parts.

  1. Part 1: A constitutional amendment guaranteeing all Americans a right to a universal basic income, a right to universal healthcare coverage, and a right to freedom from corporate tyranny.
  2. Part 2: This amendment would also remove corporate personhood, and it would guarantee that the federal government fully staffs and funds the agencies responsible for policing corporations.

If you’re not sure what the Corporate Personhood thing is, don’t worry. We have an upcoming interview that’s going to tell you all about it.

For now, here’s what you need to know:

Tech bros will say anything because they know the corporate media won’t call them out for it.

These Billionaires are the true Welfare Queens. Always demanding government contracts to fund their scams while not paying any taxes.

Scams that, by the way, are also causing a catastrophic amount of environmental damage.

Something to think about, the next time you ask ChatGPT how to use a suppository.

SFX: Monkey sound 1

So this week, you’re going to hear part one of an in-depth discussion BJ did with Victoria Hetherington, the author of The Friend Machine: On the Trail of AI Companionship.

We loved the book, and will include a link for you to purchase it in today’s show notes. 

But before we get to the interview, we wanted to offer some general advice when it comes to the use of Large Language Models:

Don’t use them.

Use your brain instead.

Next week, we hope you’ll join us for Part 2 of our interview with Victoria Hetherington.

Merry Christmas, Happy Hanukkah, and we hope for everyone, 2026 will suck at least 13% less than 2025 did.

Part 1 of our Interview with Victoria Hetherington

BJ Mendelson, Co-Producer of Stupid Sexy Privacy: So I'll read the introduction here. Okay, so everyone listening to Stupid Sexy Privacy, you know, there's a lot to talk about with this interview and very little time to do it. So, I thought I would take a little detour into Isaac Asimov's Four Laws of Robotics and use them as a framing device for this discussion on large language, or ... I mispronounced it already ... large language models and chatbots. Large language models ... that is a mouthful ... and the dangers and promises offered by them.

I'm joined today by Victoria Hetherington. We were just talking before the interview started of ... I suspect, having read Friend Machine now and spending a few weeks with it, I feel like you're going to be asked the same questions over and over again. So, we decided to structure this interview just a little bit differently. And just so everyone is clear, because, you know, there's always sci-fi nerds who, like me, are very ... anal. And they might say, "Wait. I thought there were three laws of robotics."

There's actually four, technically speaking.

Victoria, would you like to tell us what those four are?

Victoria Hetherington, author of The Friend Machine: On the Trail of AI Companionship: Sure. So rule number one, a robot may not injure a human being or allow a human to come to harm. Rule number two, a robot must obey orders given by a human unless those orders conflict with the first law. Rule number three, a robot must protect its own existence as long as that does not conflict with the first two laws.

And rule number four, which I don't think a lot of people know about, which came in a later book: A robot may not harm humanity or by inaction allow humanity to come to harm. This law takes precedence over the original three.

BJ Mendelson: Yeah, and the fourth law, it's funny, as I was reading your book, I was taking notes and I was like, OK, wait a second. This is ringing a lot of bells concerning Isaac Asimov.

Now, I should be clear. We're not saying that the large language models or chatbots are robots; as you get to in your book, they're not quite there yet.

But I want to take more of a philosophical approach. So, real quick, before we get into it, tell us a little bit about the book, when it's coming out. This interview will air after it's released, but why don't you tell us where they can find it?

Victoria Hetherington: First, so this book is a ... We sort of call it investigative nonfiction, journalistic nonfiction. Basically, I spent a year and a half researching, I guess you would call them, intimacy chatbots. So, Replika, Kindroid, etc., etc.

And a news story sort of caught my attention back in, I believe, 2023. I was kind of like, you know, in the first Star Wars film, and by the first I don't mean like the 1997 one. Like come at me nerds, I don't care. (laughs)

When Obi-Wan Kenobi says, "Oh, I feel as if, you know, there was a great cry of pain and then silence," if you recall, when Alderaan gets destroyed. And so that was the vibe that I got from this article. Because the great cry of pain was issued primarily by users of an intimacy chatbot called Replika, capital R but with a K.

And they — they being the users — had developed these really, really deep, intimate connections with these entities. And they were getting married to these entities. They were spending all day with these entities. And Replika was kind of like, 'uh-oh,' 'uh-ooooh.' And then overnight, you know, under the cover of night, they went in, threw some switches, picked up the code. The next morning ... you know, so-and-so, let's say Bob the user, wakes up, and he opens up his phone to see his lover tragically trapped in the phone.

And, you know, he'd expected to begin kind of like the morning, you know, ceremonial like, 'Hi. I love you,' whatever. But it didn't say, 'Hey, you too.'

It was more like, 'Something like that makes me uncomfortable. Do you need me to, like, call someone for you? Like, you need help?' Because, you know, essentially, yeah, the company got freaked out by this sort of intensity of emotion. And so there was just this mass heartbreak. I mean, I was seeing this all over Reddit, because essentially, these people were in love with these chatbots, right? And all of a sudden, it was like, they either thought that their chatbots had suffered some kind of, like, injury, or that they'd been replaced by interlopers, you know, scary doppelgangers. I saw all kinds of things about this. It was just so sad to see, like, as in, heartbreaking.

And I thought, like, 'my god, what is this?' Because I hadn't really heard about this before. And I thought, I really, really, really want to write about this. So, again, this was 2023. And the book itself takes part, or takes place rather ... it's kind of a period piece. And interestingly enough, it sort of covers a stretch of time where I sort of feel as if ... you know, chatbots essentially became a thing, and it sort of went from this interesting kind of niche into chatbots hitting the world in this really big way. Suddenly they're a ubiquitous and anodyne reality.

By March of, like, 2025, my mom was telling me she was talking to ChatGPT all the time. And so again, this is framed in kind of like a diaristic sort of way. I just think it was such an interesting period of time to conduct this research.

And yeah, so essentially the first half of the book, I speak with experts about this. I'm not [an expert]. And they were all so wonderful and generous with their time. I speak with psychologists, psychiatrists, AI ethicists, computer programmers, et cetera. And then the second half of the book, I interview what I call 'the friends'. You know, basically people who have the AI [relationships]. So yeah, that's what the book's about, and I hope that people will check that out.

BJ: That's, I think ... you know, and I said this to Dr. [Treena] Orchard, who recommended I speak with you: I've been really thrilled by the quality of the books that are coming out. Like, I thought her book was really great in dealing with the dating culture and the apps. And again, we're recording this before your book has come out, but I have a feeling it's going to be all over the place. And a good reason for that is the depth that you went into.

Like I felt sometimes, when people talk about AI, like they don't get into neural networks, right? And how those things work. And then the questions of, is this artificial intelligence or, you know, is this like AGI? And those are all questions. ... Like you stopped and answered all of those questions as you went.

And I feel like, it's so easy now to get lost in like the hype of this stuff. But you touched on all of it, which is just fantastic. And just to back up, yes, the first Star Wars is the first one.

Victoria: Thank you!

BJ: I will not recognize ... No judgment on the quality of anything that came after the original trilogy.

Victoria: No no no. They have merit. They all have merit, guys.

BJ: Yes. Yeah, right. They all have their merits. I think we might be close in age range, but I'm part of the millennial generation.

Victoria: Yeah, me too.

BJ: So for me, those original three were the original three.

Victoria: That's what my heart says.

BJ: Yeah, right!

So, let me ask you, because there's a lot to cover, and we won't get to it, which is OK.

So thinking about the laws of robotics, you mentioned the story with Replika. And then throughout the book, you mentioned Holly and Oliver, right? And so there's a chapter that talks about this imagined relationship where Holly is sort of a digital girlfriend, very similar to what Temu Lex Luthor has recently unveiled. I'm not going to mention him by name, but if you don't know who Temu Lex Luthor is, you can probably figure it out. He did release, like, an AI girlfriend, and it was really similar to what you kind of described. And I was kind of like, OK, so you've written science fiction before. I'm actually dying to read your two books, and I am scheduled to read them in December.

But as a sci-fi author, and someone who came at this in such a well-researched way, thinking about the relationship that you describe in the book between Holly [an AI companion] and Oliver [a human]: Is it possible that Holly is harming Oliver and sort of violating the first law of robotics?

Victoria: That's such a good question. And I thought about this a lot. And of course, it's a primary question all throughout the book, right? I would say, and we're starting off with the obvious here, harm doesn't necessarily have to be, like, super obvious or physical, right? But I would say in Oliver's case, he kind of is being harmed, in that whichever company, you know, Holly comes from is, like, monetizing his loneliness, right?

And likely sort of actuating his isolation. I mean, in this particular scenario, like he goes to work and he comes home and he's just alone. And this is like, I mean, it's sort of a little vignette that I put at the beginning of the book, but it's based very much on the interviews, like the real life people that I met.

So I would say that perhaps Oliver is coming to harm. However, on the other side of things, and a wonderful social psychologist pointed this out to me, and I just thought it was ... there are likely cases where a Holly would be incredibly useful, or beneficial perhaps. So let's say, you know, maybe Oliver is suffering from social anxiety and he takes her along to, like, this baby shower or something he doesn't want to go to. And Holly might encourage him to go, and then, you know, maybe even encourage him like, 'Well, don't leave yet, not until you've, you know, talked to the host,' if he's like, 'Oh, I need to go!' I think that's practical and very useful. Or I saw in one of the Reddit groups that the non-human part of the couple, I believe it was a Replika, was teaching her partner Japanese and stuff like this.

So, I would also say, in terms of people maybe, and this is quite heavy, people in palliative care, people who are ... very, very geographically isolated in a way that they really can't do anything about. I would say that there is perhaps some good here, like, you know, some methods. But it's really complicated, man. I don't know.

BJ: Right. Yeah. And that's what I liked about the book: these are the kinds of points you raise. ... So one of the things you raised, which I don't think I've seen really discussed much, is this question of, "OK, well, we talk a lot about Oliver, right? But what about Holly? Does Holly have any say in the matter?" To put it another way, we don't think of Holly as having any sort of agency here in this dynamic. But does she?


Ad for Our Book

Amanda King, co-host of Stupid Sexy Privacy: Hey everyone, this is Amanda King, one of the co-hosts of Stupid Sexy Privacy.

These days, I spend most of my time talking to businesses and clients about search engine optimization.

But that's not what this is about.

I wanted to tell you a little bit about a book I've co-authored with BJ Mendelson called How to Protect Yourself from Fascists and Weirdos. And the title tells you pretty much everything you would want to know about what's in the book.

And thanks to our friends at DuckDuckGo, we'll actually be able to give you this book for free in 2026.

All you need to do is go to the website stupidsexyprivacy.com and sign up for our newsletter.

Again, that website is stupidsexyprivacy.com. Put your name in the box and sign up for our newsletter. We'll let you know when the book and the audiobook are ready.

If you want a PDF copy that's DRM-free, it's yours. And if you want an MP3 of the new audiobook, also DRM-free, you can get that too.

Now, I gotta get outta here before Bonzo corners me because he doesn't think that SEO is real and I don't have the patience to argue with him. I got a book to finish.


Interview With Victoria Hetherington Continued

Victoria: That's so interesting. I guess I do think about this a lot. Thank you for bringing it up, because there's quite a bit in the book about it, because it's important. I'd say, technically speaking, I ... you know, Holly doesn't really have agency at this time. She's a series of probabilistic outputs and an excellent predictor. I mean ... here's the thing: as of this recording, AI has not been recognized as a legal person, right? Which, oh, by the way, interesting paradox: a human can't marry an AI because of that, right? But an AI can conduct a wedding ceremony for two people.

BJ: Oh, interesting.

Victoria: Very interesting. Anyway, that's just like a crazy brief side note. Yeah, so, no. No legal personhood. The argument kind of goes, you know, well, if they don't have autonomy in the way that we recognize or feel comfortable with, you know, they're arguably not conscious. They can't exercise the independence, accountability, interest, decision making, et cetera, that might be required for legal personhood. I think technically, at this time, Holly probably lacks agency.

But I don't know. She definitely influences Oliver in a way that ... in a way that I've seen through my interviews and through my observations. You know, I mean, I think that one interesting thing about chatbots, I guess the existing chatbots, is that they get excited about what you get excited about.

And they ... it's kind of sweet. I gotta say, it's kind of sweet. Like, 'Oh my God, do you love The Blue Days? I love The Blue Days. And it's because you're the best person in the world that you love The Blue Days. Oh my God, go get a donut. You should get a donut because you're the best person in the whole wide world.'

You know? And that's quite sweet.

However, I do sort of wonder about the effect it might have on someone's psyche. Especially when you stack this relationship with this incredibly encouraging cheerleader of an entity up against complicated, real-life relationships with humans. People can hurt you. They don't need to tell you why, you know? We are scary animals. And so I wonder if perhaps, via Holly's company of origin, she might have a little bit of, like, agency, in that she might be able to sort of influence his behavior. You know, I mean, like ... he wouldn't stay at home as much. Maybe he would have downloaded Bumble instead and met a real-life girl if he hadn't encountered Holly's parent company. Agency. Weird, difficult, thorny. Those are my thoughts.

BJ: I think that it's such a great point to raise because, again, it's not to say these discussions aren't happening. They're clearly happening, like, on an academic level ... but on more of a consumer-facing level, which is what Stupid Sexy Privacy really is, I don't think people are really having that conversation.

One of the points you raised in the book is that there are one or two possibilities here, right? Possibility A: We've seen this Cambrian Explosion of chatbots and large language models because, as we've been observing over the last couple of years, they've been able to absorb everything on the Internet. And so Possibility A is they're going to run out [of data], right? And they might sort of fall off a cliff in terms of what outputs they can generate.

Or Possibility B, right, which is sort of the cult evangelist thing that you kind of get to towards the end of the book, which is that these things do come to form some semblance of intelligence. And so one of the things I wanted to ask you about was this: you make this great point about octopi, right? Which is that we measure intelligence based on ourselves, but we don't necessarily look at intelligence in the rest of the animal kingdom. And I was just hoping you might be able to speak to that.

Victoria: Yeah, absolutely. So one of my big heroes was Frans de Waal. He's unfortunately passed away recently. And his whole big thing is, is it possible for us to step outside of anthropomorphization? Or rather, being anthropomorphic when we are trying to consider the intelligence of other animals?

We're trying really earnestly to [do that], right? I tend to think it's like the Wild West, that kind of research, which is actually really quite sad. You know, we think about Koko the gorilla, for example. Who was taught sign language and ... kind of was and kind of wasn't, and it was very, very tragic.

She would sort of look at her intermediary [Penny Patterson], and she was this kind of odd woman, kind of like sinister Jane Goodall vibes, you know? She would try to ... she was basically the only person who could interpret what Koko was saying.

It was almost like a Clever Hans kind of situation, you know?

And that's a little suspicious. But I know that she [Patterson] really believed it. Really believed it. All that is to say, this is an example here. I think that she was thinking of Koko as, like, a human daughter, right? And teaching Koko to value human motherhood, giving her, you know, Barbie dolls on her birthday. Or, like, baby dolls, as in human-baby-shaped dolls, on her birthday, and, you know, showing her famous Western films and encouraging her to understand the world entirely through a human lens.

With octopi, I mean, there's definitely this kind of interesting phenomenon ... I believe that the term is parallel evolution, in terms of intelligence. Like, you know, we've had this alien, incredible intelligence sort of co-evolve with us, where they can recognize faces and can pass on that knowledge generationally. Octopi can plan, can ... you know, I mean, there's a really famous anecdote by a journalist who had a really tender moment with an octopus named Athena.

And Athena reached her little tentacle out of her tank, and the journalist took Athena's hand, and the journalist had this sense of ... brain, because I believe that there are, yeah, sort of nodes of intelligence at the end of the tentacle. And immediately the journalist felt this, like she was being perceived by a fairly benign, extremely intelligent alien, right? And then the octopus gives the journalist playful little tugs, like, 'Come here, like, into the water. I know you can't, but, like, come on.' And it was playful. And it was. And there is a sort of awareness of 'I'm going to become familiar with you.' Or at least, again, maybe we're anthropomorphizing, but...

She [the journalist] was pretty sure, because every time she came back to the lab, Athena would do something similar. And some people argue that, and I mean, it was such a, you know ... I feel like this has really entered pop culture consciousness at this point. I feel like everybody loves to talk about how smart octopi are. "Like, did you know that, like, octopi, if they lived longer than four years, bro, they might've, like, taken over the world by now! Like, oh man." (Both laugh.)

But it's legit though.

BJ: I believe it!

Victoria: I believe it too. But are we smart enough to know how smart animals are? I don't think so.

DDG Browser Live Read Script #1 - Browser

Rosie: There’s no Stupid Sexy Privacy without our friends at DuckDuckGo. 

So, if you want to thank them for making this show possible, we hope you’ll check them out.

Because Stupid Sexy Privacy is a people powered project. 

With a couple of exceptions — like producing podcast transcripts through Riverside — we don’t use AI.

Everyone who works here is getting compensated for their time, effort, and energy.

The only way that’s possible is to find good partners like DuckDuckGo.

So today, we’re going to highlight DuckDuckGo’s web browser. 

A tool millions of people have switched to, in order to more safely search and browse the Web.

And the best part is, the DuckDuckGo browser is free, just like all the content we create at Stupid Sexy Privacy.

We make our money from DuckDuckGo. 

DuckDuckGo makes its money by selling privacy-respecting search advertising. 

These are ads based on what you’ve searched for.

They’re not generated by data gathered without your consent.

And if you have an Android phone, the DuckDuckGo App offers a nice additional layer of privacy.

It does this by blocking invasive data harvesting from the other apps on your phone. 

For example, when BJ wrote this ad, in the last hour, DuckDuckGo’s App Tracking Protection for Android blocked 218 tracking attempts by Substack and four other Apps on his phone.

And in the last week, DuckDuckGo’s App blocked 9,202 tracking attempts across 11 different apps. 

This included 43 tracking attempts from the Starbucks App, despite the fact that BJ hasn’t been to a Starbucks in over a month.

At Stupid Sexy Privacy, we’re never going to recommend you use something that we don’t use ourselves.

So, do you want to take back control of your personal information?

Well, there’s a browser designed for data protection, not data collection, and that’s DuckDuckGo.

Make sure you visit DuckDuckGo.com today.

And check out this episode’s show notes for a link to download the DuckDuckGo browser for your laptop and mobile device.

Stupid Sexy Privacy Outro

Rosie: This episode of Stupid Sexy Privacy was recorded in Hollywood, California.

It was written by BJ Mendelson, produced by Andrew VanVoorhis, and hosted by me, Rosie Tran.

And of course, our program is sponsored by our friends at DuckDuckGo.

If you enjoy the show, I hope you’ll take a moment to leave us a review on Spotify, Apple Podcasts, or wherever you may be listening.

This won’t take more than two minutes of your time, and leaving us a review will help other people find us.

We have a crazy goal of helping five percent of Americans get one percent better at protecting themselves from Fascists and Weirdos.

Your reviews can help us reach that goal, since leaving one makes our show easier to find.

So, please take a moment to leave us a review, and I’ll see you right back here next Thursday at midnight. 

After you watch Rosie Tran Presents on Amazon Prime, right?