Drinking Raw Milk Will Kill You & Other Things You Should Probably Know
Bioethicist and NYU professor Dr. Arthur Caplan joins BJ to talk about the disinformation and misinformation spread by wellness influencers. Meanwhile, Rosie gives you a new privacy tip on how to protect your most sensitive medical information.
Back in 2023, I found Dr. Arthur Caplan's interview for our show to be an eye-opener. Having him back in 2026, especially with the quackery flowing down from the top of some governments these days ... Well, it was a breath of fresh air.
We didn't get to record new interviews with all of our original guests — mostly due to scheduling issues — but we were glad to have Dr. Caplan back for Round 2, which you can hear this week.
I usually talk a bunch here, but I really need to shut up and finish our book by the 31st. So I hope you don't mind if I let this week's episode speak for itself.
And if you enjoy this week's episode, go back and listen to Episode 25 when you're done. This interview with Dr. Caplan goes really well with Caitlin Dow's interview from the Center for Science in the Public Interest.
-BJ
Hello. Farewell. Hello. Farewell.
-BJ
You can follow me here on Bluesky
Show Notes
Stupid Sexy Privacy Show Notes For Season 1, Episode 30
Episode Title: Drinking Raw Milk Will Kill You & Other Things You Should Probably Know
Guest: Dr. Arthur Caplan, professor, bioethicist, and the only person we trust to tell us what milk to drink.
Episode Summary: Bioethicist and NYU professor Dr. Arthur Caplan joins BJ to talk about the disinformation and misinformation spread by wellness influencers, as well as how HIPAA completely fails to protect us from tech companies. Meanwhile, our host Rosie gives you a new privacy tip on how to better protect your most sensitive medical information.
Resources (NO Affiliate Links. This is what we use.)
Our Sponsor: DuckDuckGo <--Our Recommended Browser and VPN
Get Your Privacy Notebook: Get your Leuchtturm1917 notebook here.
-BitWarden.com (Password Manager: easier to use, costs money)
- KeePassXC (Password Manager: free, harder to use, but more secure)
-Slnt Privacy Stickers for Phones and Laptops
-Slnt Faraday bag for your Stranger Danger phone.
-BitDefender (best anti-virus for most people across most devices)
-Stop using SMS and WhatsApp, start using Signal.
-Use Element instead of Slack for group coordination
-Use StopGenAI's Guide to getting Generative AI out of your life.
--Use cash whenever possible. If you have to buy something online, try to use Privacy.com to shield your actual credit or debit card when making purchases online.
Get In Touch: You can contact us here
Want the full transcript for this week's episode?
Easy. All you gotta do is sign up for our newsletter. If you do, you'll also get a .mp3 and .pdf of our new book, "How to Protect Yourself From Fascists & Weirdos" as soon as it's ready. It's free.
But if you'd like to comment on StupidSexyPrivacy.com posts, or if you just want to support our work, you can support us by becoming a paid subscriber. It's $2 per month or $24 for the year.

Stupid Sexy Privacy, Season 1, Episode 30
Below is the full audio transcript of this week's episode.
DuckDuckGo Commercial #2
Announcer: Hey, here's a joke. Knock knock.
Announcer 2: It's Google Chrome, and I don't need to ask who's there. I already know it's you. I know your search history, your email address, location, device settings, even your financial and medical data.
Announcer: Wow, that's not funny. Now I'm definitely switching to DuckDuckGo.
Announcer 2: That's smart. If you use Google Search or Chrome, your personal information is probably exposed. And that's no laughing matter. The free DuckDuckGo browser protects your personal information from hackers, scammers, and data-hungry companies.
DuckDuckGo has a search engine built in, but unlike Google, it never tracks your searches. And you can browse like on Chrome, but it blocks most cookies and ads that follow you around.
DuckDuckGo is built for data protection, not data collection. That's why it's used by millions to search and browse online. Don't wait. Download the free DuckDuckGo browser today. Visit DuckDuckGo.com or wherever you get your apps.
Show Intro
Rosie: Welcome to another edition of Stupid Sexy Privacy.
Andrew: A podcast miniseries sponsored by our friends at DuckDuckGo.
Rosie: I’m your host, Rosie Tran.
You may have seen me on Rosie Tran Presents, which is now available on Amazon Prime.
Andrew: And I’m your co-producer, Andrew VanVoorhis. With us, as always, is Bonzo the Snow Monkey.
Bonzo: Monkey sound!
Rosie: I’m pretty sure that’s not what a Japanese Macaque sounds like.
Andrew: Oh it’s not. Not even close.
Rosie: Let’s hope there aren’t any zoologists listening.
Bonzo: Monkey Sound!
Rosie: Ok. I’m ALSO pretty sure that’s not what a Snow Monkey sounds like.
*Clears her throat*
Rosie: Over the course of this miniseries, we’re going to offer you short, actionable tips to protect your data, your privacy, and yourself from fascists and weirdos.
These tips were sourced by our fearless leader — he really hates when we call him that — BJ Mendelson.
Episodes 1 through 33 were written a couple of years ago.
But since a lot of that advice is still relevant, we thought it would be worth sharing again for those who missed it.
Andrew: And if you have heard these episodes before, you should know we’ve gone back and updated a bunch of them.
Even adding some brand new interviews and privacy tips along the way.
Rosie: That’s right. So before we get into today’s episode, make sure you visit StupidSexyPrivacy.com and subscribe to our newsletter.
Andrew: This way you can get updates on the show, and be the first to know when new episodes are released in 2026.
Rosie: And if you sign up for the newsletter, you’ll also get a free pdf and mp3 copy of BJ and Amanda King’s new book, “How to Protect Yourself From Fascists & Weirdos.” All you have to do is visit StupidSexyPrivacy.com
Andrew: StupidSexyPrivacy.com
Rosie: That’s what I just said. StupidSexyPrivacy.com
Andrew: I know, but repetition is the key to success. You know what else is?
Rosie: What?
Bonzo: Another, different, monkey sound!
Rosie: I’m really glad this show isn’t on YouTube, because they’d pull it down like, immediately.
Andrew: I know. Google sucks.
Rosie: And on that note, let’s get to today’s privacy tip!
This Week's Privacy Tip
Rosie: You need to see a doctor.
A weird fungus has infected your brain.
You haven't bitten anyone yet.
You want to.
But the right moment hasn't come.
We've all been there.
And as you know, when there's a fungus among us, the correct people to call are the Super Mario Brothers.
As it turns out, being controlled by a fungus can also cause flowers to bloom on your face.
And not the pretty kind.
You finally get to see a doctor. A full two months after you requested the appointment.
You get called into the exam room.
An hour and forty-five minutes go by.
Finally, your doctor enters to take some pictures.
She does this to figure out how to best treat you, because the Super Mario Brothers don't take your insurance.
SFX: Monkey sound!
Now. As far as you and your doctor are concerned?
Nobody will see these embarrassing pictures. Just you and the medical professionals treating you.
That's how things are supposed to work.
But that's not how it actually works in America, thanks to Epic Systems' MyChart and their competitors in the electronic health records space.
The odds are pretty good that some of your most sensitive medical information will be, or has already been, used to train chatbots, large language models, and other data-hungry programs.
Most of you are probably thinking, "But wait, Rosie. HIPAA will protect me!"
Nope. It doesn't. HIPAA is woefully out of date, as our guest this week explains.
Dr. Arthur Caplan is an ethicist and a professor of bioethics at New York University.
We interviewed Dr. Caplan back in 2023.
He was kind enough to join us again here in 2026 for a brand new interview.
But before we get to that, we want to let you know a couple of things.
First, some of these apps, like Phreesia, allow you to opt-out of sharing your medical information.
You can ask Phreesia not to do so, for example, by emailing them at privacy@phreesia.com.
There doesn't seem to be a similar opt-out process for MyChart.
Your friends at Stupid Sexy Privacy have reached out to Epic Systems to confirm that.
We'll update the show notes that accompany this episode with what they say, if anything.
Second, if your doctor's office is using an app like MyChart, you can also call your doctor's office, and ask to opt out of any data sharing arrangements they may have.
The doctor's office may or may not cooperate with you.
But.
Depending on what state you live in, you have a lot of legal protection to back your request up.
Like with the California Consumer Privacy Act.
If you live in a State without a similar law, you'll have to convince your doctor that protecting your medical data is important to you.
That totally sucks.
But if you go back and listen to Season one of Stupid Sexy Privacy?
We'll give you plenty to work with to craft your argument.
Like how privacy is a fundamental human right.
And if you do live in a State without a strong privacy law, call your state representatives.
Do this once a week.
Don't send emails. Nobody reads them.
Call.
And tell the person who answers that you are requesting an update each time you call them.
Give them your contact information.
And insist that they get back to you.
Maybe they will. Maybe they won't.
But the simple act of calling your representatives consistently is just one of the free and easy things you can do, right now, to help take your country back from the fascists, weirdos, and broligarchs who fund them.
Now let's get to our interview with Dr. Arthur Caplan.
Dr. Caplan Interview Part 1
BJ Mendelson, co-producer of Stupid Sexy Privacy: Dr. Caplan, thank you so much for joining us again at Stupid Sexy Privacy. I'm hoping, for people who missed the 2023 interview that we did, that you might be willing to take a moment just to introduce yourself.
Dr. Caplan: Sure. So I'm Art Caplan. I'm a professor at NYU Grossman School of Medicine, and that's in New York City, where I'm a member of the section on medical ethics. And I've been here about 11 years.
Primarily, my job has been to set up a program, which we have, and we have a master's degree in bioethics, teach medical students, public health students, dental students, nursing, residents, fellows. Big footprint within the institution for teaching. We do research in many different areas.
I'm particularly interested in vaccination, reproductive health, end of life care issues. We also have people who are working on many, many other topics, especially in the research ethics area. And we're pretty committed to working to try and not only stay on campus, but trying to change policy and practice off campus.
So we have a pretty big media footprint. We will write op-eds, we do lots of press meetings, we will talk to legislators about things we think ought to happen or shouldn't happen. So not boring, kind of a fun job. And I've been at it for a very long time. I was at Penn, Minnesota, Columbia, and Pittsburgh before getting here.
BJ Mendelson: And just for people who might not understand what bioethics is, I know we talked a little bit about it last time, but. I think it's even more important these days to kind of just clarify what it is. Could you give us an example?
Dr. Caplan: Sure. So there are two terms we use. Medical ethics pays attention to ethical issues that arise at the bedside of the patient, usually involving a doctor or a nurse, occasionally an administrator. And you might get cases at the bedside like: can we shut off a ventilator when two sisters disagree? The person who is ill isn't really competent or can't speak up, and people aren't sure what to do when a family is fighting. Other bedside issues might involve: are we going to tell third parties if there's a diagnosis of a sexually transmitted disease like HIV? How do we tell them? How do we find them? Sometimes there are things that are reportable, like gunshot wounds, but how do you do it?
Other issues that pop up at the bedside ... You certainly pay attention to rationing and who might get access to a scarce resource, like an organ transplant or maybe during COVID a scarce ventilator. And those are tough allocation questions.
You also hear the term bioethics, and that refers to the broader area, not just clinical medicine, not just bedside issues, but everything involving the biological sciences. So there you get into the ethics of things like GMO foods and animal experimentation. We pay attention these days to AI in the bioethics part of the field. And we also pay attention to things like nutrition, food, and climate change. So bioethics covers the big, 30,000-foot questions; medical ethics gets right down to the nitty gritty of day-to-day practice.
BJ: And I'd love to hear your thoughts a bit about something we've covered recently on the 2026 edition of Stupid Sexy Privacy: wellness influencers. So I'd like to hear your thoughts about how bioethics plays a role, if at all, when it comes to wellness influencers pushing dangerous substances or iffy alternatives like raw milk.
Dr. Caplan: Well, bioethics has a big role to play in battles about the aims and goals of healthcare. R.F.K. Jr., Casey Means, the would-be Surgeon General nominee, Dr. Oz, and many people in the administration, or hoping to get into the administration, have promoted the idea that mainstream science, the way it's been done historically, is flawed and in some ways dangerous. Kennedy will say things like, 'don't listen to the experts.' He means your doctors. He means your nurses. What they want you to do, instead of worrying about your individual health and getting things fixed, is follow their notion that medicine and public health should be about wellness, meaning health promotion, trying to keep people from getting sick. And that would be fine. And by the way, I think that always has been part of the goals of medicine, and historically it has always been part of the goals of public health. But where they go off the rails is that they quickly shift from 'let's be well' to all kinds of crazy ways to achieve being well.
BJ: Right.
Dr. Caplan: Those might include things like 'drink raw milk,' says RFK Jr. A very dangerous thing to do. Louis Pasteur became famous with pasteurization, well, actually two centuries ago now, because so many people were dying when they drank raw milk, which is, to put it not too disgustingly, filthy when you drink it directly from where cows are wallowing around in all kinds of bacteria and dangerous things.
Other bright ideas take nutritional supplements, unproven, untested, frequently sold by people who make money by selling them to you.
Casey Means, that would-be Surgeon General candidate, is an example of that.
Many other things in the wellness movement area are sort of out of bounds. All kinds of ideas: that you don't need to vaccinate, that you can breathe fresh air and take bee pollen. You can chelate your blood, which means putting yourself in a kind of dialysis machine, filtering the blood of dangerous substances, which by the way doesn't work, is completely dangerous, and is a waste of time. I'm not going to go through the long list of cuckoo, crackpot, and disturbingly dangerous things that the wellness industry is selling. I'm talking about you, Goop purveyor Gwyneth Paltrow, as a prominent example of this.
So, bioethics' job is to make sure that if people are going to consent to use things, they have accurate, valid information. Informed consent is something we constantly fight for. The wellness industry is a cesspool of misinformation. Liability: if you get hurt drinking raw milk, who's going to be accountable for that? The farm that sold it to you? RFK Jr.? Who's to blame if you do what the head of Health and Human Services, the top health official, says you should do? If you're told to follow a diet that includes barbecue, lard, and butter, as examples of things that RFK Jr. thinks are eating real food, that's not what nutritional experts are going to advise you to do. And you need to know what the risks are for your heart health and blood pressure of following that kind of a diet.
So keeping the wellness industry on the straight and narrow, calling them out when they lie, and pointing out conflicts of interest, those are all tests for bioethics.
BJ: Right. And I think about Alex Jones making so much money selling supplements.
Dr. Caplan: Yeah. You know, the entire wellness industry worldwide dwarfs mainstream medicine. I can't even believe this is true, but it's true. It is in the multiple trillions of dollars worldwide.
BJ: Yeah, and so much of the content creation business is propped up by that money. I think people don't realize the connection between the two. So you get the far right influencers out there that are always selling supplements. And you've also got people that are, you know, I'm pretty far to the left, but you've got people further to the left than me who are also out there selling these supplements, and as you pointed out, very few of them, if any, have actually been verified as effective.
Dr. Caplan: Yeah, I would say if we went to one of the nutrition stores or went to one of the spas that promotes wellness, somewhere between zero and nothing has ever been shown to work.
BJ: That's right. Let me ask you about a related subject, also dealing with bioethics, something you mentioned at the end of our last interview: neuroethics. That sort of gets us onto this path of talking about HIPAA. But before we do, I just wanted to clarify what neuroethics is, and why people should be concerned about who gets their brain data and who that data is shared with.
Dr. Caplan: Well, you know, we mapped the human genome probably 25, 30 years ago now and found out a lot about our genes. The next project has been to start to map the human brain. And that's fueling all kinds of interest in how the brain works, where brain diseases occur, and how they develop, everything from ADHD to autism to Parkinsonism to Alzheimer's to Lewy body dementia. So we have started to learn something about the brain, not a lot yet. We're still pretty primitive in our understanding there, but that has led people to want to do all kinds of research on the brain.
So the first big area of ethics in neuroethics has been what's permissible to do to the brain in order to learn: not to treat it, but to study it.
What drugs can you give? Do you risk permanent damage if you do surgical interventions? What could you implant into the brain? That sort of thing. Another problem with neurological research is that it's often done on people who are mentally or cognitively impaired.
So a key standard of ethics in healthcare is that you have to choose and you have to consent. But if you have a mental illness, if you're a child, if you're an elderly person with dementia, are you off limits for research, or can surrogates give consent for you? And if they do, on what basis? Are they trying to guess what the person would have wanted? Or are they trying to say it might be in the person's best interest if we put them in a research study? So those research ethics questions are very tough.
More recently, we've started to see people try to manipulate the brain for therapies. Probably the most familiar interventions for this audience are those where people put electrodes into the brain and try to stimulate different parts of it, which has proven somewhat helpful for Parkinsonism, allowing people to move who had lost the ability to move due to the disease. And then there are big projects, usually fronted by people like Elon Musk, who are looking to put very large connectors into our brains, saying that they could help people who are paralyzed move objects with their brain waves, or that they might be able to augment or improve mental function. Is that something we ought to try?
Not just treating disease, but trying to make ourselves smarter or learn languages more quickly or have a better temperament or whatever. Those are really tough ethics questions because, in my opinion, the science isn't there yet to justify those kinds of efforts.
We're just at the beginning of trying to learn how to handle disturbingly bad mental diseases like Parkinsonism.
We don't really even have a good understanding yet about what to do with Alzheimer's.
So talking about putting things in our heads and attaching them to computers as a neuroethics challenge, I wouldn't be doing that now, but Musk is.
BJ: Right, yeah. And it's scary to me because I've read stories about people getting, for example, implants in their eye to improve their vision and then that company going out of business and then you're sort of like, I guess you're out of luck. Because no one's around to service the camera that you've put in your eye.
Dr. Caplan: Correct. Or take it out.
BJ: That's right.
Dr. Caplan: So people just do things, and they don't really plan for failure or an exit strategy if the thing doesn't work. Bioethicists should be insisting that's always part of any research protocol. But you know, more and more we're also starting to see people say things like: well, I'm not going to do my research in the US, too much regulation, too many bioethicists, too much oversight. I'm going to set up something in St. Kitts, or I'm going to do this research in Tonga, and I'm making these places up, but some offshore place or a very poor country like Guinea-Bissau, where RFK Jr. has proposed doing some very unethical vaccine experiments. We call that country shopping. In the research area, neurological research particularly, with very rich billionaires like Musk, you can see them trying to hide from transparency.
By the way, Musk, for all his talk about Neuralink and putting things in people's heads, hasn't published anything in a peer-reviewed journal.
BJ: Right. That's right. Well, I mean, it's also no surprise, right? This guy's been saying self-driving cars will be here any minute now, I think for a decade.
Dr. Caplan: And we'll all be living on Mars with our brain implants. That should happen too.
BJ: So yeah, he's got a long history.
Dr. Caplan: And don't forget about boring tunnels from Las Vegas to LA.
BJ: Which, yeah, in Las Vegas, I think they now have a very expensive and fancy tunnel that goes nowhere. *Laughs*
That's my favorite example I like to give people.
Advertisement For Our Cool New book
Hey everyone, this is Amanda King, one of the co-hosts of Stupid Sexy Privacy.
These days, I spend most of my time talking to businesses and clients about search engine optimization.
But that's not what this is about.
I wanted to tell you a little bit about a book I've co-authored with BJ Mendelson called How to Protect Yourself from Fascists and Weirdos. And the title tells you pretty much everything you would want to know about what's in the book.
And thanks to our friends at DuckDuckGo, we'll actually be able to give you this book for free in 2026.
All you need to do is go to the website stupidsexyprivacy.com and sign up to our newsletter.
Again, that website is stupidsexyprivacy.com and then put your name in the box and sign up for our newsletter. We'll let you know when the book and the audiobook is ready.
If you want a PDF copy that's DRM free, it's yours. And if you want an MP3 of the new audiobook, also DRM free, you could get that too.
Now, I gotta get outta here before Bonzo corners me because he doesn't think that SEO is real and I don't have the patience to argue with him. I got a book to finish.
Interview with Dr. Caplan Continued ...
BJ: I mean, so I think the question that comes up a lot is, when people hear a discussion like this, they're going to say: oh, wait a second, doesn't HIPAA protect me if I get these brain implants? Or doesn't HIPAA protect me from, let's say, X.AI, which is also owned by Musk, if he takes that Neuralink health data from your body, packages it up, and uses it to train Grok, for example?
Dr. Caplan: Yeah.
BJ: So there's this belief where people seem to be aware of HIPAA, but not what it does and does not cover. So I was hoping you could help clarify that.
Dr. Caplan: Well, HIPAA is the Health Insurance Portability and Accountability Act, and we hear about it a lot, but it is actually something that is already far, far out of date and doesn't give the protection that I think many people presume it does. HIPAA, in fact, get this, was designed before the internet took off. It was written for a world of paper records, paper charts. When I was back at the Columbia Med School, I remember going into the basement, and we sometimes couldn't find a chart for a patient, and they were at risk of dying because whoever it was in the record room couldn't locate the darn thing.
A very different era.
Almost Charles Dickens-like: pen, paper, big folders, that sort of stuff. HIPAA said you can only share information about a patient in the hospital if it relates to their care, so one doctor can tell another about your tests, or the doctor can tell a nurse to give you certain treatments as part of your care program. And the only other parties that can see your information, only in America, are insurance companies, because they bill. And so they want to know what you got and whether they approve it.
Whether they are going to reimburse it, or demand a co-pay, or whatever they're going to do. So even under HIPAA, non-medical people have the right to look at your information for insurance purposes. And the reason that's important is that it means your employer, through the health department, might get a peek, even under HIPAA, at the kinds of care you're receiving.
So it wasn't a very strong law.
It didn't really do a great job protecting us at the time it was enacted, back in 1996. But it only applies in hospitals, clinics, nursing homes, healthcare settings. If somebody is, I don't know, putting you in an experiment in a psych lab and watching your eye movements in response to pornography, HIPAA doesn't apply. It's not a hospital. It's not clinical care. They may have put certain conditions on the data use as part of your informed consent, but there's no HIPAA. So HIPAA is narrow and weak, and it needs to be updated.
Also, we don't have physical charts. We don't have things written down on paper anymore. Nobody uses that. It's digitized information flowing all over the place, shareable with private companies like Musk's, and not only insurance companies but companies that are trying to sell hospitals different programs to let them do a better job reading scans or doing pathology tests. It's not all bad, but it means that information is flying around at rates and speeds, and to parties, that no one anticipated when HIPAA went through.
BJ: And so I wonder if ... have you, or has the bioethics field, seen a discussion around software like MyChart and Phreesia? Because I feel like that's something that's been pushed lately by a lot of the hospital systems. I know in New York City, Columbia Presbyterian, for example, uses MyChart, but MyChart does collect your data and move it beyond the realm of what people might expect, thinking that HIPAA protects them.
So, is there a discussion that the bioethicists and the doctors have with the hospital systems about that software?
Dr. Caplan: There are, although they are very much institution specific. NYU, we have a whole ethics group talking all the time to our IT guys about when we need to disclose the presence, let's say, of digital information exchange. What third parties can and can't use. Again, it's not all bad. Look.
BJ: Yes.
Dr. Caplan: If I get X-rays or a CAT scan or something, AI scanners can do as good a job or better reading the scan than skilled human operators can. Why? They don't need sleep. They don't have a night out before they come to work. They don't have the flu and show up anyway, not feeling well. And the actual visual acuity of some of these scanning-type machines is better than the human eye. So you'll see breast cancer detected earlier by using AI-driven algorithms. That's great. But it does have all this downside. And I'm going to say there are institutions out there selling information to third parties saying, well, it's anonymous, and so there's no risk to the patient. But in fact, you can track back that information using all kinds of skilled detective measures if you wish.
Remember 23andMe, the company that kept saying, send us your spit, your DNA, and we'll tell you whether you're Lithuanian or Ethiopian? Well, it turned out they were selling all that information to third parties. They didn't actually care if you were Ethiopian. That was just bait to get you to send in your DNA, which they kept, analyzed, and resold. And they ultimately began to get sued, but not before many, many years of fruitful, profitable operation.
BJ: Right, and I have to clarify for listeners that I'm part of the class action settlement against 23andMe. I had my data stolen as one of the numerous Ashkenazi Jews whose data, for whatever reason, was either made available or accessed, and it's out there. So, just a disclosure for people listening.
Let me ask you, just in terms of solutions, do you find ... I'm going to bring two to you. So the first being, does a Medicare for all system sort of prevent some of the abuses that we've talked about?
Let me ask you, just in terms of solutions, do you find ... I'm going to bring two to you. So the first being, does a Medicare for all system sort of prevent some of the abuses that we've talked about?
Dr. Caplan: Partly, not totally. It helps. If you could at least standardize insurance payments, you know how many payers we have now in the US? Over 900.
BJ: Right.
Dr. Caplan: That's a lot of eyeballs looking at your data. If you could at least get it down to one, that means far fewer leaks, far fewer places that are having accidents and releasing information inadvertently.
So I do think it would reduce accidents, leaks, hacking, that sort of stuff, somewhat. So it would be helpful to have that. Having a standardized insurance form would help too. I can't believe we've never done this; it's crazy.
BJ: Yes. Absolutely.
Dr. Caplan: That would also reduce or help protect privacy because everybody would know what's in there, what isn't in there.
And we could argue about what should be in there, what shouldn't be in there, and who needs to record certain things when they're not needed and everybody would be using the same thing.
We spend billions on administrative costs in the way we run healthcare in the US, for lack of a standardized insurance form. So Medicare for all would help that a great deal. But only partly. There are still vulnerable situations.
I'll just give you an example. Let's say you go on telemedicine, and you decide you're going to get your mental health, or erectile dysfunction, or some ailment that you have, looked at by an avatar robot, and that company will then send you medicines or tell you what to do about your condition. Whether they're up to tough privacy standards, whether their security is good, whether, if there was a leak or a release or a theft, they would be liable for harm to you: not clear at all.
So big parts of the healthcare system still are vulnerable in terms of privacy, even if you helped reduce that vulnerability by at least simplifying the payer side.
BJ: Right. And so that's sort of the second solution of, is this something that requires, let's say, a constitutional amendment that gives people the right to privacy? Like, we need to go that far as to just, because it's not, you know, it's implied in the constitution, but it's not actually said that you have a right to privacy. So would that be part of the solution?
Dr. Caplan: Well, it would help. You know, privacy is a concept that was invented by a Supreme Court justice, not really legislated. Legislating it would help. I don't even know if you need it in the constitution. You could probably pass a kind of federal omnibus package of legislation to govern AI. The industry hates it, fights it all the time. They don't want to deal with the oversight and the regulation. There are many in our federal government who think regulating that way will slow down the speed at which AI is being introduced, hurt the economy, hurt jobs. The Europeans are a little more inclined to do what we're talking about, that is, regulate by legislation, but even there it's milquetoast. It's not strong stuff. It's pretty weak.
So between the lobbying and the laissez-faire attitude that says we can't get in the way of this, we don't want any ethicists or privacy purveyors hindering the business, we haven't seen much.
BJ: Right. In the time we have left, what can people listening to this discussion do to better protect their personal privacy when it comes to their health data, and the privacy of others whose health data is out there?
Dr. Caplan: Well, look, you should always be nervous if anyone says you're going to get care but you have to sign away your right to privacy. Those contracts are offered, and they should not be accepted. I've seen them in individual doctors' offices, a plastic surgeon saying: you get your care here, you waive your right to privacy. I can use your pictures. I can use the outcomes in my advertising and marketing, social media, whatever I want.
Just watch for those, what we call contracts of adhesion. They're making you agree to things that you should not be asked to agree to.
Secondly, don't use these online places that say, share intimate data with us and we'll give you information about your ancestors. I don't trust that stuff. I don't think it's well protected. I'm not even sure the information they share about your ethnicity and family history is all that accurate anyway, but that's for another day.
But I've never done it. I've steered clear, and I tell people: don't go there. Another thing to watch for is if you're in research, in a research protocol, say you have cancer or something and they want you to be in a study. Make them explain who gets to see the data.
Also make sure that you get to see the data. You have a right to your own data, and seeing it will help you know what might have leaked out or what might have fallen onto the eyeballs of others. So you gotta fight a little bit to assert your rights.
BJ: Right, I think that's well said.
DDG Live Read Script #4 - VPNs
Rosie: You know, it used to be we would recommend you use a Virtual Private Network, or VPN, only for specific circumstances.
Like say, if you can’t access certain content because of the state or country you live in.
But these days, we feel a VPN has gone from a nice-to-have to a must-have.
And while it’s true your circumstances may dictate what specific VPN you need, for most people in most cases, we recommend the one offered by DuckDuckGo.
Here are three quick reasons why:
First: This VPN is fast and secure. It connects to the server that’s nearest to you, so you should experience minimal lag.
And if you want to adjust what server you’re connected to, you can do that at any time.
Second: If your VPN connection gets interrupted, DuckDuckGo will attempt to reconnect automatically — all while preventing data leaks.
So, there’s no need to constantly monitor your VPN connection.
Finally, DuckDuckGo’s VPN is easy to install.
There’s no additional software required.
All you gotta do is sign up for DuckDuckGo’s subscription plan and install the VPN right within their browser.
From there, you can activate, or deactivate, your VPN with just one click.
Here’s an example of when you may need to do that:
Sometimes, you may encounter a website that doesn’t like to cooperate with a VPN. For example, some banking and government websites.
In those cases, your identity is already known to them, so there’s no reason to hide it. And using a VPN may keep those sites from working properly.
If you run into that problem, you can quickly deactivate DuckDuckGo’s VPN.
Do what you gotta do.
Then turn the VPN back on.
No harm. No foul.
Bonzo The Monkey: *Monkey Sound*
Rosie: (To Bonzo): I’ve seen the adult websites you visit Bonzo, you definitely want to use a VPN.
Bonzo The Monkey: *Monkey Sound*
Rosie: If you're like Bonzo, or if you want to protect yourself from fascists and weirdos, you can sign up for the DuckDuckGo subscription via the Settings menu in the DuckDuckGo browser, available on iOS, Android, Mac, and Windows.
Or via the DuckDuckGo subscription website: duckduckgo.com/subscriptions.
The DuckDuckGo subscription is currently available to residents of the U.S., U.K., E.U., and Canada.
Feature availability may vary by region.
So, make sure you check the website for further information.
Stupid Sexy Privacy Outro
Rosie: This episode of Stupid Sexy Privacy was recorded in Hollywood, California.
It was written by BJ Mendelson, produced by Andrew VanVoorhis, and hosted by me, Rosie Tran.
And of course, our program is sponsored by our friends at DuckDuckGo.
If you enjoy the show, I hope you’ll take a moment to leave us a review on Apple Podcasts, or wherever you may be listening.
This won’t take more than two minutes of your time, and leaving us a review will help other people find us.
We have a crazy goal of helping five percent of Americans get one percent better at protecting themselves from fascists and weirdos.
Your reviews can help us reach that goal, since leaving one makes our show easier to find.
So, please take a moment to leave us a review, and I’ll see you right back here next Thursday at midnight.