
ChatGPT doesn’t offer doctor-patient confidentiality

Sam Altman says it's "very screwed up" and that AI companies should offer the same kind of privacy users get from real professionals.

Image: ChatGPT on a laptop (Pexels)


If you’re using ChatGPT as a kind of therapist or emotional support tool, OpenAI’s CEO Sam Altman suggests you might want to think twice, especially when it comes to sharing personal or sensitive information.

In a recent interview on the This Past Weekend podcast with Theo Von, Altman explained that AI apps like ChatGPT don't offer the same legal privacy protections you'd get with a real therapist, doctor, or lawyer.

For example, when you talk to a therapist, your conversations are legally protected by “doctor-patient confidentiality.” 

That means your private details can’t just be shared. But if you share the same information with ChatGPT, there’s no such protection in place, at least not yet.

Altman noted that many people, especially younger users, turn to ChatGPT for help with personal issues like relationships or mental health, essentially using it as a life coach or therapist. 

The problem is that, legally, conversations with AI don't qualify for any kind of confidentiality, so if there's ever a lawsuit, OpenAI could be required to hand those chats over in court.

He called this situation “very screwed up” and said AI companies should offer the same kind of privacy users get from real professionals. (Via: TechCrunch)

But the legal system hasn’t caught up with how people are actually using AI today.

OpenAI is already fighting a court order in its legal battle with The New York Times that would require it to preserve and potentially hand over millions of user conversations, excluding those from paid business accounts using ChatGPT Enterprise.

The company calls the request an “overreach” and says it’s appealing the decision. 

This privacy issue matters more now than ever. 

After the US Supreme Court overturned Roe v. Wade, for example, people became more cautious about sharing private health data, switching to apps that offer better privacy protections.

Altman said it’s totally fair for people to want clear legal privacy rules before trusting AI with their most personal thoughts, and even the podcast host admitted that’s why he avoids using ChatGPT much.

Do you think AI companies like OpenAI should offer the same confidentiality protections as doctors and therapists? Or should users just avoid sharing personal information with AI until better privacy laws exist? Tell us below in the comments, or reach us via our Twitter or Facebook.


Ronil is a Computer Engineer by education and a consumer technology writer by choice. Over the course of his professional career, his work has appeared in reputable publications like MakeUseOf, TechJunkie, GreenBot, and many more. When not working, you’ll find him at the gym breaking a new PR.
