Subj : Using ChatGPT as therapist
To   : All
From : Mike Powell
Date : Mon Jul 28 2025 07:55 am

"We haven't figured that out yet": Sam Altman explains why using ChatGPT as
your therapist is still a privacy nightmare

Date:
Mon, 28 Jul 2025 11:22:17 +0000

Description:
Seeking therapy from an AI like ChatGPT could come with a lot of risks, says
OpenAI CEO Sam Altman.

FULL STORY

One of the upshots of having an artificial intelligence (AI) assistant like
ChatGPT everywhere you go is that people start leaning on it for things it
was never meant for. According to OpenAI CEO Sam Altman, that includes
therapy and personal life advice, but it could lead to all manner of privacy
problems in the future.

On a recent episode of the This Past Weekend w/ Theo Von podcast, Altman
explained one major difference between speaking to a human therapist and
using an AI for mental health support: "Right now, if you talk to a
therapist or a lawyer or a doctor about those problems, there's legal
privilege for it. There's doctor-patient confidentiality, there's legal
confidentiality, whatever. And we haven't figured that out yet for when you
talk to ChatGPT."

One potential consequence is that OpenAI would be legally required to cough
up those conversations were it to face a lawsuit, Altman claimed. Without
the legal confidentiality you get when speaking to a doctor or a registered
therapist, there would be relatively little to stop your private worries
from being aired in public.

Altman added that ChatGPT is being used in this way by many users, especially
young people, who might be particularly vulnerable to that kind of exposure.
But regardless of your age, the conversation topics are not the type of
content that most people would be happy to see revealed to the wider world.

A risky endeavor

The possibility of having your private conversations opened up to scrutiny
is just one of the privacy risks facing ChatGPT users.

There is also the issue of feeding your deeply personal worries and concerns
into an opaque algorithm like ChatGPT's, with the possibility that they
might be used to train OpenAI's models and leak their way back out when
other users ask similar questions.

That's one reason why many companies have licensed their own ring-fenced
versions of AI chatbots. Another alternative is an AI like Lumo, which is
built by privacy stalwart Proton and features top-level encryption to
protect everything you write.

Of course, there's also the question of whether an AI like ChatGPT can
replace a therapist in the first place. While there might be some benefits
to this, any AI is simply regurgitating the data it is trained on. None are
capable of original thought, which limits the effectiveness of the advice
they can give you.

Whether or not you choose to open up to OpenAI, it's clear that there's a
privacy minefield surrounding AI chatbots, whether that means a lack of
confidentiality or the danger of having your deepest thoughts used as
training data for an inscrutable algorithm.

It's going to require a lot of effort and clarity before enlisting an AI
therapist becomes a significantly less risky endeavor.

======================================================================
Link to news story:
https://www.techradar.com/ai-platforms-assistants/chatgpt/we-havent-figured-th
at-out-yet-sam-altman-explains-why-using-chatgpt-as-your-therapist-is-still-a-
privacy-nightmare

$$
--- SBBSecho 3.28-Linux
* Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)