Subj : AI that seems conscious i
To : MIKE POWELL
From : Rob Mccart
Date : Sat Aug 23 2025 08:44 am
MP>AI that seems conscious is coming and that's a huge problem,
>says Microsoft AI's CEO
That reminded me of a story on the news the last few days.
A young woman (22) was using one of the AI systems to talk about
emotional problems she was having, tied to gender issues and a
recent breakup with a girlfriend. She was using the AI to get
advice on what to do, and later investigations showed that the
system (ChatGPT) just latched onto the negative feelings she was
expressing and basically told her she was right to feel that way,
which deepened her distress, and in the end the young woman
killed herself.
To be clear (as well as I can recall), the woman's girlfriend
was trying to apologize after a fight, and the woman wondered
if that was 'enough' after whatever happened between them, and
the AI came back, picking up on her mood, saying that it wasn't
enough and that she was right to feel betrayed and upset.
Of course, those who host the ChatGPT service said that it
is not a therapist and shouldn't be taken so seriously, but
there are apparently a lot of people, especially young
'unpopular' ones, who use an AI chat system as the only
'friend' they talk to, and many won't make a move without
consulting it first.
A glimpse of the future?
---
* SLMR Rob * Nothing is fool-proof to a talented fool
* Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)