Subj : Re: ChatGPT Writing
To : jimmylogan
From : Nightfox
Date : Tue Dec 02 2025 12:49 pm
Re: Re: ChatGPT Writing
By: jimmylogan to Nightfox on Tue Dec 02 2025 11:45 am
Ni>> One thing I've seen it quite a bit with is when asking ChatGPT to make a
Ni>> JavaScript function or something else about Synchronet.. ChatGPT doesn't
Ni>> know much about Synchronet, but it will go ahead and make up something it
Ni>> thinks will work with Synchronet, but might be very wrong. We had seen
Ni>> that quite a bit with the Chad Jipiti thing that was posting on Dove-Net
Ni>> a while ago.
ji> Yeah, I've been given script or instructions that are outdated, or just
ji> flat out WRONG - but I don't think that's the same as AI hallucinations...
ji> :-)
ji> Maybe my definition of 'made up data' is different. :-)
What do you think an AI hallucination is?
An AI confidently writing things that are wrong is pretty much the textbook definition of an AI hallucination.
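As a concrete illustration (a made-up sketch, not code from the thread): Synchronet's JavaScript model really does provide a console.print() method, but a model that doesn't know the API will happily invent plausible-sounding methods that don't exist. The stub object below stands in for Synchronet's console so the snippet runs anywhere, and sendInstantMessage() is deliberately fictional:

```javascript
// Hypothetical sketch: how a hallucinated API call reveals itself.
// sbbsConsole is a stub standing in for Synchronet's real "console"
// object (which does have a print() method), so this runs in plain Node.
const sbbsConsole = { print: (s) => process.stdout.write(s) };

// A real call: console.print() exists in Synchronet's JS model.
sbbsConsole.print("Welcome!\r\n");

// A hallucinated call: sendInstantMessage() is invented here on purpose.
// Because the method doesn't exist, it throws at runtime -- which is
// exactly how made-up API calls in generated scripts usually surface.
try {
  sbbsConsole.sendInstantMessage("sysop", "Hello from my script!");
} catch (e) {
  sbbsConsole.print("Hallucinated method failed: " + e.name + "\r\n");
}
```

The code "looks right" at a glance, which is why this kind of hallucination slips past people who can't check it against the actual docs.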
https://www.ibm.com/think/topics/ai-hallucinations
"AI hallucination is a phenomenon where, in a large language model (LLM) often a generative AI chatbot or computer vision tool, perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate."
Nightfox
---
þ Synchronet þ Digital Distortion: digitaldistortionbbs.com