(C) Daily Kos
This story was originally published by Daily Kos and is unaltered.
Star Trek open thread: LCARS and ChatGPT [1]
This content is not subject to review by Daily Kos staff prior to publication.
Date: 2025-09-03
Star Trek predicted ChatGPT: the potential, the limitations, the over-reliance on it and the backlash to the over-reliance.
The way some people talk about ChatGPT, it sounds like it can solve all of your problems if you can just think of the right prompt. But if you’re writing Star Trek fanfiction, do you treat LCARS as if it were that idealized manifestation of ChatGPT?
The acronym LCARS stands for Library Computer Access/Retrieval System. It was first used on Star Trek: The Next Generation; the computers on the original series are obviously not LCARS.
To be very clear about this very basic point: LCARS is fictional; it comes from Star Trek. ChatGPT is a real thing, though some of the things it has been said to be capable of doing are entirely fictional.
In the original series episode “Court Martial,” Captain Kirk (William Shatner) faces the prospect of a court-martial, so he seeks out Samuel T. Cogley (Elisha Cook), a famous lawyer.
COGLEY: You Kirk?
KIRK: Yes. What is all this? [Kirk is referring to a bunch of books Cogley has strewn around Kirk’s quarters at the starbase]
COGLEY: I figure we'll be spending some time together, so I moved in.
KIRK: I hope I'm not crowding you.
COGLEY: What's the matter? Don't you like books?
KIRK: Oh, I like them fine, but a computer takes less space.
COGLEY: A computer, huh? I got one of these in my office. Contains all the precedents. The synthesis of all the great legal decisions written throughout time. I never use it.
KIRK: Why not?
COGLEY: I've got my own system. Books, young man, books. Thousands of them. If time wasn't so important, I'd show you something. My library. Thousands of books.
KIRK: And what would be the point?
COGLEY: This is where the law is. Not in that homogenized, pasteurized synthesizer. Do you want to know the law, the ancient concepts in their own language? Learn the intent of the men who wrote them, from Moses to the tribunal of Alpha 3. Books.
KIRK: You have to be either an obsessive crackpot who's escaped from his keeper or Samuel T. Cogley, attorney at law.
COGLEY: Right on both counts. Need a lawyer?
KIRK: I'm afraid so.
This suggests that the computer doesn’t just store all those legal precedents, as LexisNexis did before our current A.I. craze, but also processes them in some way, synthesizing them.
The first time I watched this episode, I was a little skeptical of Cogley preferring paper books. But then, as I struggled to remember the title of a Star Trek: The Next Generation episode in which Lt. Commander LaForge (LeVar Burton) tries to narrow down a list of substances the Enterprise’s internal sensors don’t scan for, I found myself wishing I still had my copies of Phil Farrand’s Nitpicker Guides to Star Trek.
My Google searches were made all the more annoying by Google’s intrusive A.I., which kept trying to make sense of my seemingly random search terms with unhelpful suppositions. Maybe I was thinking of an episode with Lwaxana Troi? Having to scroll down the page to ignore the Google A.I. slop was very distracting.
However, when I managed to catch the title “Hollow Pursuits” in the A.I. slop, I realized that was indeed the episode I was looking for. In that episode, as Lt. Reginald Barclay (Dwight Schultz) retreats into a holodeck fantasy, a series of mysterious malfunctions in tenuously related systems threatens to tear the ship apart.
LAFORGE: None of the systems involved interact directly with each other. I don't see anything in common.
BARCLAY: What if, what if, what if one of us is the connection?
DUFFY: Us? How?
BARCLAY: I don't know, but we're looking for a systemic explanation and there isn't one. We work with all the systems that are affected. What if we're transmitting something ourselves by touching it, or something.
WESLEY: The computer sensors would've picked up anything dangerous.
BARCLAY: But if it were something, something that we couldn't scan, you might've passed it to the injectors when you were realigning the magnetic capacitors.
LAFORGE: It was your glass, Duffy and both of you were present in the cargo bay when the anti-grav failed.
DUFFY: So was O'Brien.
WESLEY: The transporter malfunction. That's a connection too.
COMPUTER: Danger. Approaching safety limits of engine containment field.
LAFORGE: Computer, list all physical substances that wouldn't normally be picked up by an internal scan.
COMPUTER: There are fifteen thousand five hundred twenty five known substances that cannot be detected by standard scans.
LAFORGE: Great. And how many of those can exist in an oxygen atmosphere?
COMPUTER: Five hundred thirty two.
LAFORGE: And could alter molecular structure when it comes in contact with glass.
COMPUTER: Five.
LAFORGE: On screen at this station. Duffy.
BARCLAY: Jakmanite has a half life of fifteen seconds. There wouldn't be enough time to spread it around the ship.
LAFORGE: Right.
WESLEY: Selgninaem and lucovexitrin are highly toxic.
LAFORGE: Yeah, we'd all be dead by now. That leaves saltzgadum and invidium, neither of which has been used for decades.
WESLEY: Could either one of them cause all these malfunctions?
DUFFY: Most of the affected systems weren't even invented when those substances were in use. Who knows what could happen with a transporter or a magnetic capacitor?
LAFORGE: Wait a minute, wasn't invidium used in medical containment fields?
WESLEY: Not for over a century.
BARCLAY: The Mikulaks might still be using it.
DUFFY: And one of those canisters was broken.
LAFORGE: La Forge to Bridge. We have a working theory, Captain. There's a good possibility we picked up some Invidium from a broken canister in the cargo bay.
So you see, the computer narrows down thousands of culprits to just five substances, but it is the humans, with their human knowledge, who finally figure out what the problem is.
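The narrowing LaForge performs is just successive filtering: each question adds one more predicate, shrinking the candidate set until humans can reason about what remains. Here is a minimal sketch of that pattern; the substance names match the episode, but every property value is invented for illustration, since the show never specifies them.

```python
# Successive filtering, in the style of LaForge's queries to the computer.
# Each step applies one more predicate to the remaining candidates.
# All property values below are invented for illustration.

substances = [
    {"name": "jakmanite",    "oxygen_stable": True,  "alters_glass": True},
    {"name": "selgninaem",   "oxygen_stable": True,  "alters_glass": True},
    {"name": "lucovexitrin", "oxygen_stable": True,  "alters_glass": True},
    {"name": "saltzgadum",   "oxygen_stable": True,  "alters_glass": True},
    {"name": "invidium",     "oxygen_stable": True,  "alters_glass": True},
    {"name": "example-x",    "oxygen_stable": False, "alters_glass": True},
    {"name": "example-y",    "oxygen_stable": True,  "alters_glass": False},
]

# "List all physical substances that wouldn't normally be picked up..."
candidates = substances

# "How many of those can exist in an oxygen atmosphere?"
candidates = [s for s in candidates if s["oxygen_stable"]]

# "...and could alter molecular structure when it comes in contact with glass."
candidates = [s for s in candidates if s["alters_glass"]]

print(len(candidates), sorted(s["name"] for s in candidates))
```

The computer's role ends where the predicates do; ruling out jakmanite's half-life or spotting the Mikulak canisters is the human step the episode dramatizes.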
The open thread question: Do people have a more realistic expectation of what can be asked of LCARS (which is fictional) as opposed to what can be asked of ChatGPT (which is real)?
[END]
---
[1] Url:
https://www.dailykos.com/stories/2025/9/3/2335203/-Star-Trek-open-thread-LCARS-and-ChatGPT?pm_campaign=front_page&pm_source=latest_community&pm_medium=web
Published and (C) by Daily Kos
Content appears here under this condition or license: Site content may be used for any purpose without permission unless otherwise specified.