(C) Daily Kos
This story was originally published by Daily Kos and is unaltered.
AI Copyright Rulings: Bad Analogies Lead to Bad Law [1]
Date: 2025-07-10
While I was away, two copyright rulings came down regarding the taking of material for use in AI training. From the perspective of artists, the rulings contained some good, some bad, and some items showing that judges are often susceptible to the bad analogies the tech world uses in its hype.
Before I get into the rulings, I want to stress that this kind of uncompensated, forceful rejection of the moral rights of creatives is morally wrong. I do not care if it is eventually ruled legal: taking material from other people in order to replace and profit from their work, especially when your system can and does output direct copies of that work, is wrong. If you do this, you are a bad person, you should feel shame, and no one should allow you to listen to anything other than Barry Manilow's greatest hits. Okay, maybe that last one is a bit extreme, but some punishment is in order.
With respect to the legal cases, two rulings came out. The first is the simplest: the judge tossed the case largely because the lawyers were garbage. According to the judge, the litigants did not even approach making an argument relevant to the case at hand, and he practically invited other authors to bring a better-argued case. Essentially, the judge said that the plaintiffs failed to build a legal case for economic harm, but that Meta getting off on this technicality should not be taken as a ruling that training with stolen material is fine. Bring me competent lawyers, the judge seems to say, and things could very well turn out differently.
The other case was much more favorable to the AI companies, but it hides a potential grenade for them. The judge in that case, in part because of a poor understanding encouraged by a poor analogy, ruled that training with purchased material is fair use: that it transforms the material and that the copying is necessary. And here is where the faulty analogies come into play.
The judge wrote: "But Authors' complaint is no different than it would be if they complained that training schoolchildren to write well would result in an explosion of competing works." This is silly. Children do not retain perfect copies of every work they have ever been exposed to, nor can they reproduce those works verbatim. And when children do create, even when they imitate, they add something unique drawn from their own personalities, which AI does not. What imitative AI does (a token calculation over perfectly recalled data) is not the same thing as what a person does: absorb information incompletely and add to it, when reproducing it, from a unique life and set of experiences. Pretending otherwise leads, as in this case, to a bad understanding of what is actually happening and, potentially, to bad law.
All is not roses for the AI firms, however. Judge Alsup made it clear, to the point of scheduling a trial, that using pirated works for training was likely a violation. Buy one book and you are fine. Pirate a book and you are not. And these firms pirated a ton of material, which could prove a significant cost to them. He also noted that the plaintiffs did not make a serious claim about infringing copies: "If that were not so, this would be a different case. Authors remain free to bring that case in the future should such facts develop." So he, too, seems to be suggesting that they try again with competent lawyers, or at least a different legal theory.
The verdicts are mixed, and not the wholesale victory that many AI firms are touting them as. But it is disturbing that so much of the framing, in Alsup's ruling especially, mirrored that of the AI companies themselves. These tools do not learn in any meaningful sense (otherwise they would not lose to an Atari 2600 at chess), and they are not analogous to students learning to read. They are reproduction machines, as demonstrated by how often they plagiarize. We would be much better served if lawyers and imitative-AI safety and ethics proponents would learn to counter those narratives.
---
[1] https://www.dailykos.com/stories/2025/7/10/2332505/-AI-Copyright-Rulings-Bad-Analogies-Lead-to-Bad-Law?pm_campaign=front_page&pm_source=more_community&pm_medium=web