


The Immorality of Making Imitative AI [1]

This content is not subject to review by Daily Kos staff prior to publication.

Date: 2025-02-26

Yeah, a bit hyperbolic, but it fits, I think.

Recently, it came out that Meta trained its imitative AI on a database of pirated books. Now, these are not books that are legally available online for other uses; these are pirated books. Every book in this collection represents at least one lost sale for an author. There is absolutely no way to justify this as fair use. Meta knew this at the time, and they went ahead and used the database anyway.

There is a reason that receiving stolen goods is a crime in most jurisdictions: by allowing people to profit from things acquired illegally, you incentivize theft. And that is what Meta did here; they used stolen material. Please do not waste your breath arguing with me that “it’s not theft since the cost of producing a new copy is almost zero.” That was nonsense in the Napster days, and it is nonsense now. Each and every one of those pirated books represents real effort by its author, and authors should be rewarded for that effort so that art and research continue to be produced. If you don’t want to pay for something, or cannot, go to the library.

Information doesn’t want to be free — it wants to be paid so it can continue to exist. Welcome to our capitalist hellscape. Wake me when you are ready to go full Space Communism, but until then, do not whine about people trying to make a living providing us art and research.

Meta, obviously, does not care whether art and research continue to exist. If they can profit in the short term, they will. Even the fair use argument falters. As a court recently ruled, training material taken from news organizations, writers, and researchers, even when scraped from legal sources, cannot be fair use, because it is used to create material designed to compete with or replace the original work in the marketplace. And imitative AI companies know this.

From the start, OpenAI has claimed that it needs to take material for training, and that it cannot be profitable if it has to pay for the material it uses. They recently got the UK government to propose changing its copyright rules so that material can be taken unless artists go through an onerous process to opt out. Even Musk’s recent “send me five things you did” demand to federal workers appears to be an attempt to feed information about government jobs into an AI.

If that is not the perfect analogy for imitative AI, I don’t know what is. A person with no legal or moral justification comes forward and demands that people stop their productive work and instead produce information that he will use to build an imitative AI that makes him money, one that he will then try to use to replace you. Oh, and he threatens to fire you if you do not comply. Truly, the morals of imitative AI in one handy example.

Being ethical in late capitalism is hard. Whatever choices you make, chances are the system is set up so that someone is going to get hurt. Imitative AI is one of the easier situations, however. Not only is its environmental impact immense, not only are the products largely unsuited for widespread adoption, but they actively take from real people. Not using imitative AI when you have the choice to avoid it is a fairly straightforward good.

And yes, the press and the tech bros would have you believe that imitative AI is inevitable, that if you do not give in you will be left behind. It’s just not true. Nothing is inevitable. Remember 3D televisions? For a couple of years, every TV was going to be a 3D TV. Except it turned out that 3D television is a bit crap, and no one wanted to pay for it. No more 3D televisions. Something similar can happen with imitative AI. It’s kind of crap and not worth paying for, either in cash or in the moral compromises needed to justify its use. If people keep those simple facts in mind, we can avoid spending the rest of our lives watching the equivalent of crappy 3D.

Okay, the metaphor is a wee bit strained, but just remember: nothing is inevitable, and our moral choices can and do matter. We don’t need to reward the people who would steal from the rest of us to make themselves a little bit richer.

[END]
---
[1] Url: https://www.dailykos.com/stories/2025/2/26/2306131/-The-Immorality-of-Making-Imitative-AI?pm_campaign=front_page&pm_source=more_community&pm_medium=web

Published and (C) by Daily Kos
Content appears here under this condition or license: Site content may be used for any purpose without permission unless otherwise specified.
