Kafka Made From Silicon: Algorithm Banning Parole [1]
Date: 2025-04-14
The single most important quote in all of human history is probably the one most ignored: “A computer can never be held accountable, therefore a computer must never make a management decision.” One would think that this would be self-evident. Computer programs, even ones that “learn,” are the creations of human beings. They embody the choices, prejudices, assumptions, and simplifications of their programmers. They are no more purely objective than an NHL referee. And yet, people in power keep hiding behind them to make decisions. The latest example is the Governor of Louisiana.
ProPublica, perhaps the best news organization in the country (and one you really should subscribe to), has a disturbing new story about TIGER, an algorithmic system that is being used to deny people parole. Read the article for all of the ugly details, but the gist is this: the algorithm looks not at the behavior of the person since arrest but at immutable factors. As a result, people who do everything they are asked to do by the authorities are being denied even the possibility of parole. Get off drugs, learn a trade, behave perfectly? Fuck you — the computer says you are bad, therefore you must rot in prison. And the kicker? The program was never designed to make those decisions.
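To make the problem concrete, here is a minimal, purely hypothetical sketch of what a static-factor risk score looks like. The factor names, weights, and threshold below are invented for illustration; they are not TIGER’s actual inputs or formula. The point is structural: when a model scores only immutable facts, nothing a person does after arrest can ever change the answer.

    # Hypothetical static-factor risk score (illustration only).
    # Factor names and weights are invented; they are NOT TIGER's
    # actual inputs or coefficients.
    STATIC_WEIGHTS = {
        "age_at_first_arrest": -0.2,           # fixed at first arrest
        "prior_convictions": 2.0,              # fixed at sentencing
        "employment_gaps_before_prison": 1.5,  # fixed before incarceration
    }

    def risk_score(person: dict) -> float:
        """Score computed only from immutable, pre-incarceration facts."""
        return sum(w * person[factor] for factor, w in STATIC_WEIGHTS.items())

    def parole_recommendation(person: dict, threshold: float = 5.0) -> str:
        # Note what is missing: no inputs for sobriety, job training,
        # disciplinary record, or anything else done since arrest.
        return "deny" if risk_score(person) > threshold else "consider"

    person = {"age_at_first_arrest": 17,
              "prior_convictions": 3,
              "employment_gaps_before_prison": 2}

    before = parole_recommendation(person)
    # Years pass: the person gets sober, learns a trade, keeps a
    # spotless record. None of that appears in the inputs.
    after = parole_recommendation(person)
    assert before == after == "deny"   # the output can never change

Every input is frozen in the past by construction, so rehabilitation is literally not a variable such a model can see.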
TIGER was originally developed to point prison officials toward the specific interventions people would need in order to rehabilitate. But that word, rehabilitate, is the problem. The Governor of Louisiana doesn’t think people can be rehabilitated. Like a lot of authoritarians, he sees prison as a place to torture people, to make them pay for what they have done. Never mind the harm it does to society to create crime incubators by giving up on rehabilitation. Never mind that countries with better prison conditions and lower average incarceration times have better outcomes. All that matters is the bloodlust.
And I do not want to hear that I do not understand being a victim of a crime. I have been held up at gunpoint two and a half times, had three separate cars broken into, one house broken into, a young man shot in front of my house over a damn baseball cap, and been assaulted several times (to be fair, in many of those cases, I was probably asking for it). In only one case did the police even catch the perpetrator, and in only four did they even bother to come out and take a report. I have lived a life intimately familiar with crime, unfortunately, and I am not out for blood. I am out for a system that lowers the chances of people becoming victims of crime without throwing away lives. I do not think that is served by keeping a blind seventy-year-old, who has a place to live and a marketable skill, in prison.
The algorithm is being used to make decisions it was never designed to make. And that is the point. The Governor found a tool to get what he wants. He passes responsibility for his policies and decisions off onto the algorithm, claiming it is fair and objective. His is the only state letting a machine decide the fate of people this way. And if the algorithm disproportionately affects the poor and minorities? Well, how is that his problem? The algorithm is “efficient,” in the words of one of its supporters, and was needed to clear the backlog in parole hearings. A backlog, you will be shocked to learn, that did not exist.
There are two primary lessons, I think. The first, of course, is that no one should work on these systems if there is a chance they could be used by red states to harm human beings. TIGER started out with great intentions that have been perverted by the State of Louisiana. But these systems are ripe for abuse. Anyone who works on them should think long and hard about how to structure the systems and contracts to prevent that abuse. And if you work for a firm that refuses to do so, you should find another job.
Second, it is clear that bad actors intend to use these systems as substitutes for accountability. Who is responsible for the harm done to the people who should have received parole? Who is responsible to the families of these people, and to the society at large that these decisions are making less safe? Not the Governor of Louisiana, no siree. He used an algorithm — the machine made the decision. And don’t we all know that machines are good and efficient and objective and never make mistakes? Don’t blame me for what the machine shows. You should have been born whiter, or richer, or never have gotten caught in the first place.
Kafka would weep to see the silicon maze we have built for ourselves, and how happily we run it.
---
[1] https://www.dailykos.com/stories/2025/4/14/2316224/-Kafka-Made-From-Silicon-Algorithm-Banning-Parole?pm_campaign=front_page&pm_source=more_community&pm_medium=web