Predictive policing algorithms are racist. They need to be dismantled. [1]
By Will Douglas Heaven
Date: 2020-07-17
Students told reporters that police hit them with batons, threw them on the floor, and pushed them up against walls. The police claimed they were the ones getting attacked—“with water bottles, soda pops, milk, and so on”—and called for emergency backup. Around 25 students were arrested, and many were charged with multiple crimes, including resisting arrest with violence. Milner remembers watching on TV and seeing kids she’d gone to elementary school with being taken into custody. “It was so crazy,” she says.
"There's a long history of data being weaponized against Black communities."
For Milner, the events of that day and the long-term implications for those arrested were pivotal. Soon after, while still at school, she got involved with data-based activism, documenting fellow students’ experiences of racist policing. She is now the director of Data for Black Lives, a grassroots digital rights organization she cofounded in 2017. What she learned as a teenager pushed her into a life of fighting back against bias in the criminal justice system and dismantling what she calls the school-to-prison pipeline. “There’s a long history of data being weaponized against Black communities,” she says.
Inequality and the misuses of police power don’t just play out on the streets or during school riots. For Milner and other activists, the focus is now on where there is most potential for long-lasting damage: predictive policing tools and the abuse of data by police forces. A number of studies have shown that these tools perpetuate systemic racism, and yet we still know very little about how they work, who is using them, and for what purpose. All of this needs to change before a proper reckoning can take place. Luckily, the tide may be turning.
There are two broad types of predictive policing tool. Location-based algorithms draw on links between places, events, and historical crime rates to predict where and when crimes are more likely to happen—for example, in certain weather conditions or at large sporting events. The tools identify hot spots, and the police plan patrols around these tip-offs. One of the most common, called PredPol, which is used by dozens of cities in the US, breaks locations up into 500-by-500-foot blocks and updates its predictions throughout the day—a kind of crime weather forecast.
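To make the mechanism concrete, here is a minimal sketch (in Python) of the general grid-and-rank idea. It is not PredPol's actual model, whose internals are proprietary; only the 500-foot cell size is taken from the description above, and the incident coordinates and helper names are invented for illustration.

# Not PredPol's proprietary model: just the general grid-and-rank idea,
# with a hypothetical incident log and the 500-foot cell size described above.
from collections import Counter

CELL_FEET = 500  # block size mentioned in the article

def cell_for(x_feet: float, y_feet: float) -> tuple[int, int]:
    """Map a location (feet on a local planar grid) to a grid-cell index."""
    return (int(x_feet // CELL_FEET), int(y_feet // CELL_FEET))

def rank_hot_spots(incidents, top_k=3):
    """Count historical incidents per cell and return the busiest cells."""
    counts = Counter(cell_for(x, y) for x, y in incidents)
    return counts.most_common(top_k)

# Toy usage: three past incidents share one cell, one sits elsewhere.
incidents = [(120, 430), (180, 460), (90, 410), (1700, 2300)]
print(rank_hot_spots(incidents))  # -> [((0, 0), 3), ((3, 4), 1)]

Commercial systems layer more elaborate statistics on top and, as described above, refresh the ranking throughout the day, but the output is the same in kind: a ranked list of cells for patrols to visit.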
Other tools draw on data about people, such as their age, gender, marital status, history of substance abuse, and criminal record, to predict who has a high chance of being involved in future criminal activity. These person-based tools can be used either by police, to intervene before a crime takes place, or by courts, to determine during pretrial hearings or sentencing whether someone who has been arrested is likely to reoffend. For example, a tool called COMPAS, used in many jurisdictions to help make decisions about pretrial release and sentencing, issues a statistical score between 1 and 10 to quantify how likely a person is to be rearrested if released.
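Purely for illustration, a person-based score of that shape might look like the following. COMPAS's real features and weights are not public, so everything here, the feature names, the weights, and the logistic form, is made up; the only detail carried over from the article is that the output lands on a 1-to-10 scale.

import math

# Illustrative only: COMPAS is proprietary, so the features, weights, and
# logistic form below are invented. The sketch just shows the general shape
# of a person-based risk score that gets bucketed into a 1-10 scale.
WEIGHTS = {"age": -0.03, "prior_arrests": 0.25, "substance_abuse": 0.6}
BIAS = -0.5

def risk_score(person: dict) -> int:
    """Return a 1-10 score from a toy logistic model over made-up features."""
    z = BIAS + sum(WEIGHTS[k] * person[k] for k in WEIGHTS)
    p = 1.0 / (1.0 + math.exp(-z))             # probability-like value in (0, 1)
    return min(10, max(1, math.ceil(p * 10)))  # bucket into a 1-10 score

print(risk_score({"age": 22, "prior_arrests": 3, "substance_abuse": 1}))  # 6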
The problem lies with the data the algorithms feed upon. For one thing, predictive algorithms are easily skewed by arrest rates. According to US Department of Justice figures, you are more than twice as likely to be arrested if you are Black than if you are white. A Black person is five times as likely to be stopped without just cause as a white person. The mass arrest at Edison Senior High was just one example of the kind of disproportionate police response that is not uncommon in Black communities.
The kids Milner watched being arrested were being set up for a lifetime of biased assessment because of that arrest record. But it wasn’t just their own lives that were affected that day. The data generated by their arrests would have been fed into algorithms that would go on to disproportionately target all the young Black people they assessed. Though by law the algorithms do not use race as a predictor, other variables, such as socioeconomic background, education, and zip code, act as proxies. Even without explicitly considering race, these tools are racist.
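A toy example makes the proxy effect easy to see. In the synthetic records below, race never appears as a feature, yet a model that does nothing more than average arrest labels by zip code reproduces the skew that heavier policing wrote into the data. The zip codes and counts are placeholders, not real statistics.

# Entirely synthetic illustration of the proxy effect: race is never a
# feature, yet averaging biased arrest labels by zip code reproduces the
# skew that heavier policing wrote into the data. All values are made up.
from collections import defaultdict

records = (
    [("33147", 1)] * 8 + [("33147", 0)] * 2    # heavily patrolled area
    + [("33156", 1)] * 2 + [("33156", 0)] * 8  # lightly patrolled area
)

def fit_zip_rates(data):
    """Average the arrest label per zip code; note race is never an input."""
    totals, counts = defaultdict(float), defaultdict(int)
    for zip_code, label in data:
        totals[zip_code] += label
        counts[zip_code] += 1
    return {z: totals[z] / counts[z] for z in totals}

print(fit_zip_rates(records))  # {'33147': 0.8, '33156': 0.2}

Swap zip code for education or socioeconomic markers and the outcome is the same: the model never “sees” race, but it learns it anyway.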
That’s why, for many, the very concept of predictive policing itself is the problem. The writer and academic Dorothy Roberts, who studies law and social rights at the University of Pennsylvania, put it well in an online panel discussion in June. “Racism has always been about predicting, about making certain racial groups seem as if they are predisposed to do bad things and therefore justify controlling them,” she said.
[END]
---
[1] https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/