


How AI and big tech put trafficking survivors at risk [1]


Date: 2023-06

Preventing trafficking is one of Amazon’s justifications for doing this. The presenter praised Amazon’s partnership with the National Center for Missing and Exploited Children, and boasted about how easy it is for law enforcement to obtain video footage without a warrant from any neighbourhood where a kidnapping is suspected. Safety first, privacy second. There was no discussion of how easily such unregulated access could be abused without proper oversight. Nor was the underlying assumption of all these tools – that law enforcement is always part of the solution rather than the problem – ever questioned. This is despite data showing that law enforcement officers are 40% more likely to be domestic violence abusers than the general public, and that many officers have been accused or convicted of human trafficking themselves.

Artificial intelligence and machine learning were also celebrated as potential tools for catching suspected traffickers earlier. That these systems would eventually be able to do this was treated as a given: it was only a matter of time – and enough data. Yet human trafficking data is both limited and notoriously inaccurate. Bad data means bad learning, and although this wasn’t considered worthy of discussion at the summit, it can only lead to one of two outcomes. Either you end up with a system that is unhelpful at best and dangerous at worst because it was trained on inaccurate data. Or you use the problem of bad data as an excuse to collect more data. The pursuit of ‘safety’ trumps privacy once again.

The false messiah of Big Data

The tech industry is enamoured with large datasets. Preferring quantity of data over quality is in vogue, and the dangers of this trend are systematically downplayed by the world’s largest companies. This is why Google forced out Timnit Gebru, a renowned AI expert, after she criticised large, inscrutable datasets as having – among other problems – an ingrained potential for racist and sexist bias.

Similarly, nobody in this space seems to want to acknowledge that training AI on human trafficking data risks collecting data on survivors that can later be breached, misinterpreted, or sold for a profit. The most visible example of this happening is LexisNexis’s sale of immigrants’ information to the US’s Immigration and Customs Enforcement (ICE). At Amazon’s summit, one of the survivor panellists pointed out how data given to law enforcement had made their own family members more vulnerable to traffickers. In response, a director from one of the largest anti-trafficking NGOs in the world glibly said, “shit happens”.

---
[1] Url: https://www.opendemocracy.net/en/beyond-trafficking-and-slavery/how-big-tech-and-ai-are-putting-trafficking-survivors-at-risk/

Published and (C) by OpenDemocracy
Content appears here under this condition or license: Creative Commons CC BY-ND 4.0.
