The Supreme Court has laid the path by which social media can become terrorist launderers [1]
Date: 2023-05-19
The Supreme Court’s ruling yesterday in Twitter v. Taamneh, which shields social media sites from liability for terrorist acts promoted on those sites, has the potential to turn those sites into “terrorist launderers.” The Court ruled that social media owners do not aid and abet terrorist acts by spreading terrorist messages via their algorithms, because those algorithms are “agnostic.” In other words, since the algorithms link all content to all interested site members, the fact that some of that content promotes terrorist acts does not mean social media owners are promoting terrorism. As Justice Thomas wrote, “The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting.” Justice Thomas went so far as to compare postings inciting terrorism with postings of recipes. Thus, the Supreme Court has unwittingly created a legal means for laundering terrorist activity: simply post such communications on social media and they become the equivalent of sharing the perfect rice pudding recipe.
While one can and should debate whether social media sites should be required to self-censor, the Court’s opinion shows how ignorance of the artificial intelligence used by social media in this Age of Algorithms can unwittingly encourage the worst aspects of that AI. The Court was correct that social media algorithms treat all content the same. These algorithms try to match content with viewers, creating those infamous rabbit holes that produce more clicks and longer view times, and therefore more ad revenue for YouTube, Facebook, and their competitors. But that observation ignores the fact that the same algorithms can also, if designed and used properly, weed out false and violence-promoting postings, and thus prevent those susceptible to such messages from being encouraged to join in criminal behavior. By focusing only on the “agnostic” nature of social media algorithms, the Court has ignored the fact that terrorist communications are promoted and spread by social media in the same manner as cute kitten videos, but with far more heinous results.
Ironically, in its analysis of what “aiding and abetting” means with respect to social media postings, the Court failed to draw an obvious analogy. The Court’s opinion analyzed a case in which the girlfriend of a thief was found to have aided and abetted her partner by, among other acts, melting down stolen jewelry in a forge in her home. This act of destroying the stolen jewelry by turning it into simple ingots was, in that case, proof that the girlfriend had aided and abetted the thefts.
However, the Court has now opined that because social media algorithms “agnostically” treat all content the same, social media sites can profit from the promotion of terrorism as freely as from cute kitten videos. Social media AI thus strips away the legal ramifications of that terror through the alchemy of agnostic algorithms. In short, terrorist content can now be melted down into just another piece of social media content. But while the girlfriend aided and abetted jewelry theft by melting the heist into resaleable form, the Court has instead empowered social media companies to turn criminal communications into profit-making postings by way of agnostic algorithms, and concurrently ruled that doing so does not amount to aiding and abetting. The obvious analogy to the jewelry case was thus turned inside out by Justice Thomas’s legal reasoning.
Granted, there is a legitimate debate about whether social media sites should be expected to self-censor, and whether algorithms should be used to weed out troubling content merely because some find that content troubling. Unfortunately, the Supreme Court not only dodged this larger question but complicated it by oversimplifying how algorithms are used to promote social media content. The same AI that promotes terrorism on social media could be used to interrupt terrorist communications and other hate-mongering. But the Court, by ignorantly focusing on only one use of social media algorithms, has identified exactly how social media owners can spread terrorist communications for profit, using agnostic algorithms to convert that criminal activity into just another social media posting. The Supreme Court has now given its blessing for social media owners to become terrorist launderers.
---
[1] URL: https://www.dailykos.com/stories/2023/5/19/2170214/-The-Supreme-Court-has-laid-the-path-by-which-social-media-can-become-terrorist-launderers
Published and (C) by Daily Kos
Content appears here under this condition or license: Site content may be used for any purpose without permission unless otherwise specified.