(C) Alec Muffett's DropSafe blog.
Author Name: Alec Muffett
This story was originally published on alecmuffett.com. [1]
License: CC-BY-SA 3.0.[2]


Matt Green breaks down (ahem) popular misrepresentation of “homomorphic encryption” (i.e. cleartext) CSAM detection that Thorn espouse; HT Douwe Korff for tl;dr

2024-01-25 20:30:00+00:00

This short thread is a blinder; linked below as best I can, because NewTwitter no longer permits serious unrolls.

For those unfamiliar, Thorn is a noted, Ashton Kutcher-founded critic of platforms that enable people to have secure, private messenger conversations, on the basis that permitting people to have privacy leads to child abuse with (presumably) no positive outcomes.

They pitch their “solution” thusly:

Thorn’s CSAM Classifier is an incredible machine learning-based tool that can find new or unknown CSAM in both images and videos. When potential CSAM is flagged for moderator review and the moderator confirms if it is or is not CSAM, the classifier learns. It continually improves from this feedback loop so it can get even smarter at detecting new material. https://www.thorn.org/blog/how-thorns-csam-classifier-uses-artificial-intelligence-to-build-a-safer-internet/

Short version (the long version is in this posting)

They are proposing to oblige/tell Signal how to write software (see diagram)

They are talking about using (ostensibly privacy-preserving) homomorphic encryption, but not actually using it in any way that would preserve privacy

Doing so is not exactly honest
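For contrast with what Thorn is apparently doing, here is a toy sketch of what genuinely homomorphic encryption looks like — a Paillier-style scheme where the server can compute on ciphertexts without ever seeing the plaintexts. This is purely illustrative (tiny fixed primes, not Thorn's system, not secure), just to show what "privacy-preserving" would actually mean:

```python
# Toy Paillier-style additively homomorphic encryption.
# Illustration only: tiny fixed primes, NOT secure, NOT Thorn's system.
# The point: the server can combine ciphertexts without seeing plaintext.
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

# Toy key generation (insecure parameters, for illustration)
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                                    # standard simple choice of generator
lam = lcm(p - 1, q - 1)                      # private key
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)  # precomputed decryption factor

def encrypt(m):
    """Encrypt m under the public key (n, g) with fresh randomness r."""
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Decrypt with the private key (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts ADDS the plaintexts,
# so a server could sum encrypted values it cannot read.
c = (encrypt(17) * encrypt(25)) % n2
assert decrypt(c) == 42
```

A scheme like this is what the phrase "homomorphic encryption" promises; it is also why it is expensive (each operation is big-number modular arithmetic), which per Matt's thread below is apparently why Thorn's pipeline falls back to sending plaintext.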

Quoth Matt

Interesting article on the CSAM detection software offered by Thorn. Despite public claims to the contrary, it apparently requires sending plaintext to servers, since the use of homomorphic encryption is too expensive. https://t.co/QyfvtZjuRu pic.twitter.com/pxyiECnfrM — Matthew Green (@matthew_d_green) January 25, 2024

Final note from the documents. This one, also from Thorn, seems to handwave at the idea of using encrypted content detection technologies for other types of crime beyond CSAM. I mean, why not? https://t.co/t9OCUpjm63 pic.twitter.com/cFZKpOdNWm — Matthew Green (@matthew_d_green) January 25, 2024
[END]

[1] URL: https://alecmuffett.com/article/108995
[2] URL: https://creativecommons.org/licenses/by-sa/3.0/

DropSafe Blog via Magical.Fish Gopher News Feeds:
gopher://magical.fish/1/feeds/news/alecmuffett/