(C) Alec Muffett's DropSafe blog.
Author Name: Alec Muffett
This story was originally published on alecmuffett.com. [1]
License: CC-BY-SA 3.0.[2]
Reactions and foreshadowings of #noplacetohide (“live” blog post, will update occasionally)
2022-01-17 22:42:29+00:00
My commentaries will be picked out in blue blocks like this; obviously I don’t necessarily agree with anything that is quoted on this page…
Andy Burrows, NSPCC
Interesting to see NSPCC kick off with the assumption that online anonymity (not to mention: totalitarianism and political/religious/gender oppression) is “not a thing”, and that it is therefore reasonable to gatekeep access to security technologies via “risk profiling” all the users. “Risk” is not a meaningful concept without an attack surface / known actor to measure it against.
A quick thread on end-to-end encryption and where @NSPCC thinks we should go next.
TL;DR: never ending debates based on a binary trade-off between privacy and safety get us nowhere, let’s think about risk profiles, and recognise children have a voice in this debate
Firstly, let’s recognise private messaging is at the forefront of child sexual abuse. Three-quarters of U.K. children contacted by people online they don’t know are initially contacted by private message. Grooming, sharing child abuse images and self-generated images are all ??
Too often, this gets drilled down into a fixed binary between adult privacy and child safety. The arguments are shrill. That’s flawed. We should be having a public debate about how we balance privacy and safety, and ask companies to invest in solutions that meet this balance
You mean we can’t have both, Andy?
Arguments too often privilege certain fundamental rights over others. And too often the concerns relating to children are delegitimised. But children are 1 in 3 internet users. In pluralistic terms, their voices *matter*
How about the 2 in 3 who are not children, Andy?
Let’s also recognise there’s clear public consensus for solutions that balance the fundamental rights of all users, including children. @NSPCC research shows where the public stand on this matter: only 4 per cent privilege privacy over other rights. @ECPAT shows similar in the EU
And let’s also acknowledge this isn’t *about* E2E per se, it’s about how companies protect the rights of users to enjoy fundamental privacy and safety rights. Different products/choices mean different risk profiles. Companies, govts and regulators should start from this approach
So the WhatsApp and Signal which Boris Johnson uses, are somehow different from the ones which everyone else uses, Andy?
For example, Meta’s #E2E proposals are concerning precisely *because* of the interplay with other high-risk design choices & well-established grooming pathways. It’s that overall risk profile that causes such concern about the impact on child abuse volumes & detection.
@Apple deserve lots of credit for focussing on tech solutions to mitigate harms. We should debate the efficacy and robustness of the neural hash solutions, but not their decision to move beyond a ‘winner takes all’, zero-sum worldview
Apple? That ended well. Not.
In conclusion, we should debate – & invest in tech solutions – that balance the fundamental rights of all internet users, inc children. And we should be focussing on risk, not ideological or polarised binaries. Read our full @NSPCC position paper:
https://www.nspcc.org.uk/globalassets/documents/news/nspcc-discussion-paper-private-messaging-and-the-roll-out-on-end-to-end-encryption.pdf ENDS
Originally tweeted by Andy Burrows (@_andyburrows) on 2022/01/17.
[1] URL:
https://alecmuffett.com/article/15845
[2] URL:
https://creativecommons.org/licenses/by-sa/3.0/