(C) Alec Muffett's DropSafe blog.
Author Name: Alec Muffett
This story was originally published on alecmuffett.com.[1]
License: CC-BY-SA 3.0.[2]
Why we need #EndToEndEncryption and why it’s essential for our safety, our children’s safety, and for everyone’s future #noplacetohide
2022-01-17 14:25:06+00:00
Privacy-enabling technologies like End-to-End Encryption are under threat from Governments that want to undermine peoples’ ability to have a private conversation.
This has been underway for many years. Having been previously framed around “drugs” and “terrorism” the current round of attacks is focused especially upon child safety, with the co-operation of concerned but misguided and narrowly focused charities.
this is a UK Government initiative
This children's-charity astroturfing effort is being orchestrated on behalf of the Government by M&C Saatchi with £534,000 of funding from the Home Office, possibly more.
But attacking encryption won't solve the problem of child abuse — especially not the estimated 90%+ of the time where the abuser is "known to the victim" — and we also face a societal challenge with parenting, where parents must walk a line between supervising children versus giving them agency, autonomy, space to grow, and frank awareness of how to spot danger. Parents could benefit from Government help and constructive resources regarding that, but the Government appears to prefer spending its money on advertising against encryption.
big numbers. misleading sentiments.
Matters are not helped by the Government and children's charities often conflating "numbers of reports" with the numbers of actual children at-risk, abused, and harmed. The claimed numbers of "reports" are huge — usually in the millions [see errata] — but after removing duplicates, old, and previously-solved crimes (itself a massive challenge for police) the resulting counts of abusers and of children being abused are comparatively small — obviously never small enough, but still small, especially compared to populations of millions or billions of internet users.
An impactful approach to child protection should focus upon preventing the crimes before they happen, rather than pivot upon detecting — presumably thereby hoping to deter — crimes after-the-fact. If we want to protect kids we need to invest to fix society, not surveil it.
Critics sometimes claim that encryption makes it impossible to subpoena or obtain a warrant for information from people’s phones — this is bizarre because governments already demand such data. What they are actually complaining about is that the “platform” — for instance Facebook — no longer wants to be able to see the content themselves. The warrant will have to be served upon the device owner, not upon the (social) network provider.
Good security demands that data that we share amongst family and friends should remain available only to those family and friends; and likewise that data which we share with businesses should remain only with those businesses, and should only be used for agreed business purposes.
Network providers — and, importantly, messaging-network and social-network providers — are helping their users obtain better data security by cutting themselves off from the ability to access plaintext content. Simply: they don’t need to see it, and it’s not their job to police or censor it. Their adoption of end-to-end encryption makes everyone’s data safer.
The world needs end-to-end encryption. It needs more of it. We need the privacy, agency, and control over data that end-to-end encryption enables. And encryption is needed everywhere and by everyone — not just by politicians and police forces.
Child protection is a huge and important issue, but it cannot and must not carry the debate; not least because your children should grow to adulthood, and when they do they will need the privacy of end-to-end encryption in order to navigate this increasingly complex, increasingly online world.
Further Reading
Errata regarding “Report Counts”
The original version of this post said that report numbers are inflated by counting individual files; that is incorrect: they are apparently only inflated with duplicates, stale information, non-infringing content, and redundancy, as evidenced on this page; see graphic below.

The problem exists partly as described in this blogpost:

…in 2020 [NCMEC] received 21.8 million reports comprising 65 million files (slide 17 of this PDF, and elsewhere) — yet a NCMEC source tells me that in that same year they only added 1.7 million "triple-vetted" image hashes to their database of CSAM; from this it might be possible to infer that more than 90% of NCMEC's reports are in some way duplicates or irrelevant, but that would be unfair to do without more context…

See also this blogpost from Facebook:

…we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020. We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period. While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many.

And also this other blogpost from Facebook:

…We worked with leading experts on child exploitation, including NCMEC, to develop a research-backed taxonomy to categorize a person's apparent intent in sharing this content. Based on this taxonomy, we evaluated 150 accounts that we reported to NCMEC for uploading child exploitative content in July and August of 2020 and January 2021, and we estimate that more than 75% of these people did not exhibit malicious intent (i.e. did not intend to harm a child). Instead, they appeared to share for other reasons, such as outrage or in poor humor (i.e. a child's genitals being bitten by an animal). While this study represents our best understanding, these findings should not be considered a precise measure of the child safety ecosystem. Our work to understand intent is ongoing.

This is not new information; it has all been previously reported in Forbes…
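The hedged ">90%" inference above is simple arithmetic on the two quoted figures — 21.8 million reports received in 2020 versus 1.7 million new "triple-vetted" image hashes added that year. A rough sketch (comparison only; as the quoted caveat notes, reports and hashes are not like-for-like units):

```python
# Back-of-the-envelope check of the ">90%" inference, using the figures
# quoted from NCMEC for 2020.
reports_2020 = 21_800_000       # reports NCMEC received in 2020
new_vetted_hashes = 1_700_000   # new "triple-vetted" image hashes added in 2020

fraction_new = new_vetted_hashes / reports_2020
fraction_possibly_redundant = 1 - fraction_new

print(f"new material: {fraction_new:.1%}")                          # ≈ 7.8%
print(f"possibly duplicate/irrelevant: {fraction_possibly_redundant:.1%}")  # ≈ 92.2%
```

Even this crude ratio shows why "millions of reports" headlines overstate the number of distinct children or images involved.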
The matter is clearly a lot more complex than the “millions” headlines would suggest.
Posters
Each links through to an individual image-post which can be separately shared on social media.
Abandoning #EndToEndEncryption would sacrifice the privacy of the adults who your children will grow up to become. Directly & indirectly they'll need and use #Encryption to protect their data and loved ones.
If you are giving your children their privacy — or not already protecting your children by supervising their internet usage — why would you ask a corporation to do it on your behalf? #EndToEndEncryption is not the problem…
Boris Johnson, Priti Patel, the Cabinet—they use encrypted messenger apps like @WhatsApp & @SignalApp for privacy & security. Strange, then, that they don’t want @Messenger to also be private & secure by using #EndToEndEncryption…
Remember #CambridgeAnalytica? Facebook does, & they don’t want to get burned again. But the Government is demanding that they must weaken #EndToEndEncryption to retain access to read your messages. Isn’t that odd?
Ever used an “incognito” window in a browser? If we have to stop kids using #EndToEndEncryption then we’ll also have to stop online anonymity; because to do it they’ll have to know who is, and is not, a kid…
People’s passwords get guessed & stolen. Their accounts get hacked. It happens every day. But if sensitive information is sent via #EndToEndEncryption there’ll be very little data to steal…
“I don’t need #encryption, I’ve nothing to hide!” — if you’re paying a mortgage, getting a diagnosis, or escaping an abusive partner, you don’t want that information to proliferate. #EndToEndEncryption gives you control over where the data goes.
How much harm would happen for want of #EndToEndEncryption? How many bank & medical details, business deals, or entirely innocent (but very embarrassing) photographs will leak from unencrypted chats & hacked accounts? How shall we measure that cost?
Shelters from domestic violence & abuse. Dissidents. LGBTQ discrimination. Democracy activists. Oppressed religious minorities. People all over the world need strong privacy, for all kinds of reasons. We can't & shouldn't take hope away from them.
[END]
[1] URL:
https://alecmuffett.com/article/15742
[2] URL:
https://creativecommons.org/licenses/by-sa/3.0/