(C) Alec Muffett's DropSafe blog.
Author Name: Alec Muffett
This story was originally published on johncarr.blog. [1]
License: CC-BY-SA 3.0.[2]


The strange arguments of Professor Landau

2023-11-18 00:00:00

It is with some trepidation and humility that I choose to enter the ring with Professor Susan Landau, a doyenne of the techno-world. Eminence, however, does not make her immune from error. On the contrary, her recent article on encryption, in particular end-to-end encryption (E2EE), shows she is capable of making several errors all at once.

Messaging services: a major conduit for child sexual abuse

Online messaging services have become a leading means of distributing still pictures and videos of children being sexually abused. These services also facilitate other sexual offences against children but for now let’s stick with the still pictures and videos. They are all illegal.

The process is simple enough. You have an image or video clip that you want to send to a person or a group, so you make it an attachment or an upload, then click.

As you will see below, the distribution of this kind of material is happening on a very large scale. We have sight of the numbers because at the moment internet platforms which provide messaging services, in particular Meta, have voluntarily operated systems which allow the images to be swiftly identified, deleted and reported to the authorities for further action. One of the crucial further actions the reports make possible is locating and safeguarding the child or children depicted.

As things stand, if E2EE extends its reach into existing and any new messaging services (a proposition Landau strongly supports), the reports referred to will dry up, if not completely then very substantially.

Meta has already said it intends to go down the E2EE route with two of its major messaging Apps, Facebook Messenger and Instagram Direct. It has been estimated this will lead to a 70% drop in reports reaching the authorities.

Losing reports means losing the possibility of acting rapidly to safeguard a victim. Losing reports means losing the ability to track down perpetrators, meaning child rapists and those who exchange the abusive images.

E2EE will likely also mean that even where a suspect is identified it is impossible to secure evidence which might lead to an arrest or a conviction. Subpoenas and court orders will not be worth the paper they are written on.

In such circumstances, the number of victims the authorities are unable to help, the number of perpetrators against whom they can obtain no or insufficient evidence, can be expected to increase as the volume of images being distributed also increases.

For these reasons, in the Online Safety Act 2023 the UK Parliament has included provisions for countervailing measures. These should ameliorate the position in environments which deploy E2EE, but without seeking to break the encryption. The EU is considering something similar.

In her article Professor Landau criticises these measures.

The first alarm bell

According to Landau, policies such as those adopted by the UK Parliament and being considered by the EU represent an attempt by law enforcement to win arguments on encryption they had previously lost. Landau describes this as a “cynical ploy”. At least by implication she thinks the cops are insincere, using children as an intentionally deceptive battering ram to hide their real, long-standing, never-changing motives.

Coming so early in the article, or maybe anywhere, referring to the current discussion on E2EE as a “cynical ploy” did not instil in me a strong sense Landau has an open mind.

People who were not around, or at any rate not involved in previous arguments about encryption, are today expressing genuine concerns about child sexual abuse on the internet both generally and specifically in relation to the role E2EE is already playing and could play in the future. They should be listened to respectfully, taken seriously at face value and not, in effect, rubbished and ignored, written off as dupes or willing agents of the cops.

Anyway, to paraphrase George Orwell, just because a police officer says something it doesn’t necessarily mean it is wrong or has no merit. Landau should play the ball not the person.

New factors

At least two things have changed in recent years. Landau might want to reflect on them.

Firstly, as we will see below, we now have empirical data which at least hints at the consequences of the growth in the use of E2EE. We have no need to guess or speculate about what might happen, specifically to children, though there is no way of avoiding the conclusion that it will apply to other groups as well. Once again, children are the canaries in the coal mine.

Secondly, de facto, the promotion of E2EE looks like tech choosing to elevate privacy to a right which, maybe not in theory but in practice, trumps all others. Tech has therefore, in effect, taken Lawrence Lessig at his literal word. Code has indeed become law. In this case it is not a law which has any legitimacy.

No legal instrument endorsing the idea that an order of a court can be rendered mute and impotent because geeks and private entities have so decided, and for no other reason, has ever been expressly debated and discussed in any democratic, law-making forum. Yet this is already happening.

Justice delayed is justice denied. Justice denied in perpetuity is oppression.

Perhaps when E2EE was being deployed on a comparatively small scale its negative impacts could be managed, contained or overlooked. If it becomes massively available that takes us to a new place entirely. If the consequences of what the mass roll out of E2EE could mean are properly understood, I would not bet money on a majority of voters saying they were OK with it.

Put crudely, this is how it might pan out:

“We know and accept the internet and its associated digital technologies have brought the world many benefits. We also know they have come with major disbenefits. Those who built, own and became rich off the back of the internet seem unwilling to address those disbenefits and now here’s another one. If E2EE is engaged in the commission of a crime or a civil wrong we may never be able to obtain redress or justice for a harm we have suffered. We get the upside of E2EE but the price we are being asked to pay for that upside is too high. Think again. Find another way.”

Indiscriminate and intimate mass scanning

We put up with our bodies being scanned, our luggage and personal belongings being examined at airports, at the entrance to a wide range of buildings, sports venues and similar. We allow dogs to sniff both us and what we are carrying. Why? Because even though such intimate scanning and sniffing is indiscriminate, taking place on a massive scale, we understand, support and value its underlying social purpose.

Similarly, as far as I know, since postal services began, every postal company has reserved the right to scan packages and envelopes as they go through their sorting system. They employ electronic devices which can detect drugs or other forbidden items and they reserve the right to open or reject a package which triggers an alarm.

Public opinion in 15 European countries has reached the same conclusion in respect of the online world and children.

Here are some selected headlines:

95% of European respondents say it is important there are laws to regulate online service providers to prevent and combat child sexual abuse and exploitation online;

91% of European respondents say that online service providers should be required to design and adapt their services to prevent child sexual abuse and exploitation online;

81% of European respondents support moves to oblige online service providers to detect, report, and remove child sexual abuse online;

In an earlier survey in eight EU Member States:

76% of adults have indicated a willingness to allow automated technology tools to specifically scan and detect child sexual abuse material online – even if this means giving up some of their privacy!

Another alarm bell

Landau accurately describes the areas of concern for child protection advocates such as myself. In doing so she builds on an earlier blog she wrote.

Referring to these areas of concern she says

“Some … horrific (child sexual abuse) activities appear to have greatly increased in recent years—although some of that increase may be due to better reporting.”

It’s that word “appear”. We are being prompted to think there could be some reasonable doubt about whether or not there has been a “great increase”. The situation may not be as bad as it might otherwise seem at first sight because this “appearance” could be explained “simply” by better reporting. And anyway let’s not forget we are all fall guys in someone else’s “cynical ploy”. Using dodgy, probably inflated numbers is just the sort of thing such schemers would try in order to win their dastardly way.

An alternative (but sadly missing) perspective

Landau could have written that differently:

“NCMEC is the principal global focal point for receiving reports of a child or children being sexually abused where the abuse has manifested on the internet, typically in the form of a video or still picture.

NCMEC has repeatedly voiced concerns about how few online businesses (c.1500) are reporting to them. Could there be a better way of determining the likely true scale of sexual offending against children where the internet plays a decisive part in facilitating the abuse? Is there an alternative method which might produce a better estimate of the scale of online child sexual abuse taking place before the current improved methods of detection and reporting were put in place? Or is it more likely the online sexual abuse of children has scaled as internet usage by children has scaled?”

The numbers are very large

Notwithstanding the above, the number of reports NCMEC receives annually is, on any reckoning, still substantial. Over the past four years there were around 100 million. Overwhelmingly these reports were linked to messaging services, with Facebook Messenger and Instagram Direct leading the pack by quite a margin.

Of the 100 million, in that same four-year period the number of reports resolving to the UK escalated as follows: 2019, 74,330; 2020, 75,578; 2021, 97,727; 2022, 316,900. Over 550,000 in total.

How many of the 100 million or the 550,000 came from Apple? Nearly zero. Why? Because Apple already uses encryption on a large scale. This means they cannot see anything so they cannot report anything. When I say “nearly zero”, I actually mean 864. I have not missed any commas or zeros by mistake.

Mutatis mutandis, it seems highly improbable Apple’s user base is significantly different from the user base of many other large tech platforms that report to NCMEC. Thus, even if the level of child sexual abuse on Apple is only equal to that of Meta, which is otherwise the biggest entity reporting to NCMEC (and note Apple themselves said their numbers were likely to be larger than any other platform’s), then over the same period the number of reports that could have been made would be nearer to or greater than 200 million.

How much pain is Apple’s use of encryption causing to how many children? We may never know. And the other messaging Apps that are using E2EE? Ditto.

Yet twice in her essay Landau praises Apple for their “principled” stand in support of encryption. Clearly Landau did not follow events very closely. Apple came out with a marvellous plan to detect child sexual abuse content without compromising encryption. To begin with they defended the plan stoutly, even saying it was “privacy enhancing”. According to a senior Apple source I spoke to, the company only retreated from their plan because they were met with a storm of criticism from, well, people like Professor Landau.

A brief spell of adverse publicity stimulated by a narrow band of a techno elite is what beat them back. Nothing else. Shame, shame. They should and could have stuck to their guns.

What else might the good Professor have said?

Instead of seeking to cast doubt Landau could have written something like this:

“We know from innumerable prevalence studies that the incidence of child sexual abuse in the general population is substantial, so it’s not as if a large number of reports to NCMEC comes as a complete and utter out-of-the-blue astonishing shock, although it is impossible to establish any kind of meaningful link between child sexual abuse in the general population and child sexual abuse which finds a manifestation or takes place in an online environment.

At the moment, only a minority of child sexual abuse incidents are likely to have any kind of link to the internet but those that do, if they are reflected in a still picture or a video, are uniquely harmful both to the victims depicted and to wider society, both economically and in other ways. What the abused children want is for those images to be located and removed from public and private view as swiftly as possible. That cannot happen if E2EE makes them impossible to find.

In addition, for as long as the pictures or videos continue to circulate on the internet they threaten children as yet unharmed anywhere in the world where the internet is available, which pretty much means everywhere in the world. Children in countries with a less well developed educational, social or law enforcement infrastructure are more likely to be at greater risk.

I say ‘at the moment’ most instances of child sexual abuse are unlikely to have any kind of link to the internet. But could that change? Other things being equal if it becomes more widely known that, thanks to the increased availability of E2EE, it is now possible to commit sexual crimes against children and never get caught or convicted, can we safely assume the wider application of E2EE will have no appreciable impact? I do not think we can. Isn’t the exact opposite more likely to be true?”

An imaginary Professor Landau might have asked

“How can we as technologists get a better insight into this problem? Is it sufficiently interesting or important for us to turn our great minds to it and start looking for funding to pay for the necessary work? The sort of funding that went into developing, say, Candy Crush, Super Mario or TikTok. Important stuff like that.”

As a rider Professor Landau might then finally have added something like this

The ethical obligations of technologists

“I hope I teach my students to consider their ethical responsibilities when they go out into the world where they create or maintain online systems. Tech is not value free or neutral. It reflects the values and above all the priorities of those who pay for systems to be built or maintained. How far should we encourage those who pay our wages to consider this or that decision in terms of its likely impact on children? Children constitute one third of the entire body of internet users across the whole world. In parts it rises to around one in two. That is a considerable and ever-present constituency. Granted in some areas of technology there is close to zero possibility that a particular decision could have any discernible impact on children, but in a great many others it will.

We know tech cannot be held responsible for the fact that child sexual abuse happens but we also know tech plays a part in furthering or facilitating it. It is a phenomenon which deserves attention in the same way online fraud, identity theft and misinformation do.

While it is ok for us as technologists, as citizens, to have opinions on how best to combat child sexual abuse, even if we have no special expertise in that area, we also have to reflect on what we can do to reduce the incidence of evil in areas where we do have expertise.

We can’t keep saying, as I did in my earlier blog, that the answer is better prevention. That is true and obvious for any and every crime type, on and off the internet. Such a view essentially seeks to shift the burden on to everybody else. But we need to do our bit. We need to see what we can do as technologists to address the downstream consequences of society’s failure to prevent a child being sexually abused in the first place, at least insofar as technology plays a part in facilitating or perpetuating the abuse.”

Facebook/Meta is not a reliable source

Landau tells us she wants to “reprise” the situation pertaining to online child sexual abuse and exploitation. However, aside from NCMEC data, all of the references or sources Landau uses are linked to Meta. The studies were either carried out internally by Meta and published by them, or were funded by Meta.

As previously noted, Meta has said it intends to introduce E2EE into Facebook Messenger and Instagram Direct, thereby abandoning the highly effective way it has been detecting child sexual abuse since 2018, with no known ill effects, and which others have been using since 2009, likewise with no known ill effects.

Might a more balanced appraisal have commented on Meta’s possible reasons for shifting to E2EE? I am struggling to find the right words here. How about these?

“Meta is moving to E2EE to make more money and help it compete against Apple”

That is at least one possible explanation. Meta is a business after all. That’s what they do.

For some time Meta has been systematically playing down and doing all it can to convince people the problem of child sexual abuse linked to its platforms isn’t as bad as it might “appear” (that word keeps cropping up). It has been doing this as part of a campaign to prepare the world for the tragic but (to them) financially advantageous misstep of moving to E2EE for two of its major messaging platforms.

For all that Meta say they will initiate “other measures” to protect children, there is no avoiding one simple fact: by wilfully blinding themselves, by “going Apple”, they are condemning an unknowable number of children to prolonged pain and misery, possibly even death. The police cannot begin an investigation, nobody can act to safeguard a child, if the initial reports are just not there. Shame on Meta.

I can see why Meta might feel a little aggrieved that they get so much bad publicity simply because, several years ago, they decided to do the right thing by introducing technology to detect child sexual abuse on their platform and publishing the numbers. But that does not justify what they now seem intent on doing. They should be joining with us demanding Apple and other messaging services come up with acceptable solutions which will minimise or reduce harms to children associated with the operation of messaging services.

Haugen and Bejar

Landau’s reliance on data from Meta seems all the more surprising because, if Frances Haugen is to be believed, the company has actively suppressed data showing how its services harmed children.

Only last week (7th November), fresh but similar accusations were made against Meta before a Congressional Subcommittee on Privacy, Technology and the Law.

At this hearing Arturo Bejar gave his testimony. From 2009 to 2015 Bejar had been the senior engineering and product leader at Facebook responsible for its efforts to keep users safe and supported. He ran a group called “Protect & Care.” Reading his testimony is instructive and very much chimes with Haugen’s earlier statements. Here are a couple of choice quotes from Bejar.

“Meta continues to publicly misrepresent the level and frequency of harm that users, especially children, experience on the platform, and they have yet to establish a goal for actually reducing those harms and protecting children. It’s time that the public and parents understand the true level of harm posed by (Meta) ….”

“…. senior management was externally reporting different data that grossly understated the frequency of harm experienced by users.”

“When I left Facebook in 2015, after six years at the company, I felt good that we had built numerous systems that made using our products easier and safer……

… in 2019 I … went back to Facebook as an independent consultant. I stayed for two years, working with the team at Instagram that focused on “well-being.” It was not a good experience. Almost all of the work that I and my colleagues had done during my earlier stint at Facebook through 2015 was gone……. People at the company had little or no memory of the lessons we had learned earlier.”

This does not speak of a company with children’s interests front and centre of their minds.

Copies schmoppies

Landau acknowledges

“Each occurrence of a photo or video showing a child being sexually abused, even if it is a previous one shared hundreds of thousands of times, is harmful, for such showing increases the chance that an abused person will be recognized as having been the subject of (child sexual abuse or exploitation).”

Even so we are once again asked to minimise the problem by accepting Meta’s claim

“90 percent of the reports the company filed with NCMEC in October and November 2021 were ‘the same as or visually similar to previously reported content.’ Half of the reports were based on just six videos.”

Let’s not get into why those two particular months were taken as the basis for making estimates of that kind. But “just six videos”? If, as Landau previously asserted, each and every one of the 29 million reports made that year to NCMEC truly is important then what are we meant to read into “just six videos”? How many reports from how many sources were not linked to the six videos?

In 2022’s report we are told NCMEC was notified of 49.4 million still images of which 18.8 million (38%) were unique and of the 37.7 million videos 8.3 million (22%) were unique. That’s a lot of unique stuff. And btw 49% of all reports received were “actionable” by law enforcement while NCMEC classified the remainder as “informational”.

Landau’s continued unquestioning reliance on data from Meta does not end there. She tells us that in a (small) study conducted by Meta

“more than 75 percent of those sharing CSAM ‘did not do so with … intent to hurt the child’. Instead, they were sharing the images either out of anger that the images existed or because of finding the images ‘humorous’.”

From the victim’s point of view the harm done is done. For very good reasons the law allows no distinction based on the motives or sense of humour of the entity posting the criminal content. Landau cites approvingly recommendations from another study (funded by Meta). These would see “simplified methods of reporting child sexual abuse material” and the introduction of a system of alerts about the potentially serious legal consequences of posting it. The only drawback with that Meta-funded study is that it begins as follows:

“This report……(assumes) without judgment, that end-to-end encryption is here to stay, and asks, how are we going to combat online child sexual exploitation and abuse?”

Not quite but not far off

“OK Mrs Lincoln but, apart from that, how did you enjoy the play?”

Landau might have mentioned that if Meta simply carried on doing what it is currently doing there would be no need for improved reporting. We are given to understand (though no external corroboration has been provided, at least not that I know about) that already known child sexual abuse material will be detected in milliseconds, and images which are likely to be child sexual abuse material will take a millisecond longer. In other words, Meta’s automated systems work so fast that nobody but the entity trying to post, and the cops, would ever see anything. Bravo.

The word “victim” does not appear once

Having copied Landau’s article into Word I can tell you it consists of 4,480 words and “victim” is not one of them.

Landau tells us

“Good public policy decision-making requires weighing competing interests.”

But at no point is it even mentioned that the children depicted in the pictures and videos have a legal right to privacy and a legal right to human dignity. Not mentioned. Not weighed in any balance. Why?

Moreover, lest we forget, in Illinois v Caballes the US Supreme Court decided there was no “legitimate privacy interest in possessing contraband”. Quite so and that must be even more the case if someone else’s privacy rights have been infringed within or by the contraband.

And on the tech itself

I must now put on one side Landau’s evident lack of serious intent when it comes to considering how tech might be able to help solve one or more of the problems discussed so far. She seems only to want to advance reasons why tech is the last place you should look for help.

When Apple published its client-side scanning (CSS) solution it got heavyweight approval from experts. They agreed that it could do what it said it wanted to do: in other words, as stated earlier, scan material before it enters the encrypted tunnel to check whether it contains child sexual abuse content, with no expectation or requirement for anyone even to try to break any kind of encryption.
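To make the ordering concrete, here is a minimal, purely illustrative sketch of where such a check sits: the client hashes an outgoing attachment and compares it against a list of known hashes before anything is encrypted. The hash list, the function names and the use of SHA-256 are all stand-ins of my own; real systems such as PhotoDNA or Apple’s design use perceptual hashing and far more sophisticated private matching.

```python
import hashlib

# Hypothetical list of hashes of known illegal images, as might be
# supplied by a body such as NCMEC. SHA-256 is a stand-in: real tools
# use perceptual hashes so that near-duplicates also match.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"foo", standing in for a known image
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_then_send(attachment: bytes, encrypt, transmit) -> bool:
    """Check an outgoing attachment BEFORE it enters the encrypted
    tunnel. Returns True if sent, False if blocked. No encryption
    is weakened or broken at any point."""
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in KNOWN_HASHES:
        return False  # a real deployment would also trigger a report
    transmit(encrypt(attachment))  # E2EE proceeds exactly as normal
    return True
```

The point of the sketch is the ordering: the check happens on the device, before encryption, so the encrypted channel itself is untouched.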

Part of the attack against CSS came from quarters which seemed to suggest it opened up new “attack surfaces” or “attack vectors”. As far as I could see this was largely conjecture, theoretical possibilities. They hadn’t happened but seemingly they could.

Sometimes you have to smile or wonder if someone somewhere is sniggering behind their hand. Digital technologists have created a system which is positively alive with attack surfaces or vectors. It is called the internet. But here is where we draw the line?

Landau says when it comes to tech we should be willing to accept imperfection. I quite agree. If we try CSS or other systems and monitor them closely, any discovered breaches can and should be acted upon immediately. No more “put it out Tuesday, fix it Thursday, maybe”. Worries about potential misuse or unintended consequences can be met by having strong audit and transparency mechanisms with regular public reporting, reinforced by powerful legal sanctions.

But here is a question Landau never answers because she never asks. I referred to it earlier.

Why is it that since PhotoDNA first started being used in 2009 there have been zero reported instances of it getting anything wrong? Zero reported instances of anyone being named, shamed, arrested or charged because an automated tool such as PhotoDNA made a mistake. Zero reported instances of anybody’s privacy being wrongfully invaded. And this is true for similar child protection tools.
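For readers unfamiliar with how this family of tools works, the matching principle can be shown with a toy example. PhotoDNA’s actual algorithm and thresholds are proprietary, so everything below (the hash width, the threshold, the sample values) is my own illustrative assumption. A perceptual hash is compared bit-by-bit against hashes already on the known list, and only a very close match counts; an image not on the list simply never matches, which is consistent with the absence of reported false positives.

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two equal-length hash values."""
    return bin(a ^ b).count("1")

def matches(candidate_hash: int, known_hashes: list[int],
            threshold: int = 4) -> bool:
    """A candidate matches only if it lies within `threshold` bits of a
    hash already on the known list: a deliberately tight criterion."""
    return any(hamming(candidate_hash, k) <= threshold for k in known_hashes)

# Toy 16-bit "perceptual hashes"; real hashes are far longer.
known = [0b1010_1100_0011_0101]

assert matches(0b1010_1100_0011_0111, known)      # 1 bit off: a near-duplicate
assert not matches(0b0101_0011_1100_1010, known)  # unrelated image: no match
```

Note the design choice the toy makes visible: the system can only ever flag images close to ones already vetted by humans; it has no opinion about anything else it sees.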

The things Professor Landau does not discuss

I have written about these before, so below I simply provide links to the blogs in question.

E2EE and spin

The idea of privacy has instant and obvious appeal and goodness knows past transgressions by tech and state actors have certainly primed a part of the market. However, there has to be a better way of dealing with encryption. We have the power to shape tech. We cannot let it shape us. Repeating a mantra about the mathematics which underpin encryption sounds clever but really it isn’t. We cannot change the laws of maths but we can make choices about what we do with those laws.

E2EE poses every bit as much of a threat to our way of life as the new forms of AI everyone is talking about. The conversation needs to broaden out and civil society needs to become much more engaged.

[1] URL: https://johncarr.blog/2023/11/18/the-strange-arguments-of-professor-landau/
[2] URL: https://creativecommons.org/licenses/by-sa/3.0/

DropSafe Blog via Magical.Fish Gopher News Feeds:
gopher://magical.fish/1/feeds/news/alecmuffett/