Smartphone dystopia: proof that the smartphone is your enemy
Mark noticed something amiss with his toddler. His son's penis
looked swollen and was hurting him. Mark, a stay-at-home dad
in San Francisco, grabbed his Android smartphone and took
photos to document the problem so he could track its progression
(https://nyti.ms/3PGqC65).
It was a Friday night in February 2021. His wife called an advice
nurse at their health care provider to schedule an emergency
consultation for the next morning, by video because it was
a Saturday and there was a pandemic going on. The nurse said
to send photos so the doctor could review them in advance.
Mark's wife grabbed her husband's phone and texted a few high-quality
close-ups of their son's groin area to her iPhone so she could upload
them to the health care provider's messaging system. In one, Mark's
hand was visible, helping to better display the swelling. Mark and
his wife gave no thought to the tech giants that made this quick
capture and exchange of digital data possible, or what those giants
might think of the images.
With help from the photos, the doctor diagnosed the issue and
prescribed antibiotics, which quickly cleared it up. But the episode
left Mark with a much larger problem, one that would cost him more
than a decade of contacts, emails and photos, and make him the
target of a police investigation. Mark, who asked to be identified
only by his first name for fear of potential reputational harm, had
been caught in an algorithmic net designed to snare people exchanging
child sexual abuse material (https://nyti.ms/2nBmH3a).
Because technology companies routinely capture so much data, they
have been pressured to act as sentinels, examining what passes
through their servers to detect and prevent criminal behavior. Child
advocates say the companies' cooperation is essential to combat the
rampant online spread of sexual abuse imagery. But it can entail
peering into private archives, such as digital photo albums, an
intrusion users may not expect and one that has cast innocent
behavior in a sinister light in at least two cases The Times has
unearthed.
Jon Callas, a technologist at the Electronic Frontier Foundation,
a digital civil liberties organization, called the cases "canaries
in this particular coal mine."
"There could be tens, hundreds, thousands more of these," he said.
Given the toxic nature of the accusations, Mr. Callas speculated that
most people wrongfully flagged would not publicize what had
happened.
"I knew that these companies were watching and that privacy is not
what we would hope it to be," Mark said. "But I haven't done anything
wrong." The police agreed. Google did not.