2022-07-14 - The Future of the Internet by Jonathan Zittrain
============================================================
This book is more about the history of the Internet than its future.
I love that this book uses plain English to tell a compelling
narrative. Since history is cyclical, the patterns described can
inform the readers about the future.
The history told in this book reminds me why I was so enamored with
the early Internet, and gives me a more positive way to frame my
career than "I got caught up in the tech bubble glamor and traded my
health and well-being for money."
Internet history
<
https://www.livinginternet.com/tindex_i.htm>
The author frequently uses the phrase "generativity" to mean
openness. I think "creativity" would have worked well enough for me,
in contrast to consumerism. But the author wanted to emphasize the
disruptive and inclusive nature of generativity.
Regarding the problem of lockdown and regulation, I think that boat
has already sailed. While I laud the author for discussing potential
solutions and strategies, I think it is too late for them to succeed.
Preface to the Paperback Edition
================================
[The author compares the origins of the PC and the Internet to Wile
E. Coyote because of the love of amateur tinkering.]
The first reaction to abuses of openness is to try to lock things
down. One model for lockdown can be drawn from our familiar
appliances... ...we've seen glimpses of that model in communications
platforms like iPods, most video game consoles, e-book readers like
the Amazon Kindle, and cable company set-top boxes.
But there's another model for lockdown that's much more subtle, and
that takes, well, a book to unpack. This new model exploits
near-ubiquitous network connectivity to let vendors change and
monitor their technologies long after they've left the factory--or to
let them bring us, the users, to them, as more and more of our
activities shift away from our own devices and into the Internet's
"cloud."
This model is likely the future of computing and networking, and it
is no minor tweak. It's a wholesale revision to the Internet and PC
environment we've experienced for the past thirty years. The
serendipity of outside tinkering that has marked that generative era
gave us the Web, instant messaging, peer-to-peer networking, Skype,
Wikipedia--all ideas out of left field. Now it is disappearing,
leaving a handful of new gatekeepers in place, with us and them
prisoner to their limited business plans and to regulators who fear
things that are new and disruptive.
Introduction
============
Though these two inventions--iPhone and Apple II--were launched by
the same man, the revolutions that they inaugurated are radically
different. For the technology that each inaugurated is radically
different. The Apple II was quintessentially generative technology.
It was a platform. It invited people to tinker with it. ... The
Apple II was designed for surprises--some very good (VisiCalc), and
some not so good (the inevitable and frequent computer crashes).
The iPhone is the opposite. It is sterile. Rather than a platform
that invites innovation, the iPhone comes preprogrammed.
Viruses, spam, identity theft, crashes: all of these were the
consequences of a certain freedom built into the generative PC. As
these problems grow worse, for many the promise of security is enough
reason to give up that freedom.
The PC revolution was launched with PCs that invited innovation by
others. So too with the Internet. Both were generative: they were
designed to accept any contribution that followed a basic set of
rules... But the future unfolding right now is very different from
this past. The future is not one of generative PCs attached to a
generative network. It is instead one of sterile appliances tethered
to a network of control.
But along with the rise of information appliances that package those
useful activities without readily allowing new ones, there is the
increasing lockdown of the PC itself. PCs may not be competing with
information appliances so much as they are becoming them.
In turn, that lockdown opens the door to new forms of regulatory
surveillance and control.
A lockdown on PCs and a corresponding rise of tethered appliances
will eliminate what today we take for granted: a world where
mainstream technology can be influenced, even revolutionized, out of
left field.
Part 1, The Rise and Stall of the Generative Net
================================================
Today's Internet is not the only way to build a network. In the
1990s, the Internet passed unnoticed in mainstream circles while
networks were deployed by competing proprietary barons such as AOL,
CompuServe, and Prodigy. ... The proprietary networks went extinct,
despite having accumulated millions of subscribers. They were
crushed by a network built by government researchers and computer
scientists who had no CEO, no master business plan, no paying
subscribers, no investment in content, and no financial interest in
accumulating subscribers.
The framers of the Internet did not design their network with visions
of mainstream dominance. Instead, the very unexpectedness of its
success was a critical ingredient. The Internet was able to develop
quietly and organically for years before it became widely known,
remaining outside the notice of those who would have insisted on more
cautious strictures had they only suspected how ubiquitous it would
become.
Chapter 1, Battle of the Boxes
==============================
The Hollerith model is one of powerful, general-purpose machines
maintained continuously and exclusively by a vendor. The appliance
model is one of predictable and easy-to-use specialized machines that
require little or no maintenance. Both have virtues. ... Neither
the Hollerith machine nor the appliance can be easily reprogrammed by
their users or by third parties, and, as later chapters will explain,
"generativity" was thus not one of their features.
The story of the PC versus the information appliance is the first in
a recurring pattern. The pattern begins with a generative platform
that invites contributions from anyone who cares to make them. The
contributions start among amateurs, who participate more for fun and
whimsy than for profit. Their work, previously unnoticed in the
mainstream, begins to catch on, and the power of the market kicks in
to regularize their innovations and deploy them in markets far larger
than the amateurs' domains. Finally, the generative features that
invite contribution and that worked so well to propel the first stage
of innovation begin to invite trouble and reconsideration, as the
power of openness to third-party contribution destabilizes its first
set of gains.
Chapter 2, Battle of the Networks
=================================
In early twentieth-century America, AT&T controlled not only the
telephone network, but also the devices attached to it. People
rented their phones from AT&T, and the company prohibited them from
making any modifications to the phones.
The first online services built on top of AT&T's phone network were
natural extensions of the 1960s IBM-model minicomputer usage within
businesses: one centrally managed machine to which employees' dumb
terminals connected.
Even before PC owners had an opportunity to connect to the Internet,
they had an alternative to paying for appliancized proprietary
networks. Several people wrote BBS ("bulletin board system")
software that could turn any PC into its own information service.
Lacking ready arrangements with institutional content providers like
the Associated Press, computers running BBS software largely depended
on their callers to provide information as well as to consume it.
... But they were limited by the physical properties and business
model of the phone system that carried their data.
PC generativity provided a way to ameliorate some of these
limitations. A PC owner named Tom Jennings wrote FIDOnet in the
spring of 1984.
Those with Jennings's urge to code soon had an alternative outlet,
one that even the proprietary networks did not foresee as a threat
until far too late: the Internet, which appeared to combine the
reliability of the pay networks with the ethos and flexibility of
user-written FIDOnet.
The Internet's design reflects the situation and outlook of the
Internet's framers: they were primarily academic researchers and
moonlighting corporate engineers who commanded no vast resources to
implement a global network.
The design of the Internet reflected not only the financial
constraints of its creators, but also their motives. They had little
concern for controlling the network or its users' behavior. The
network's design was publicly available and freely shared from the
earliest moments of its development. ... The motto among them was,
"We reject: kings, presidents, and voting. We believe in: rough
consensus and running code."
The Internet was so different in character and audience from the
proprietary networks that few even saw them as competing with one
another.
The resulting Internet was a network that no one in particular owned
and that anyone could join.
... Internet design, like its generative PC counterpart, tilted
toward the simple and basic. The simple design that the Internet's
framers settled upon makes sense only with a set of principles that
go beyond mere engineering. The most important are what we might
label the procrastination principle and the trust-your-neighbor
approach.
The procrastination principle rests on the assumption that most
problems confronting a network can be solved later or by others. It
says that the network should not be designed to do anything that can
be taken care of by its users.
The network's simplicity meant that many features found in other
networks to keep them secure from fools and knaves would be absent.
... the assumption that network participants can be trusted, and
indeed that they will be participants rather than customers, infuses
the Internet's design at nearly every level.
This basic design omission has led to the well-documented headaches
of identifying wrongdoers online, from those who swap copyrighted
content to hackers who attack the network itself.
The assumptions made by the Internet's framers and embedded in the
network--that most problems could be solved later and by others, and
that those others themselves would be interested in solving rather
than creating problems--arose naturally within the research
environment that gave birth to the Internet.
But the network managed an astonishing leap as it continued to work
when expanded into the general populace, one which did not share the
worldview that informed the engineers' designs. Indeed, it not only
continued to work, but experienced spectacular growth in the uses to
which it was put.
Chapter 3, Cybersecurity and the Generative Dilemma
===================================================
The university workstations of 1988 were generative: their users
could write new code for them or install code written by others. The
Morris worm was the first large-scale demonstration of a
vulnerability of generativity: even in the custody of trained
administrators, such machines could be commandeered and reprogrammed,
and, if done skillfully, their users would probably not even notice.
As a postmortem to the Morris worm incident, the Internet Engineering
Task Force, the far-flung, unincorporated group of engineers who work
on Internet standards and who have defined its protocols through a
series of formal "request for comments" documents, or RFCs, published
informational RFC 1135, titled "The Helminthiasis of the Internet."
RFC 1135 was titled and written with whimsy, echoing reminiscences of
the worm as a fun challenge. The RFC celebrated that the original
"old boy" network of "UNIX system wizards" was still alive and well
despite the growth of the Internet: teams at university research
centers put their heads together--on conference calls as well as over
the Internet--to solve the problem. After describing the technical
details of the worm, the document articulated the need to instill and
enforce ethical standards as new people (mostly young computer
scientists like Morris) signed on to the Internet.
RFC 1135
<
gopher://gopher.quux.org/0/Computers/Standards%20and%20Specs/RFC/docs/
rfc1135.txt>
Urging users to patch their systems and asking hackers to behave more
maturely might, in retrospect, seem naïve. To understand why these
were the only concrete steps taken to prevent another worm
incident--even a catastrophically destructive one--one must
understand just how deeply computing architectures, both then and
now, are geared toward flexibility rather than security, and how
truly costly it would be to retool them.
Generative systems are built on the notion that they are never fully
complete, that they have many uses yet to be conceived of, and that
the public can be trusted to invent and share good uses. Multiplying
breaches of that trust can threaten the very foundations of the
generative system.
The burgeoning gray zone of software explains why the most common
responses to the security problem cannot solve it. ...the
fundamental problem is that the point of a PC--regardless of its
OS--is that its users can easily reconfigure it to run new software
from anywhere.
The Internet Engineering Task Force's RFC 1135 on the Morris worm
closed with a section titled "Security Considerations." This section
is the place in a standards document for a digital environmental
impact statement--a survey of possible security problems that could
arise from deployment of the standard. RFC 1135's security
considerations section was one sentence: "If security considerations
had not been so widely ignored in the Internet, this memo would not
have been possible."
What does that sentence mean? ...if the Internet had been designed
with security as its centerpiece, it would never have achieved the
kind of success it was enjoying, even as early as 1988.
Part 2, After the Stall
=======================
Our information technology ecosystem functions best with generative
technology at its core. A mainstream dominated by non-generative
systems will harm innovation as well as some important individual
freedoms and opportunities for self-expression. However, generative
and non-generative models are not mutually exclusive. They can
compete and intertwine within a single system.
Chapter 4, The Generative Pattern
=================================
Generativity is a system's capacity to produce unanticipated change
through unfiltered contributions from broad and varied audiences.
What makes something generative? There are five principal factors at
work:
* Leverage: The more a system can do, the more capable it is of
producing change.
* Adaptability: [Technology that] can be endlessly diverted to new
tasks not counted on by... original makers.
* Ease of mastery: The more useful a technology is to both the
neophyte and to the expert, the more generative it is.
* Accessibility: The easier it is to obtain access to the
technology, along with the tools and information necessary to
achieve mastery of it, the more generative it is.
* Transferability: How easily changes in the technology can be
conveyed to others.
Generative tools are not inherently better than their non-generative
("sterile") counterparts.
The more that the five qualities are maximized, the easier it is for
a system or platform to welcome contributions from outsiders as well
as insiders.
Generative systems facilitate change.
Generativity's benefits can be grouped more formally as at least two
distinct goods, one deriving from unanticipated change, and the other
from inclusion of large and varied audiences. The first good is its
innovative output: new things that improve people's lives. The
second good is its participatory input, based on a belief that a life
well lived is one in which there is opportunity to connect to other
people, to work with them, and to express one's own individuality
through creative endeavors.
Non-generative systems can grow and evolve, but their growth is
channeled through their makers...
If one values innovation, it might be useful to try to figure out how
much disruptive innovation remains in a particular field or
technology. For mature technologies, perhaps generativity is not as
important: the remaining leaps, such as that which allows transistors
to be placed closer and closer together on a chip over time without
fundamentally changing the things the chip can do, will come from
exploitative innovation or will necessitate well-funded research
through institutional channels.
It may well be that, in the absence of broad-based technological
accessibility, there would eventually have been the level of
invention currently witnessed in the PC and on the Internet. Maybe
AT&T would have invented the answering machine on its own, and maybe
AOL or CompuServe would have agreed to hyperlink to one another's
walled gardens. But the hints we have suggest otherwise:
less-generative counterparts to the PC and the Internet--such as
standalone word processors and proprietary information services--had
far fewer technological offerings, and they stagnated and then failed
as generative counterparts emerged.
The joy of being able to be helpful to someone--to answer a question
simply because it is asked and one knows a useful answer, to be part
of a team driving toward a worthwhile goal--is one of the best
aspects of being human, and our information technology architecture
has stumbled into a zone where those qualities can be elicited and
affirmed for tens of millions of people.
Generative technologies need not produce forward progress, if by
progress one means something like increasing social welfare. Rather,
they foment change. ... To use an evolutionary metaphor, they
encourage mutations, branchings away from the status quo--some that
are curious dead ends, others that spread like wildfire. They
invite disruption--along with the good things and bad things that can
come with such disruption.
The paradox of generativity is that with an openness to unanticipated
change, we can end up in bad--and non-generative--waters.
Chapter 5, Tethered Appliances, Software as Service, and Perfect Enforcement
=============================================================================
The most likely reactions to PC and Internet failures brought on by
the proliferation of bad code, if they are not forestalled, will be
at least as unfortunate as the problems themselves.
A shift to tethered appliances and locked-down PCs will have a ripple
effect on long-standing cyberlaw problems, many of which are
tugs-of-war between individuals with a real or perceived injury from
online activity and those who wish to operate as freely as possible
in cyberspace.
As legal systems experienced the first wave of suits arising from use
of the Internet, scholars such as Lawrence Lessig and Joel Reidenberg
emphasized that code could be law. In this view, the software we use
shapes and channels our online behavior as surely as--or even more
surely and subtly than--law itself. Restrictions can be enforced by
the way a piece of software operates.
If regulators can induce certain alterations in the nature of
Internet technologies that others could not undo or widely
circumvent, then many of the regulatory limitations occasioned by the
Internet would evaporate. Lessig and others have worried greatly
about such potential changes, fearing that blunderbuss technology
regulation by overeager regulators will intrude on the creative
freedom of technology makers and the civic freedoms of those who use
the technology.
Appliances become contingent: rented instead of owned, even if one
pays up front for them, since they are subject to instantaneous
revision.
The law as we have known it has had flexible borders. This
flexibility derives from prosecutorial and police discretion and from
the artifice of the outlaw. When code is law, however, execution is
exquisite, and law can be self-enforcing. The flexibility recedes.
Mobile phones can be reprogrammed at a distance, allowing their
microphones to be secretly turned on even when the phone is powered
down. All ambient noise and conversation can then be continuously
picked up and relayed back to law enforcement authorities, regardless
of whether the phone is being used for a call.
When a regulator makes mistakes in the way it construes or applies a
law, a stronger ability to compel compliance means compliance with
all mandates, even those that result from mistaken interpretations.
Gaps in translation may
also arise between a legal mandate and its technological
manifestation. This is especially true when technological design is
used as a preemptive measure.
Law professor Meir Dan-Cohen describes law as separately telling
people how to behave and telling judges what penalties to impose
should people break the law. In more general terms, he has observed
that law comprises both conduct rules and decision rules. There is
some disconnect between the two: people may know what the law
requires without fully understanding the ramifications for breaking
it. This division--what he calls an "acoustic separation"--can be
helpful: a law can threaten a tough penalty in order to ensure that
people obey it, but then later show unadvertised mercy to those who
break it. If the mercy is not telegraphed ahead of time, people will
be more likely to follow the law, while still benefiting from a
lesser penalty if they break it and have an excuse to offer, such as
duress.
Perfect enforcement collapses the public understanding of the law
with its application, eliminating a useful interface between the
law's terms and its application.
Generative networks like the Internet can be partially controlled,
and there is important work to be done to enumerate the ways in which
governments try to censor the Net. But the key move to watch is a
sea change in control over the endpoint: lock down the device, and
network censorship and control can be extraordinarily reinforced.
Chapter 6, The Lessons of Wikipedia
===================================
The Dutch city of Drachten has undertaken an unusual experiment in
traffic management. The roads serving forty-five thousand people are
"verkeersbordvrij": free of nearly all road signs. Drachten is one
of several European test sites for a traffic planning approach called
"unsafe is safe." The city has removed its traffic signs, parking
meters, and even parking spaces. The only rules are that drivers
should yield to those on their right at an intersection, and that
parked cars blocking others will be towed.
The result so far is counterintuitive: a dramatic improvement in
vehicular safety. Without signs to obey mechanically (or, as studies
have shown, disobey seventy percent of the time), people are forced
to drive more mindfully--operating their cars with more care and
attention to the surrounding circumstances. They communicate more
with pedestrians, bicyclists, and other drivers using hand signals
and eye contact. They see other drivers rather than other cars.
A small lesson of the verkeersbordvrij experiment is that standards
can work better than rules in unexpected contexts. A larger lesson
has to do with the traffic expert's claim about law and human
behavior: the more we are regulated, the more we may choose to hew
only and exactly to the regulation or, more precisely, to what we can
get away with... This observation is less about the difference
between rules and standards than it is about the source of mandates:
some may come from a process that a person views as alien, while
others arise from a process in which the person takes an active part.
More generally, order may remain when people see themselves as a part
of a social system, a group of people--more than utter strangers but
less than friends--with some overlap in outlook and goals. Whatever
counts as a satisfying explanation, we see that sometimes the absence
of law has not resulted in the absence of order.
Part 3, Solutions
=================
This book has explained how the Internet's generative characteristics
primed it for extraordinary success--and now position it for failure.
The response to the failure will most likely be sterile tethered
appliances and Web services that are contingently generative, if
generative at all. The trajectory is part of a larger pattern. If
we can understand the pattern and what drives it, we can try to avoid
an end that eliminates most disruptive innovation while facilitating
invasive and all-too-inexpensive control by regulators.
So what to do to stop this future? We need a strategy that blunts
the worst aspects of today's popular generative Internet and PC
without killing these platforms' openness to innovation. Give users
a reason to stick with the technology and the applications that have
worked so surprisingly well--or at least reduce the pressures to
abandon it--and we may halt the movement toward a nongenerative
digital world. This is easier said than done, because our familiar
toolkits for handling problems are not particularly attuned to
maintaining generativity.
The key to threading the needle between needed change and undue
closure can be forged from understanding the portability of both
problems and solutions among the Internet's layers. We have seen
that generativity from one layer can recur to the next.
If generativity and its problems flow from one layer to another, so
too can its solutions.
...two approaches that might save the generative spirit of the Net,
or at least keep it alive for another interval. The first is to
reconfigure and strengthen the Net's experimentalist architecture to
make it fit better with its now-mainstream home. The second is to
create and demonstrate the tools and practices by which relevant
people and institutions can help secure the Net themselves instead of
waiting for someone else to do it.
Conclusion
==========
Nicholas Negroponte, former director of the MIT Media Lab, announced
the One Laptop Per Child (OLPC) project at the beginning of 2005.
The project aims to give one hundred million hardy, portable
computers to children in the developing world. The laptops, called
XOs, are priced around $100, and they are to be purchased by
governments and given to children through their schools.
Yet OLPC is about revolution rather than evolution, and it embodies
both the promise and challenge of generativity. The project's
intellectual pedigree and structure reveal an enterprise of
breathtaking theoretical and logistical ambition.
But the XO completely redesigns today's user interfaces from the
ground up. Current PC users who encounter an XO have a lot to
unlearn.
XO is but the most prominent and well-funded of a series of
enterprises attempting to bridge the digital divide.
[But... I read that OLPC was designed by first-world elite
intellectuals without significant participation from the people to
whose governments it was marketed. I also read that it marketed a
technology solution in search of problems, when there were plenty of
more pressing real-world problems to be addressed. In other words,
it was part of the tech bubble glamor.
> OLPC's failure can be attributed to its lack of understanding of
> local communities and their day-to-day lives.
The Failure of OLPC
<
https://theurgetohelp.com/podcasts/the-failure-of-one-laptop-per-child/>
]
author: Zittrain, Jonathan (Jonathan L.), 1969-
detail: <
gopher://gopherpedia.com/0/
The_Future_of_the_Internet_and_How_to_Stop_It>
LOC: I57 Z53
source: <
https://blogs.harvard.edu/futureoftheinternet/download/>
tags: ebook,history,non-fiction,technical
title: The Future Of The Internet and How to Stop It
Tags
====
ebook
<
gopher://tilde.pink/1/~bencollver/log/tag/ebook/>
history
<
gopher://tilde.pink/1/~bencollver/log/tag/history/>
non-fiction
<
gopher://tilde.pink/1/~bencollver/log/tag/non-fiction/>
technical
<
gopher://tilde.pink/1/~bencollver/log/tag/technical/>