BASIC used to be on every computer a child touched -- but
today there's no easy way for kids to get hooked on
programming.
By David Brin
*
For three years -- ever since my son Ben was in fifth grade -- he
and I have engaged in a quixotic but determined quest: We've searched
for a simple and straightforward way to get the introductory
programming language BASIC to run on either my Mac or my PC.
Why on Earth would we want to do that, in an era of glossy
animation-rendering engines, game-design ogres and sophisticated
avatar worlds? Because if you want to give young students a grounding
in how computers actually work, there's still nothing better than
a little experience at line-by-line programming.
Only, quietly and without fanfare, or even any comment or notice
by software pundits, we have drifted into a situation where almost
none of the millions of personal computers in America offers a
line-programming language simple enough for kids to pick up fast.
Not even the one that was a software lingua franca on nearly all
machines, only a decade or so ago. And that is not only a problem
for Ben and me; it is a problem for our nation and civilization.
Oh, today's desktops and laptops offer plenty of other fancy things
-- a dizzying array of sophisticated services that grow more dazzling
by the week. Heck, I am part of that creative spasm.
Only there's a rub. Most of these later innovations were brought
to us by programmers who first honed their abilities with
line-programming languages like BASIC. Yes, they mostly use higher
level languages now, stacking and organizing object-oriented services,
or using other hifalutin processes that come prepackaged and ready
to use, the way an artist uses prepackaged paints. (Very few
painters still grind their own pigments. Should they?)
And yet the thought processes that today's best programmers learned
at the line-coding level still serve these designers well. Renowned
tech artist and digital-rendering wizard Sheldon Brown, leader of
the Center for Computing in the Arts, says: "In my Electronics for
the Arts course, each student built their own single board computer,
whose CPU contained a BASIC ROM [a chip permanently encoded with
BASIC software]. We first did this with 8052s and then with a chip
called the BASIC Stamp. The PC was just the terminal interface to
these computers, whose programs would be burned into flash memory.
These lucky art students were grinding their own computer architectures
along with their code pigments -- along their way to controlling
robotic sculptures and installation environments."
But today, very few young people are learning those deeper patterns.
Indeed, they seem to be forbidden any access to that world at all.
And yet, they are tantalized! Ben has long complained that his math
textbooks all featured little type-it-in-yourself programs at the
end of each chapter -- alongside the problem sets -- offering the
student a chance to try out some simple algorithm on a computer.
Usually, it's an equation or iterative process illustrating the
principle that the chapter discussed. These "TRY IT IN BASIC"
exercises often take just a dozen or so lines of text. The aim is
both to illustrate the chapter's topic (e.g. statistics) and to
offer a little taste of programming.
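To give a flavor of it, here is the sort of thing I mean -- not
quoted from any particular textbook, just a made-up exercise in the
same spirit, iterating a simple compound-growth equation:

    10 REM TRY IT IN BASIC: COMPOUND GROWTH
    20 LET B = 100
    30 FOR Y = 1 TO 10
    40 LET B = B * 1.05
    50 PRINT "YEAR "; Y; " BALANCE "; B
    60 NEXT Y
    70 END

Seven lines, one FOR loop, one line of arithmetic, and the chapter's
equation comes alive on the screen.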
Only no student tries these exercises. Not my son or any of his
classmates. Nor anybody they know. Indeed, I would be shocked if
more than a few dozen students in the whole nation actually type
in those lines that are still published in countless textbooks
across the land. Those who want to (like Ben) simply cannot.
Now, I have been complaining about this for three years. But whenever
I mention the problem to some computer industry maven at a conference
or social gathering, the answer is always the same: "There are still
BASIC programs in textbooks?"
At least a dozen senior Microsoft officials have given me the exact
same response. After taking this to be a symptom of cluelessness
in the textbook industry, they then talk about how obsolete BASIC
is, and how many more things you can do with higher-level languages.
"Don't worry," they invariably add, "the newer textbooks won't have
any of those little BASIC passages in them."
All of which is absolutely true. BASIC is actually quite tedious
and absurd as a tool for accomplishing the vast array of vivid and
ambitious goals that are typical of a modern programmer. Clearly, any kid who
wants to accomplish much in the modern world would not use it for
very long. And, of course, it is obvious that newer texts will
abandon "TRY IT IN BASIC" as a teaching technique, if they haven't
already.
But all of this misses the point. Those textbook exercises were
easy, effective, universal, pedagogically interesting -- and nothing
even remotely like them can be done with any language other than
BASIC. Typing in a simple algorithm yourself, seeing exactly how
the computer calculates and iterates in a manner you could duplicate
with pencil and paper -- say, running an experiment in coin flipping,
or making a dot change its position on a screen, propelled by math
and logic, and only by math and logic: All of this is priceless.
As it was priceless 20 years ago. Only 20 years ago, it was physically
possible for millions of kids to do it. Today it is not.
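That coin-flipping experiment, for instance, is only this big (a
sketch; the exact form of RND varies a little from one BASIC dialect
to another):

    10 REM FLIP A COIN 100 TIMES
    20 LET H = 0
    30 FOR I = 1 TO 100
    40 IF RND(1) < 0.5 THEN H = H + 1
    50 NEXT I
    60 PRINT "HEADS:"; H, "TAILS:"; 100 - H
    70 END

Run it a few times, tally a few iterations yourself on paper, and a
kid sees statistics happening instead of merely being asserted.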
In effect, we have allowed a situation to develop that is like a
civilization devouring its seed corn. If an enemy had set out to
do this to us -- quietly arranging so that almost no school child
in America can tinker with line coding on his or her own -- any
reasonably patriotic person would have called it an act of war.
Am I being overly dramatic? Then consider a shift in perspective.
First ponder the notion of programming as a series of layers. At
the bottom-most level is machine code. I showed my son the essentials
on scratch paper, explaining the roots of Alan Turing's "general
computer" and how it was ingeniously implemented in the first
four-bit integrated processor, Intel's miraculous 1971 4004 chip,
unleashing a generation of nerdy guys to move bits around in little
clusters, adding and subtracting clumps of ones and zeroes, creating
the first calculators and early desktop computers like the legendary
Altair.
This level of coding is still vital, but only in the realm of
specialists at the big CPU houses. It is important for guys like
Ben to know about machine code -- that it's down there, like DNA
in your cells -- but a bright kid doesn't need to actually do it,
in order to be computer-literate. (Ben wants to, though. Anyone
know a good kit?)
The layer above that is often called assembler, though there are
various ways that user intent can be interpreted down to the
bit level without actually flicking a series of on-off switches.
Sets of machine instructions are grouped, assembled and correlated
with (for example) ASCII-coded commands. Some call this the "boringest"
level. Think of the hormones swirling through your body. Even a
glimpse puts me to sleep. But at least I know that it is there.
The third layer of this cake is the operating system of your computer.
Call it BIOS and DOS, along with a lot of other names. This was
where guys like Gates and Wozniak truly propelled a whole industry
and way of life, by letting the new desktops communicate with their
users, exchange information with storage disks and actually show
stuff on a screen. Cool.
Meanwhile, the same guys were offering -- at the fourth layer -- a
programming language that folks could use to create new software
of their very own. BASIC was derived from academic research tools
like beloved old FORTRAN (in which my doctoral research was coded
onto punched paper cards, yeesh). It was crude. It was dry. It was
unsuitable for the world of the graphic user interface. BASIC had
a lot of nasty habits. But it liberated several million bright minds
to poke and explore and aspire as never before.
The "scripting" languages that serve as entry-level tools for today's
aspiring programmers -- like Perl and Python -- don't make this
experience accessible to students in the same way. BASIC was close
enough to the algorithm that you could actually follow the reasoning
of the machine as it made choices and followed logical pathways.
Repeating this point for emphasis: You could even do it all yourself,
following along on paper, for a few iterations, verifying that the
dot on the screen was moving by the sheer power of mathematics,
alone. Wow! (Indeed, I would love to sit with my son and write
"Pong" from scratch. The rule set -- the math -- is so simple. And
he would never see the world the same, no matter how many higher-level
languages he then moves on to.)
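Here, for instance, is the heart of that game -- just the
bouncing-ball arithmetic, sketched in plain line-numbered BASIC with
the dot printed as text, so it runs in almost any dialect (the
paddles and the glory are left to the student):

    10 REM BOUNCING DOT: POSITION PLUS VELOCITY
    20 LET X = 1 : LET V = 1
    30 FOR T = 1 TO 30
    40 LET X = X + V
    50 IF X >= 20 OR X <= 1 THEN V = -V
    60 PRINT TAB(X); "*"
    70 NEXT T
    80 END

Two variables and one reversal rule, and the asterisk saws back and
forth across the page by sheer arithmetic.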
The closest parallel I can think of is the WWII generation of my
father -- guys for whom the ultra in high tech was automobiles.
What fraction of them tore apart jalopies at home? Or at least
became adept at diagnosing and repairing the always fragile machines
of that era? One result of that free and happy spasm of techie
fascination was utterly strategic. When the "Arsenal of Democracy"
began churning out swarms of tanks and trucks and jeeps, these were
sent to the front and almost overnight an infantry division might
be mechanized, in the sure and confident expectation that there
would be thousands of young men ready (or trainable) to maintain
these tools of war. (Can your kid even change the oil nowadays? Or
a tire?)
The parallel technology of the '70s generation was IT. Not every
boomer soldered an Altair from a kit, or mastered the arcana of
dBASE. But enough of them did so that we got the Internet and Web.
We got Moore's Law and other marvels. We got a chance to ride another
great technological wave.
So, what's the parallel hobby skill today? What tech-marvel has
boys and girls enthralled, tinkering away, becoming expert in
something dazzling and practical and new? Shooting ersatz aliens
in "Halo"? Dressing up avatars in "The Sims"? Oh sure, there's
creativity in creating cool movies and Web pages. But except for
the very few who will make new media films, do you see a great wave
of technological empowerment coming out of all this?
OK, I can hear the sneers. Are these the rants of a grouchy old
boomer? Feh, kids today! (And get the #$#*! off my lawn!)
Fact is, I just wanted to give my son a chance to sample some of
the wizardry standing behind the curtain, before he became lost in
the avatar-filled and glossy-rendered streets of Oz. Like the hero
in "TRON," or "The Matrix," I want him to be a user who can see the
lines that weave through the fabric of cyberspace -- or at least
know some history about where it all came from. At the very minimum,
he ought to be able to type those examples in his math books and
use the computer the way it was originally designed to be used: to
compute.
Hence, imagine my frustration when I discovered that it simply could
not be done.
Yes, yes: For three years I have heard all the rationalized answers.
No kid should even want BASIC, they say. There are higher-level
languages like C++ (Ben is already -- at age 14 -- on page 200 of
his self-teaching C++ book!) and yes, there are better education
programs like Logo. Hey, what about Visual Basic! Others suggested
downloadable versions like q-basic, y-basic, alphabetabasic...
Indeed, I found one that was actually easy to download, easy to
turn on, and that simply let us type in some of those little example
programs, without demanding that we already be manual-chomping
fanatics in order to even get started using the damn thing. Chipmunk
Basic for the Macintosh actually started right up and let us have
a little clean, algorithmic fun. Extremely limited, but helpful.
All of the others, every last one of them, were either too high-level
(missing the whole point!) or else far, far too onerous to figure
out or use. Certainly not meant to be turn-key usable by any junior
high school student. Appeals for help online proved utterly futile.
Until, at last, Ben himself came up with a solution. An elegant
solution of startling simplicity. Essentially: If you can't beat
'em, join 'em.
While trawling through eBay one day, he came across listings for
archaic 1980s-era computers like the Apple II. "Say, Dad, didn't
you write your first novel on one of those?" he asked.
"Actually, my second. 'Startide Rising.' On an Apple II with Integer
Basic and a serial number in five digits. It got stolen, pity. But
my first novel, 'Sundiver,' was written on this clever device called
a typewrit --"
"Well, look, Dad. Have you seen what it costs to buy one of those
old Apples online, in its original box? Hey, what could we do with
it?"
"Huh?" I stared in amazement.
Then, gradually, I realized the practical possibilities.
Let's cut to the chase. We did not wind up buying an Apple II.
Instead (for various reasons) we bought a Commodore 64 (in original
box) for $25. It arrived in good shape. It took us maybe three
minutes to attach an old TV. We flicked the power switch ... and
up came a command line. In BASIC.
Uh. Problem solved?
I guess. At least far better than any other thing we've tried!
We are now typing in programs from books, having fun making dots
move (and thus knowing why the dots move, at the command of math,
and not magic). There are still problems, like getting an operating
system to make the 1541 disk drive work right. Most of the old
floppies are unreadable. But who cares? (Ben thinks that loading
programs to and from tape is so cool. I gurgle and choke remembering
my old Sinclair ... but whatever.)
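For the record, this is all it takes on the 64 to push a dot along
by naked arithmetic -- assuming the machine's default memory layout,
where the screen is a row of bytes starting at address 1024, forty
per line, with color memory at 55296 (screen code 81 is a ball, 32
a blank):

    10 REM WALK A BALL ACROSS THE TOP ROW
    20 FOR X = 0 TO 39
    30 POKE 55296 + X, 1 : REM COLOR RAM: WHITE
    40 POKE 1024 + X, 81 : REM DRAW THE BALL
    50 FOR D = 1 TO 200 : NEXT D : REM DELAY
    60 POKE 1024 + X, 32 : REM ERASE IT
    70 NEXT X

No library, no driver, no magic: add one to X and the ball moves.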
What matters is that we got over a wretched educational barrier.
And now Ben can study C++ with a better idea where it all came from.
In the nick of time.
Problem solved? Again, at one level.
And yet, can you see the irony? Are any of the masters of the
information age even able to see the irony?
This is not just a matter of cheating a generation, telling them
to simply be consumers of software, instead of the innovators that
their uncles were. No, this goes way beyond that. In medical school,
professors insist that students have some knowledge of chemistry
and DNA before they are allowed to cut open folks. In architecture,
you are at least exposed to some physics.
But in the high-tech, razzle-dazzle world of software? According
to the masters of IT, line coding is not a deep-fabric topic worth
studying. Not a layer that lies beneath, holding up the world of
object-oriented programming. Rather, it is obsolete! Or, at best,
something to be done in Bangalore. Or by old guys in their 50s,
guaranteeing them job security, the same way that COBOL programmers
were all dragged out of retirement and given new cars full of Jolt
Cola during the Y2K crisis.
All right, here's a challenge. Get past all the rationalizations.
(Because that is what they are.) It would be trivial for Microsoft
to provide a version of BASIC that kids could use, whenever they
wanted, to type in all those textbook examples. Maybe with some
cool tutorial suites to guide them along, plus samples of higher-order
tools. It would take up a scintilla of disk space and maybe even
encourage many of them to move on up. To (for example) Visual Basic!
Or else, hold a big meeting and choose another lingua franca, so
long as it can be universal enough to use in texts, the way that
BASIC was.
Instead, we are told that "those textbooks are archaic" and that
students should be doing "something else." Only then watch the
endless bickering over what that "something else" should be -- with
the net result that there is no lingua franca at all, no "basic"
language so common that textbook publishers can reliably use it as
a pedagogical aid.
The textbook writers and publishers aren't the ones who are obsolete,
out-of-touch and wrong. It is the people who have yanked the rug out
from under teachers and students all across the land.
Let me reiterate. Kids are not doing "something else" other than
BASIC. Not millions of them. Not hundreds or tens of thousands of
them. Hardly any of them, in fact. And it is not their fault. Some
of them, like my son, really want to. But they can't. Not without
turning into time travelers, the way we did, by giving up (briefly)
on the present and diving into the past -- using the tools of a
bygone era to learn more about tomorrow. (I also plan to teach him
how to change the oil and fix a tire!)
If this is a test, then Ben and I passed it, ingeniously. In contrast,
Microsoft and Apple and all the big-time education-computerizing
reformers of the MIT Media Lab are failing, miserably. For all of
their high-flown education initiatives (like the "$100 laptop"),
they seem bent on providing information consumption devices, not
tools that teach creative thinking and technological mastery.
Web access for the poor would be great. But machines that kids out
there can understand and program themselves? To those who shape our
technical world, the notion remains not just inaccessible, but
strangely inconceivable.