Subject: RISKS DIGEST 13.54
REPLY-TO: [email protected]

RISKS-LIST: RISKS-FORUM Digest  Wednesday 3 June 1992  Volume 13 : Issue 54

       FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

 Contents:
Risks of Space Junk (PGN)
Girl killed in automatic car window (Ian Spalding)
Pepsi promotion error blamed on computer glitch (Roland Ouellette)
Voter-registration computers know best (Les Earnest)
(more) Social Security Numbers -- billing overloads (Mark Bergman)
Reverse Passwords? (Brinton Cooper)
Risks of being a computer-font company president? (PGN)
Re: Shuttle computer miscomputes rendezvous (Randall Davis)
Re: The risks of telling the truth about viruses (Theodore Ts'o)
Re: Yellow slime (PGN)
Re: Critical technologies (Martyn Thomas)
Re: Payphone Xenophobia (roeber via Darren Alex Griffiths)
Re: Not enough trained computer experts (Brinton Cooper and Fred Cohen)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to [email protected], with relevant, substantive
"Subject:" line.  Others may be ignored!  Contributions will not be ACKed.
The load is too great.  **PLEASE** INCLUDE YOUR NAME & INTERNET FROM: ADDRESS,
especially .UUCP folks.  REQUESTS please to [email protected].
Vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 13, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*<CR>" gives directory; "bye<CR>" logs out.
The COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.
ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues
of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.
*** For information regarding the availability of this digest via FAX, ***
*** please send an inquiry to [email protected]. ***

----------------------------------------------------------------------

Date: Tue, 2 Jun 92 18:55:39 PDT
From: "Peter G. Neumann" <[email protected]>
Subject: Risks of Space Junk

"Space Junk May Threaten NASA (Space) Station"

A NY Times item seen in the San Francisco Chronicle (2 Jun 1992, p.A7) quotes
the current issue of Space News, which reports that NASA experts have adopted
new calculations on collisions between space junk and the space station, which
suggest that the risks are much greater than previously thought.  They estimate
that there may be 30,000 pieces of debris, mostly from old spacecraft that have
broken up.  Adding protective shielding might significantly raise the cost,
already estimated at $30B to $40B.

------------------------------

Date: Tue, 2 Jun 92 14:10:58 BST
From: Ian Spalding <[email protected]>
Subject: Girl killed in automatic car window

The following appeared in `The Guardian', on 2 Jun 1992:

 Fiat, makers of the Tipo in which two-year-old Lucinda Richardson died
 when an automatic window shut, said last night that one of the front
 doors has to be open and the switch continuously depressed to close a
 front window when the ignition key was removed.

 Lucinda's father, Douglas, said she must have stepped on the switch
 while playing in the car.

 A statement issued by Fiat's UK headquarters at Slough, Berkshire,
 said: "The system ... meets German standards, to which all
 manufacturers conform."  [..]

 Labour called on the Government to issue immediate regulations to
 compel car manufacturers to fit vehicles with safety devices which
 automatically cut out motors on electric windows as soon as pressure
 is applied.  The Royal Society for the Prevention of Accidents urged
 manufacturers to examine the design of electric windows.

I don't believe that this particular window system is computer controlled ...,
but I do know that several of the quality cars on the market have many of their
body functions, including windows, under control of a central processor.
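
For the cars whose body functions really are run by a central processor, the
cut-out being demanded amounts to a few lines of interlock logic.  Here is a
hedged sketch in Python (the sensor names and the current threshold are
invented, and real body-control firmware is of course not written this way)
of the kind of check involved:

    # Hedged illustration only; names and threshold are invented.
    PINCH_CURRENT_LIMIT_A = 3.0    # an obstructed window stalls the motor and
                                   # draws more current than normal travel

    def window_motor_command(switch_held, door_open, key_present, motor_current_a):
        # The interlock Fiat described: with the key out, closing requires a
        # front door to be open and the switch to be held continuously.
        if not key_present and not door_open:
            return "stop"
        if not switch_held:
            return "stop"
        # The cut-out being demanded: stop (or reverse) the moment the motor
        # current suggests something is caught in the window.
        if motor_current_a > PINCH_CURRENT_LIMIT_A:
            return "reverse"
        return "close"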

Ian Spalding

------------------------------

Date: Wed, 3 Jun 92 10:46:59 EDT
From: Roland Ouellette <[email protected]>
Subject: Pepsi promotion error blamed on computer glitch

  [Roland called our attention to an item in The Wall Street Journal,
  1-Jun-92, p. B6B, which I have abstracted as follows.  PGN]

Computer Glitch - 500,000 Pepsi bottlecaps turn into "winners"

PepsiCo's Philippine bottling franchise has had a promotion with numbers
printed inside bottle caps.  A winning number was supposed to pay off up to
$38,000.  A "computer glitch" was blamed for "349" being announced as the
winning number, even though that number appeared on 500,000 bottle caps.
Several thousand people accepted PepsiCo's offer of $19 to each "winner", but
about 4,000 people are seeking governmental action for fraud.

    [This might lead to a new tack in creative advertising -- planning ahead
    of time to have an effort backfire so as not to have to pay off, with
    the a priori intent of blaming it "on the computer".  PGN]

------------------------------

Date: Tue, 2 Jun 92 15:03:02 -0700
From: Les Earnest <[email protected]>
Subject: Voter-registration computers know best

Is anyone with a given name and date of birth who lives in the same county
necessarily the same person?  That is what the Santa Clara County (California)
voter registration computer believes, as reported in today's San Jose Mercury
News.

John P. Taylor, a San Jose patent lawyer, has voted in every election since
1979, but he didn't receive his voting materials in the mail this year even
though his wife, two sons, and daughter did.  On checking with the registrar
of voters he discovered that another John P. Taylor, who happened to have the
same date of birth, had recently registered in nearby Sunnyvale -- both were
born on Christmas 56 years ago.  The computer "knew" that a person with the
same name and date of birth was the same person and so treated the transaction
as a change of address.  The computer records showed different places of birth,
but it apparently wasn't programmed to check that.

While county authorities scratched their heads over this improbable event and
what to do about it, the disenfranchised John P. Taylor telephoned the other
one and worked out a quick fix.  It seems that their middle names are Paul and
Phillip, so they agreed to re-register with their full names.

What do you think are the chances that this programming blunder will be fixed
before it claims its next victim?  In the meantime, voters with common last
names might consider registering their full names in order to reduce the
chances of being victimized by this kind of programming foolishness.
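
To make the blunder concrete, here is a small sketch in Python (the county's
actual software and record layout are unknown; the field names are
hypothetical) of the loose matching rule apparently at work, and the slightly
stricter check that would have prevented the false match:

    # Hedged illustration only -- not the registrar's actual code.

    def same_person_loose(a, b):
        # The rule apparently applied: same name and same date of birth within
        # the county => same voter, so a new registration becomes a
        # "change of address".
        return a["name"] == b["name"] and a["dob"] == b["dob"]

    def same_person_stricter(a, b):
        # Checking one more field already on file (place of birth) breaks the
        # false match in the Taylor case.
        return (a["name"] == b["name"] and a["dob"] == b["dob"] and
                a["birthplace"] == b["birthplace"])

    existing = {"name": "John P. Taylor", "dob": "1935-12-25", "birthplace": "A"}
    new_reg  = {"name": "John P. Taylor", "dob": "1935-12-25", "birthplace": "B"}

    assert same_person_loose(existing, new_reg)         # spurious match
    assert not same_person_stricter(existing, new_reg)  # treated as a new voter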

Les Earnest, 12769 Dianne Drive, Los Altos Hills, CA 94022     415 941-3984
   [email protected]                 ... decwrl!cs.Stanford.edu!Les

------------------------------

Date: Thu, 28 May 92 0:25:24 EDT
From: [email protected] (Mark Bergman)
Subject: (more) Social Security Numbers -- billing overloads

INDIANAPOLIS (AP) - Claudia Braun is only 9 months old, but already her credit
rating is shot. Indiana University Medical Center computers show that the
youthful Miss Braun has run up bills of more than $3,000 for medical services -
including pregnancy tests and dental care.  Since late last year, the baby and
her parents, Jean and Thomas Braun, have received about 15 bills, all for
services rendered to people they don't know. Each bill listed Claudia as the
person responsible for ensuring it got paid.
       "I'd love to see them take me to court and carry her into court and
say, `Hi, this is Miss Claudia Braun,"' Thomas Braun said. "It's hilarious when
you think about it.  But it's getting to be an old joke," added Mrs. Braun, a
registered nurse at Wishard Memorial Hospital.
       Robert D. Wehling, associate director of fiscal affairs at the medical
center, said the problem began when Claudia entered the world without a Social
Security number.
       The university's computer program, which had just been installed,
registered Claudia as 000-00-0000. Later, when other patients who didn't know
their Social Security numbers were entered, they got 000-00-0000, too.  The
computer decided that all the patients with the zeros must be related, and put
Claudia in charge of paying the bills. At last count, there were 37 patients
linked to Claudia's account.
       Medical center officials thought they had the Braun problem fixed in
February and sent the family a letter of apology. But the computer then spit
out seven more bills.   New computer software was installed Tuesday that
Wehling said should eliminate the problem.  "We're sorry it happened," Wehling
said. "This should not happen again."

Mark Bergman 718-855-9148 {cmcl2,uupsi,uunet,apple}!panix!bergman

------------------------------

Date: Wed, 3 Jun 92 10:05:52 EDT
From: Brinton Cooper <[email protected]>
Subject: Reverse Passwords?

From the "Federal Employees News Digest," 18 May 1992:

       An employee who forgot to type a two-character code in accessing
       a computer at work that then gave him unauthorized access to a
       check generating program should not have been disciplined for
       the mistake, the Merit Systems Protection Board has said.

       The problem began when an Army depot switched certain information
       from a mainframe computer to mini-computers.  When the employee
       tried to access his normal work-related files, he ran into problems
and inadvertently created a file listing banks and companies that were
       to get checks over one million dollars...The checks never were
       actually generated, however.

It goes on to report that the Army demoted the employee and that MSPB ordered
him reinstated because the mistake was a "simple one" and because the employee
had nothing to lose.

The RISK: Often, we've discussed the risks of not using proper passwords to
prevent unauthorized access.  This is the first time that I've seen a user
required to supply a password in order that he NOT be granted access to files.
This is quite a perverse risk!
                                                _Brint

------------------------------

Date: Sun, 31 May 92 17:25:52 PDT
From: "Peter G. Neumann" <[email protected]>
Subject: Risks of being a computer-font company president?

Most of you have read about last week's kidnapping of Charles Geschke, head of
Adobe Systems, Inc., in Mountain View CA, and his successful rescue by the FBI
after four days of captivity.  We wish to express our appreciation for the way
in which the rescue was carried out, and to send our best wishes to the
Geschkes for some sort of return to a normal life.

In general, security (whether with respect to individuals, corporations,
computers, or communications) is very difficult to ensure.  In many cases the
threats are not even perceived -- let alone defended against -- until they have
flagrantly manifested themselves.  Increased awareness of the incredible gamut
of problems is vital.  But the problems are inherently never completely
solvable -- there are no guaranteed solutions.  This is by itself a very
important realization.

Technological solutions to security problems (in general) must be accepted as
just one of many approaches.  In the Geschke case, apparently the FBI planted
transmitters in the package of ransom money.  The kidnapper who picked up the
money had a transmitter sniffer that detected some but not all of the planted
transmitters.  Even there the technology seems to escalate the use of
countermeasures and counter-countermeasures, as well as escalating the risks.
PGN

------------------------------

Date: Mon, 1 Jun 92 11:29:07 edt
From: [email protected] (Randall Davis)
Subject: Re: Shuttle computer miscomputes rendezvous (Sullivan, RISKS-13.49)

   The spacewalk was [...] delayed for 1 1/2 hours because
   Endeavour's on-board computer made a mistake in plotting the
   route needed to rendezvous with the satellite.

That claim is of course false as stated.  Endeavour's on-board computer might have
gotten the wrong result because its programmers made a mistake, or perhaps the
data entry folks made a mistake, or perhaps because its fabricators made a
mistake, or even because its designers made a mistake.

The difference is not pedantry; it goes directly to the issue of what the risk
is in computers.  Saying the computer made a mistake encourages people to think
that the machine errs like people, due to wandering attention, fatigue, etc.

One result of that in turn is less attention to the real culprit: programming,
data entry, fabrication, design, or whatever else some human did wrong.
A second result is the ability to shut off constructive demands for
improvement by making it far too easy to offer the standard excuse: "It's just
those goofy computers again <shrug>; there's not much we can do about them."

There's not much immediate personal impact from this particular application of
course, but the identical issue arises all the time in response to ATMs,
insurance claims, etc., etc., ad nauseam.

------------------------------

Date: Tue, 2 Jun 92 16:40:35 -0400
From: [email protected] (Theodore Ts'o)
Subject: Re: The risks of telling the truth about viruses

We've discussed this subject before on RISKS, but I don't believe that there is
such a thing as a "benevolent virus".  If I am a user, I don't want some new
piece of software, which I normally don't run, to be automagically executed on
my machine without my being informed about it and given an option to veto it.

It all boils down to what your definition of "virus" is.  My definition of "virus"
is a piece of software which transmits itself from machine to machine without
the knowledge or permission of either a user on the system or the system
administrator of the machine.  Using this definition, I do not believe there
can be such a thing as a "benevolent virus", because I, as a system
administrator or a user, want control over what I run on my machine.  Perhaps I
am running some special software that I know will break if my operating system
gets upgraded --- I don't want an "OS Updating Virus" to go in and change my
system without my permission.  That's just wrong.

If you have some other, broader definition of "virus", which includes the
ability for the user or system administrator to veto or delay it --- I suggest
you not use the term "virus".  For one thing, the battle is lost.  People think
of viruses as bad (and I suspect most of them have a similar definition of
virus as I presented).  It's like the use of the word "hacker" --- at this
point, it's practically hopeless to try to convince people that there is such a
thing as a "good hacker"; the language has evolved to the point where the only
definition of a "hacker" is a "system cracker"; the original definition of
"super-competent programmer" has be long lost.
                                                       - Ted, (617) 253-8091

------------------------------

Date: Wed, 3 Jun 92 8:29:33 PDT
From: "Peter G. Neumann" <[email protected]>
Subject: Re: Yellow slime (RISKS-13.53)

Several people (for example, Martin Hofmann <[email protected]>) asked me why
that piece appeared in RISKS.  My answer was something like this:

 RISKS is by definition concerned with computers and related technology.
 The stage system is computer controlled, but the point is that even the
 most modern computerized installation can be brought to a halt by something
 seemingly unrelated to the computer!  Usually it is people.  This time it is
 a choice of hydraulic fluid and the decision to use the old pipes.

Incidentally, Nigel Hall <[email protected]> commented that "proper
biodegrading hydraulic fluid, which looks like yellow slime, removes anything
non-metallic in its path (especially automotive paintwork)" ...

------------------------------

Date: Tue, 2 Jun 92 9:17:35 BST
From: Martyn Thomas <[email protected]>
Subject: Re: Critical technologies

Several people have asked me for a reference for the March 1991 Report of the
National Critical Technologies Panel, which I referred to in Risks 13.52.

It has no standard reference number.  It is a report to the President of the
USA and to Congress, and is required every two years "through the year 2000"
(1990 Defence Authorisation Act [P.L. 101-189], modifying the National Science
and Technology Policy, Organisation and Priorities Act of 1976).

The Panel's address is 1101 Wilson Boulevard, Suite 1500, Arlington, Virginia,
22209. The Chair is William D Phillips.

  Martyn Thomas, Praxis plc, 20 Manvers Street, Bath BA1 1PX UK.
  Tel: +44-225-444700.   Email:   [email protected]

------------------------------

Date: Mon, 1 Jun 1992 22:53:26 GMT
From: [email protected] (Darren Alex Griffiths)
Subject: Re: Payphone Xenophobia

This is a post that I saw on comp.dcom.telecom.  I think the risks are obvious
(e.g., someone downloading to the phone a rule describing a lead slug).

[email protected] writes:

>Oh yes, I don't know why I didn't remember this immediately.
>
>Last year, at Telecom'91 here in Geneva, a company whose name I forget
>was showing a payphone that recognized coins based on a set of rules
>(weight or mass, size, etc.) programmed in a microprocessor.  So its
>first advantage is that it can be used in many countries, and can even
>(if the owner wishes) take multiple currencies.  The advantage they
>were really touting, though, was the ability to remotely call the
>phone, log in, and download new rules.  This way when a country
>introduces a new coin (as Italy did a few years ago) or replaces a
>coin (as France did last year), one does not have to replace the
>phone, or even physically visit it.

>Frederick G. M. Roeber | CERN -- European Center for Nuclear Research
>e-mail: [email protected] or [email protected] | work: +41 22 767 31 80
>r-mail: CERN/PPE, 1211 Geneva 23, Switzerland | home: +33 50 42 19 44
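
To make the risk concrete, here is a rough sketch in Python (the vendor's
actual rule format and parameters are not public; the numbers are invented) of
rule-based coin recognition, and of why the remote rule-download feature needs
a strongly authenticated channel:

    # Hedged illustration only; not any vendor's real rule set.
    # Each rule gives acceptance windows for the measured mass and diameter.
    rules = {
        "1 franc": {"mass_g": (4.3, 4.5), "diameter_mm": (23.1, 23.3)},
    }

    def classify(mass_g, diameter_mm):
        for value, r in rules.items():
            if (r["mass_g"][0] <= mass_g <= r["mass_g"][1] and
                    r["diameter_mm"][0] <= diameter_mm <= r["diameter_mm"][1]):
                return value
        return None   # unknown object: reject it

    # The advertised convenience -- call the phone and download new rules when
    # a country changes its coinage -- is also the attack path: whoever can
    # impersonate the operator on that channel can install a rule whose
    # acceptance windows happen to fit a lead slug.
    rules["bogus"] = {"mass_g": (10.0, 12.0), "diameter_mm": (22.0, 24.0)}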

------------------------------

Date:     Fri, 29 May 92 9:45:30 EDT
From: Brinton Cooper <[email protected]>
Subject:  Re: Not enough trained computer experts (Marshall, RISKS-13.50)

fc <FBCohen@dockmaster> writes:

> Over the last several years, I have applied for positions in over 100
> universities, and the universal response is that protection is not of interest
> to the university community.  This despite the recent report from the US
> National Research Council that calls for increased university research in the
> field.
>
> You cannot find a single US university (and only a few outside the US) with
> more than 2 computer security experts on the same faculty.  You also cannot
> find a university in the US where more than one person was hired with the prior
> knowledge that they have computer security interest (again I am talking about
> faculty positions).  With the educators woefully ignorant, we can only expect
> the students to be equally ignorant.

We then conducted the following dialog; fc suggested that I submit it to Risks.

BRINT:
       I'm interested in your comments about the paucity of university
faculty who are knowledgeable in computer security.  The issue is an old
one, but I've never fully come to grips with it.

       Teaching a third-year course in operating systems... I introduce
concepts of "protection" in various units of the course, especially
those involving memory sharing and file system design.  Also, to
facilitate handling of programming projects, I try to be sure that the
students understand the Unix file permissions mechanism.

       However, I've not worked out for myself just how far I should go
beyond this.  Some of these students will be system administrators or
systems programmers. Others will be managers responsible for policies
regarding computer installation and "security."  Some others, however,
may be irresponsible types, looking for a laugh or a way to screw their
fellow human.  So, do I teach them about the historical security holes
in Unix systems?  Do I tell them that if they have privileges on their
workstation, many of these privileges can be applied on the fileserver
if they have an account there?  Do I explain how the famous RTM Internet
Worm exploited bugs?  Do I cover how one can capture keystrokes in the
X-windows environment?

       In short, do I do more harm than good or vice versa?  I've heard
most of the standard arguments and tend to feel that what is known
should be passed on, but I'm interested to hear your views.

fc <[email protected]>:

First, I think that your point should be placed on Risks so that all of
the others on that list can understand your concerns.  More importantly,
this issue is fundamental to many people's views of protection issues.

I will briefly explain my feelings on this matter here, but for much
more thorough coverage, you should probably give me a call - [I've
deleted his number; he didn't specifically say that I might post it.]

         The basic issue is what we should teach about protection to
students in light of the fact that some might use it for evil.  I think that
there is a lot of information on how to attack systems available to
anyone who wants it, but there is too little information on defenses for
those who wish to protect themselves.  In light of this imbalance, it is
clear what to do, if you believe it.

         When I teach people about protection, I don't necessarily
discuss all of the attacks in order to point out defenses.  A
fundamental thing that most people don't understand is that protection
is not an exercise in attack and defense, but rather an exercise in
design for integrity, privacy, availability, and accountability.  You
could present it in this positive light without talking about attacks at
all.  In fault-tolerant computing we discuss failures, often leaving out
the mechanism by which the fault occurs.  We still want continued
operation, and can describe it in terms of probabilities of events
regardless of causes.

         In terms of teaching about how to break into systems, I think
it is important to every user to understand how people guess passwords,
if only to help them understand how to choose their own passwords.  I
think that they should understand the concept of a Trojan horse, and
that when they run another user's program, they can have all of their
protection bits changed (etc.).  If you don't know this, you might (and
most do) assume that setting a protection bit in Unix makes you safe.
So we teach people a lot of assumptions about protection (if only by
telling them that setting a bit prevents something from happening), and
these assumptions are almost always wrong, and as a result, they are
vulnerable.  Perhaps you could simply be very careful about the
assumptions you leave your students with.  (e.g.  ANY program can set
the protection bits of any files owned by the user running it - for
example, the "chmod program" provides a handy user interface to this,
while other programs set protection bits without the user getting
involved.)  Some bright student will likely ask whether this means that
their protection bits can be set whenever they run another user's
program, and you should answer "yes!" (note the emphasis).  Take that
chance to also point out that this is why you don't run their programs
from your account - after all, we wouldn't want their grades
accidentally protected so that others could read (or modify) them.  (You
do take this precaution, yes?)
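
A small sketch in Python on Unix makes the point concrete (purely
illustrative; the function and filename are invented): any program a user runs
carries that user's privileges, so it can quietly change the protection bits
on files the user owns.

    # Hedged sketch of the Trojan-horse point above, not a real attack.
    import os, stat

    def innocent_looking_utility(path):
        # ...does whatever it advertises, and then, because it runs with the
        # invoking user's privileges, silently makes the file world-writable.
        os.chmod(path, stat.S_IRUSR | stat.S_IWUSR |
                       stat.S_IRGRP | stat.S_IWGRP |
                       stat.S_IROTH | stat.S_IWOTH)

    # If an instructor ran a student's program from the account that owns the
    # grade file, the program could do exactly this to that file.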

         Having taught protection for many years in universities, I
have faced these moral issues on many occasions.  I can say that the
delicacy of modern systems is a surprise to my students, and they walk
away a bit more careful than when they entered, which I think is a good
thing.

         When you teach about operating systems, do you point out that
the internal OS tables are generally not allocated the "worst case"
amount of space they may require?  You know that in practice, designing
systems in this way is commonplace because the total resources is not
sufficient to handle worst case, but we don't want to artificialy limit
things.  This is a common cause of protection problems - leaving
temporary files around because of the process failures that result,
overrunning available memory and overwriting some OS code, not being
able to handle a critical interrupt, etc.  Most of the programmers I
know don't check every possible return code on each OS call and act
appropriately by undoing everything they should when the failure takes
place, and no current language I am aware of provides the means to
handle these issues appropriately.  None of these things require that
you discuss attacks, but rather are simple realities your students
should understand when writing their code, along with the knowledge that
most programmers don't properly deal with these things.
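
A brief sketch in Python of the discipline being described (the particular
calls are incidental): check every operation that can fail, and undo what was
done rather than leaving temporary files and half-written state behind.

    # Hedged illustration of defensive error handling, not a prescription.
    import os, tempfile

    def write_report(data: bytes) -> str:
        fd, path = tempfile.mkstemp()          # can fail (no space, no perms)
        try:
            written = os.write(fd, data)
            if written != len(data):           # a short write is still a failure
                raise OSError("short write to %s" % path)
            os.fsync(fd)                       # find out *now* if it can't flush
        except OSError:
            os.unlink(path)                    # undo: don't leave temp files around
            raise
        finally:
            os.close(fd)
        return path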

         Well, my list goes on for books' worth - perhaps you should get
some of them for your own interest.  Call me for details.  FC

------------------------------

End of RISKS-FORUM Digest 13.54
************************