Subject: RISKS DIGEST 12.19
REPLY-TO: [email protected]

RISKS-LIST: RISKS-FORUM Digest  Wednesday 28 August 1991  Volume 12 : Issue 19

       FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

 Contents:
Phone Fraud (Ed Andrews summarized by PGN)
(Assumed) False Alarm at Nuclear Plant (Rodney Hoffman)
O, Oh, what a difficult name (Gene Spafford)
Programs Pester Public Policy People (Jeffrey Sorensen)
Re: 13 Aug 91 NY Nine Mile Point 2 Nuclear Plant Incident (Steve Bellovin)
Re: Ada beats C++ according to the DoD (Brinton Cooper)
Re: Unwarranted equivalence assumptions (Brinton Cooper)
Re: TCAS sees ghosts (Steve Jay, Lars-Henrik Eriksson, Keith Hanlan)
pugwash.dcs.ed.ac.uk goes nova too (John Butler)
NIST High Integrity Lecture Series: talk by Laszlo Belady (Laura Strigel)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to [email protected], with relevant, substantive
"Subject:" line.  Others ignored!  REQUESTS to [email protected].  For
vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 12, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*<CR>" gives directory; "bye<CR>" logs out.
The COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.
ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues
of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Wed, 28 Aug 91 10:27:41 PDT
From: "Peter G. Neumann" <[email protected]>
Subject: Phone Fraud

Abstracted by PGN from an excellent article in the New York Times (28Aug91),
Theft of Telephone Service from Corporations is Surging, by Edmund L. Andrews

Telephone fraud is reaching epidemic proportions, with some companies getting
billed for hundreds of thousands of dollars in bogus calls.  Stolen credit
cards and line tapping are old techniques.  The new craze involves cracking
into switches and PBXs (private branch exchanges).

 ``It is by far the largest segment of communications fraud,'' said Rami
 Abuhamdeh, an independent consultant and until recently executive director of
 the Communications Fraud Control Association in McLean, Va. ``You have all
 this equipment just waiting to answer your calls, and it is being run by
 people who are not in the business of securing telecommunications.''

 Mitsubishi International Corp. reported losing $430,000 last summer, mostly
 from calls to Egypt and Pakistan. Procter & Gamble Co. lost $300,000 in 1988.
 The New York City Human Resources Administration lost $529,000 in 1987. And
 the Secret Service, which investigates such telephone crime, says it is now
 receiving three to four formal complaints every week, and is adding more
 telephone specialists.

 In its only ruling on the issue thus far, the Federal Communications
 Commission decided in May that the long-distance carrier was entitled to
 collect the bill for illegal calls from the company that was victimized.
 In the closely watched Mitsubishi case filed in June, the company sued AT&T
 for $10 million in the U.S. District Court in Manhattan, arguing that AT&T
 had not only made the switching equipment through which outsiders entered
 Mitsubishi's phone system, but had also been paid to maintain it.

 For smaller companies, with fewer resources than Mitsubishi, the problems can
 be financially overwhelming. For example, WRL Group, a small software
 development company in Arlington, Va., found itself charged for 5,470 calls
 it did not make this spring after it installed a toll-free ``800'' telephone
 number and a voice mail system to receive incoming calls.  Within three
 weeks, the intruders had run up a bill of $106,776 to US Sprint, a United
 Telecommunications unit.

The article goes on to document the experiences of WRL, pirate call-sell phone
operations, voice-mail cracking, etc., familiar to RISKS readers, and discusses
the possibilities of blocking calls by area, shutting down out of hours,
verifying callers (!), monitoring for unusual traffic, etc.
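The "monitoring for unusual traffic" defense can be sketched in a few lines.
This is a hypothetical illustration only: the record fields, blocked prefix,
business hours, and duration threshold below are invented, not taken from the
article.

```python
# Hypothetical sketch of the "monitor for unusual traffic" defense.
# The record layout, prefix, hours, and threshold are invented for
# illustration; real PBX toll-fraud monitors are far more elaborate.
from datetime import datetime

BLOCKED_PREFIXES = ("011",)     # international direct-dial prefix (N. America)
BUSINESS_HOURS = range(8, 18)   # 08:00-17:59 local time

def suspicious(call):
    """Flag a call record for review if it matches any fraud heuristic."""
    when = datetime.fromisoformat(call["time"])
    if call["number"].startswith(BLOCKED_PREFIXES):
        return True                      # blocked calling area
    if when.hour not in BUSINESS_HOURS:
        return True                      # out-of-hours activity
    return call["minutes"] > 60          # unusually long call

calls = [
    {"time": "1991-08-28T14:05:00", "number": "5551234",       "minutes": 3},
    {"time": "1991-08-28T02:17:00", "number": "0112012345678", "minutes": 95},
]
print([suspicious(c) for c in calls])    # [False, True]
```

Even heuristics this crude would have caught the WRL case: thousands of
overseas calls through an ``800'' number, largely outside business hours.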

 In the past, long-distance carriers bore most of the cost, since the thefts
 were attributed to weaknesses in their networks. But now, the phone companies
 are arguing that the customers should be liable for the cost of the calls,
 because they failed to take proper security precautions on their equipment.

[...]

 Consumertronics, a mail order company in Alamogordo, N.M., sells brochures
 for $29 that describe the general principles of voice mail hacking and the
 particular weaknesses of different models.  Included in the brochure is a
 list of ``800'' numbers along with the kind of voice mail systems to which they
 are connected.  ``It's for educational purposes,'' said the company's owner,
 John Williams, adding that he accepts Mastercard and Visa. Similar insights
 can be obtained from 2600 Magazine, a quarterly publication devoted to
 telephone hacking that is published in Middle Island, N.Y.

It's a good article for those of you whose telephone systems are being cracked
(but good for crackers as well!)...

------------------------------

Date:   Wed, 28 Aug 1991 08:44:37 PDT
From: Rodney Hoffman <[email protected]>
Subject: (Assumed) False Alarm at Nuclear Plant

      NUCLEAR PLANT'S SIREN WAILS BY MISTAKE

San Juan Capistrano - A high-decibel siren warning of a major accident at San
Onofre nuclear power plant sounded by mistake Sunday, Southern California
Edison Co. reported.... [T]he siren ... went off during the late afternoon and
was reported to Edison about 5:30 pm by Orange County emergency management
workers.

By the time a repair crew reached the siren, it had stopped operating. [!]
..[T]he warning device was disconnected [!!] and an investigation begun into
why it went off.  Edison, which operates the San Onofre Nuclear Generating
Station, received no complaints from the public about the mishap.... The siren,
which warns of a nuclear accident, is part of a network of 50 such devices in
[nearby towns].

  -   -   -   -   -   -   -

This was a brief item in the 26 August 1991 `Los Angeles Times.'  No explicit
mention is made of computers, but this seems relevant to RISKS on several
counts.  Unfortunately, the story does not tell why Edison was and remains so
certain that it was a false alarm; I presume that the other 49 sirens were
silent.  As for "no complaints from the public," most people probably assumed
it was just a particularly obnoxious car alarm.  Perhaps the sirens should be
replaced by voice, shouting, "Major nuclear accident underway."  That would
probably get some public attention!

------------------------------

Date: Wed, 28 Aug 91 11:10:06 EST
From: Gene Spafford <[email protected]>
Subject: O, Oh, what a difficult name

From: The Associated Press (and sharply abridged by PGN)

WASHINGTON -- Oh, woe is O.

       For months, Stephen O has been hassled by credit card companies.  It's
 not because he's a bad credit risk.  It's simply that his last name is too
 short.  Twice the 23-year-old South Korean native has applied for new credit
 cards, and twice he's been turned down.  The banks say their computers cannot
 recognize a single-letter last name.  His automobile finance company says
 he's "S.O. Stephen."  The computer at the Virginia Division of Motor Vehicles
 says he's OO, which stymied his efforts to get car insurance for a year.  To
 make matters worse, the computer at the Credit Bureau Inc., which furnishes
 merchants with individual credit references, insisted that O was nobody, even
 though he has carried American Express and Visa cards since he was a college
 student.  Instead, the credit bureau listed him as "Ostephen," which confused
 everybody.

[... He has now changed his name to Oh. ...]

 Since he was a kid, being an O has been both embarrassing and amusing.  "I
 always hated the first day of school," he said.  "The teacher would call the
 roll through the M's and N's and then stumble over the O.  `Is this a
 typographical error?' he'd ask, and I'd say, `That's me'."  [...]

                  [I guess he did not read The Story of O, by Pauline Reage.
                  But, how about "O'O"?  Computers would love it!  PGN]
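The underlying bug is over-strict input validation.  A minimal, hypothetical
sketch (no actual credit-bureau code was published) of a rule that would
declare Mr. O "nobody", alongside the obvious fix:

```python
import re

def valid_surname_naive(name):
    # Hypothetical buggy rule: insists on at least two letters,
    # so the surname "O" is rejected as no name at all.
    return re.fullmatch(r"[A-Za-z]{2,}", name) is not None

def valid_surname_fixed(name):
    # Any non-empty run of letters is a legal surname.
    return re.fullmatch(r"[A-Za-z]+", name) is not None

print(valid_surname_naive("O"))   # False -- Stephen O becomes "nobody"
print(valid_surname_fixed("O"))   # True
```

Real-world name handling is subtler still, of course: hyphens, apostrophes,
and non-Latin scripts all trip up naive validators.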

------------------------------

Date: Wed, 28 Aug 91 13:35:29 EDT
From: [email protected] (Jeffrey Sorensen)
Subject: Programs Pester Public Policy People

In the Aug 24, 1991 issue of _Science News_ p. 127 "Faulting the Numbers":

   A brief article discusses the topic of the accuracy of computer models when
   used as the basis for changes in social and tax programs.  "A National
   Research Council panel warns that these estimates are...of unknown quality
   and may be seriously flawed."

   The problem is a lack of objective measures for assessing the reliability
   and validity of the resulting figures.  One example cited is the
   underestimate of the popularity of individual retirement accounts, which
   thus led to an underestimate of the subsequent revenue lost.

   "Arguing that detailed simulations...are important to the policy process,
   the panel strongly urges the government to allocate sufficient resources
   to improve the quality of current computer models used for making cost
   estimates."

Whether this is a case of the government expanding to meet the needs of an
expanding government is left as an exercise for the reader.  The problem of
bad statistics used as the basis for bad decisions has been with us a lot
longer than computers have.  For some good examples check out:

   _Systems Analysis in Public Policy: A Critique_ by Ida R. Hoos
       Berkeley : University of California Press, 1983.

Also in Science News:

* "Greenhouse Snow: Melting the preconceptions" describes the differing
    outcomes of computer models that raise questions about the feedback effect
    of melting snow:
      more heat -> less snow -> darker land -> more heat -> less snow...
    may actually turn out to be
      more heat -> less snow -> more clouds -> a little cooler -> more snow
    or even
      more heat -> less snow -> more radiation to space -> a little cooling

* "Phone glitches and software bugs" says that the DSC equipment responsible
    for the June phone problems suffered from three faulty lines of code in
    a program with several million lines. (roughly a 1E-6 error rate :-)

* "String and springs net mechanical surprise" gives details of a problem that
    has to be seen to be believed: a discussion of counterintuitive problems,
    including the Braess paradox, which demonstrates that adding roads to a
    congested network can actually increase the amount of congestion.  There
    is also an electrical equivalent, such that "when you add extra
    current-carrying paths, less current flows."

* And, the cover story, the risk-free :-) buckyballs and fullerenes, with
    about four and a half pages dedicated to research in these new forms of
    carbon.
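The Braess paradox mentioned above is easy to verify numerically.  The sketch
below uses the standard textbook network (4000 drivers and the classic cost
functions), not numbers from the Science News piece:

```python
# Classic Braess network: 4000 drivers travel from S to T.
# Route 1: S -> M1 (x/100 min for x drivers), then M1 -> T (fixed 45 min).
# Route 2: S -> M2 (fixed 45 min), then M2 -> T (x/100 min).
DRIVERS = 4000

def route_costs(on_route_1):
    """Per-driver travel time on each route for a given split."""
    r1 = on_route_1 / 100 + 45
    r2 = 45 + (DRIVERS - on_route_1) / 100
    return r1, r2

# Equilibrium without the extra road: an even split, 65 minutes per driver.
r1, r2 = route_costs(2000)
print(r1, r2)  # 65.0 65.0

# Add a free shortcut M1 -> M2.  The path S -> M1 -> M2 -> T uses both
# flow-dependent edges; since x/100 <= 40 < 45 for x <= 4000, it always
# beats a fixed-cost edge, so every self-interested driver defects to it.
selfish_cost = DRIVERS / 100 + DRIVERS / 100
print(selfish_cost)  # 80.0 -- everyone's trip is 15 minutes longer
```

Removing the "improvement" restores the 65-minute equilibrium, which is the
whole point of the paradox.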

    Jeff Sorensen     [email protected]     (518) 276-8202

------------------------------

Date: Tue, 27 Aug 91 20:27:49 EDT
From: [email protected]
Subject: Re: 13 Aug 91 NY Nine Mile Point 2 Nuclear Plant Incident Reassessed

An AP wire story indicates that the problem was dead batteries in the backup
power supply.  The NRC has no standards for battery replacement, the
manufacturer says change them every four years, and these were six years old.
Utility officials blame unclear manuals, and say that the backup systems
weren't wired the way the manual said they should be.

Also worth noting is that the batteries weren't inspected on schedule.
However, the inspection wouldn't have measured their charge level in any event.
Some inspection procedure...

------------------------------

Date:     Wed, 28 Aug 91 10:07:49 EDT
From: Brinton Cooper <[email protected]>
Subject:  Re: Ada beats C++ according to the DoD (Stoffel, RISKS-12.18)

John F Stoffel reports on a set of US DoD studies purporting to show that "...
the DoD mandated programming language Ada is superior in a variety of ways to
its newer rival C++..."

Of course, consider who conducted the studies:  TRW and CMU's Software
Engineering Institute, each of which has, no doubt, obtained millions
of dollars in DoD contracts associated with the use and promotion of
Ada.  "Can you say conflict-of-interest, boys and girls?"

"CTA Inc. looked at the productivity of the two languages based on actual
projects and found Ada programmers on average produced 210 source lines per
month while C++ programmers turned out 187 lines."

Does this mean "More code is better code?"  Perhaps it shows that Ada is less
expressive than C++ and requires more source code to say the same thing.
                                                                        _Brint

------------------------------

Date:     Wed, 28 Aug 91 10:12:34 EDT
From: Brinton Cooper <[email protected]>
Subject:  Re: Unwarranted equivalence assumptions (Koenig, RISKS-12.18)

Andrew Koenig discusses four cases of "Unwarranted equivalence assumptions."

His arguments make a lot of sense, but one is flawed.  His fourth example is:

       `I'm sorry, Sir; but even if you are indeed the Ambassador, we
        can't let you into the Embassy building without a proper pass.'

He argues,

 "Perhaps the Ambassador was appointed only a week ago, has been outside the
 country since then, and therefore hasn't had the opportunity to pick up the
 pass that has been waiting for him."

Suppose the denial had been by computer:

        "Incorrect password:  Login aborted."

Would he argue that this might have been the *genuine* user who had forgotten
her password and that the "system" should have known better because the login
was from a site known to her office?  In fact, both my hypothetical case and
that of the newly appointed Ambassador Strauss are examples of
*authentication* systems.  They must be left in place.  Even if the State
Department guard *knew* Ambassador Strauss personally, it was proper to deny
him admission without a building pass.  Who knows why the pass had not yet
been issued?

Did anyone ever hear of Clark Clifford?
                                                 _Brint

------------------------------

Date: Tue, 27 Aug 91 16:53:05 PDT
From: [email protected] (Steve Jay)
Subject: Re: TCAS sees ghosts (Horning, RISKS 12.16)

> Wahag defends Collins' quality control procedures, which were approved by
> a team of FAA software experts.  "We had a simple human error where an
> engineer misclassified the changes in the software," he told SPECTRUM.
> "It didn't show up in our testing because one of the essential elements was
> absent: you have to have many, many TCAS-equipped airplanes in the sky,"
> as in the high-traffic-density areas where the ghost problem appeared.
>
> To prevent similar omissions, Collins now requires that a committee of
> software engineers review changes before a program is released.  "More than
> one pair of eyes must review these things and make a decision," Wahag said.

Am I the only one who sees a non sequitur here?  "We didn't catch the bug
because we didn't test it in realistic conditions, so next time we'll look
at it harder before we release it."

Seems like some folks don't learn real fast.

Steve Jay, Ultra Network Technologies, 101 Dagget Drive, San Jose, CA 95134
[email protected]  ...ames!ultra!shj                        (408) 922-0100 x130

------------------------------

Date: Wed, 28 Aug 91 09:06:36 +0200
From:  Lars-Henrik Eriksson  <[email protected]>
Subject: Re: FAA seems misled (Re: TCAS Sees Ghosts)

>Not quite.  TCAS is a backup system. It's a redundant backup.  Primary
>responsibility for "see and avoid" is with the pilot (FAR part 91).  The air
>traffic control system, with all its eyes, ears, and radar exists to help the
>pilot avoid situations that can develop into genuine emergencies.

The "see and avoid" responsibility applies only in visual flight conditions.
In instrument flight conditions, the pilot doesn't have any such
responsibility (obviously, since he cannot see much outside his own aircraft).

Also, it is a fact that "see and avoid" doesn't work well with aircraft flying
at high speed.  Many investigations of midair collisions have shown that
although the pilots had a theoretical possibility of seeing each other's
aircraft in time, the practical possibility was very slight.

>... TCAS and air traffic control are at crossed purposes.  TCAS gives
>authority to the pilot, and ATC takes it away.

ATC authorities (both FAA and those of other countries) have the legitimate
concern that pilots will react unnecessarily to TCAS alerts and cause other
incidents by doing unauthorised deviations.

I understand that the TCAS technology and the procedures applied when a TCAS
alert occurs have developed to a point where this risk is at an acceptable
level.

Lars-Henrik Eriksson, Swedish Institute of Computer Science, Box 1263, S-164 28
KISTA, SWEDEN           Phone (intn'l): +46 8 752 15 09   Internet: [email protected]

------------------------------

Date:       28 Aug 91 11:17:00 EDT
From: Keith (K.P.) Hanlan <[email protected]>
Subject:    Risks of developer testing (was TCAS sees ghosts)

       The article on TCAS failure quoted by Jim Horning (RISKS 12.16)
illustrates a deficient software development process:

  "The problem arose in the course of testing, because Collins engineers had
   temporarily disabled the program's range correlation function--a few brief
   lines that compare a transponder's current response with previous ones and
   discard any intended for other aircraft.  Without this filter, the system
   can misinterpret a response as coming from a fast-approaching airplane."

       "After testing the systems, Collins shipped them to airline customers
   without re-enabling the range correlation."

The flaw here is that the same group is doing development, testing, and
"manufacturing" (loadbuilding).  I'd suggest that if the CASE tool I work on
uses independent testers and loadbuilders, an aviation safety device merits
similar precautions.  Designers must of course do their own testing, but the
code they submit to loadbuilders should be intended for production.  The
independent testers should work only with this "production" code, and the
product should be built only from the loadbuilder's software.

Thus even if the designer accidentally submits test code, the testers should
detect the flaw and fail the software.  And similarly, if the testers wish to
insert faults, those faults cannot get back into the production code.

On a related note, inserting faults by changing code is never a good idea
and this mistake clearly illustrates why.

Let me add that when I refer to "independent" testers, I mean physically
disjoint human beings. I, as a developer with intimate knowledge of the
inner workings, *know* that if this test-case works then all these others
will work as well.  That is, of course, until Michelle comes along with her
cunning pathological special case.  This happens time and time again.

Finally, by "loadbuilding" I mean the activity of configuration management,
compiling, linking, and installation.  My apologies for using terms that may
only have local meaning.

Keith Hanlan  [email protected]  Bell-Northern Research, Ottawa, Canada 613-765-4645

------------------------------

Date: Wed, 28 Aug 91 12:57:36 bst
From: [email protected]
Subject: pugwash.dcs.ed.ac.uk goes nova too

Followup to posting in RISKS DIGEST 12.17
from [email protected] (Joe Dellinger) 1 Aug 91

Well, I guess we've located the newest way of keeping ourselves warm on winter
nights, or at least gassing ourselves so we don't notice.

We have (had) a Sun 3/110 with a Hitachi 15" LC monitor in a lab here.  A week
or so ago the occupants of the lab evacuated hastily, complaining of a strong
smell, watering eyes, sore throat, etc.  I would describe the smell as similar
to the sweetish smell you get around a badly ventilated clothing dry
cleaner's, and would guess a halocarbon of some sort.  We instantly blamed the
air-conditioning units and went looking for coolant leaks.  By this time the
security services had been called, and they in turn called in the Fire
Brigade, who threw us all out and did a thorough survey in full chemical
isolation gear and breathing apparatus.  It's not easy to locate the source of
a smell in full gear, and so it was well into the afternoon before someone
noticed this monitor was still on and we traced it.  If the smell had been the
usual yukky smell you get off any torched electronics we'd have got it
instantly - this was a new one on us.  The culprit was what looked like a
toroidal transformer on the EHT side of the monitor, which was sitting in a
little puddle of plasticised slag.

We have no idea what we've been breathing but the city Medical Officer has
requested further tests and we are sending him an intact monitor plus the
slagged transformer.

This incident is still in progress here, as we have yet to have any extensive
talks with Sun, but I'm posting this meanwhile as it appears there is a real
safety risk.

John Butler, Computer Science, The University of Edinburgh, Kings Buildings,
Edinburgh EH9 3JZ UK        Telephone: +44-(0)31-650-5181 <[email protected]>

------------------------------

Date: Tue, 27 Aug 91 11:26:58 EDT
From: Laura Strigel <[email protected]>
Subject: NIST High Integrity Lecture Series: talk by Laszlo Belady

             NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
                       COMPUTER SYSTEMS LABORATORY
                LECTURE SERIES ON HIGH INTEGRITY SYSTEMS

            The Engineering of Software for High Integrity

                            Laszlo A. Belady
                 Chairman and Director of the Laboratory
                Mitsubishi Electric Research Laboratories


          October 11, 1991, 2:00 p.m., NIST Green Auditorium


Software is now paramount in determining the qualities of man-made and
man-machine systems.  Problems of integrated, networked information systems and
of machinery in which software is the significant component are particularly
acute.  The design of these software-rich systems must be based on combined
expertise in computers and in the application domain.  This leads to design by
teams of many experts whose efforts also need the support of information
technology.  A few emerging solutions, some still in the research stage, will
be discussed, and the importance of technology infusion and education
emphasized.

The goal of the lecture series, open to the public free of charge, is to alert
federal and industry managers, technical staff, and users to the issues they
must be concerned with in the management of valuable information resources.

FUTURE LECTURES:

November 8, 1991:  Early Error Prediction:  Better Error Management and
                       Improved Process Control; Dr. John Gaffney, Manager,
                       Measurement and Economic Modeling, Software Productivity
                       Consortium
December 3, 1991:  Toward a Routine Practice for the Engineering of Software;
                       Dr. Mary Shaw, Professor of Computer Science, Carnegie
                       Mellon University

                              For further information contact:
               Dolores Wallace (301) 975-3340 or Laura Strigel (301) 975-5248.

------------------------------

End of RISKS-FORUM Digest 12.19
************************