RISKS-LIST: RISKS-FORUM Digest  Monday 17 May 1993  Volume 14 : Issue 62

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Cut and Paste risks (Elizabeth Zwicky)
  CHI & the Color-blind? (Vannevar Yu)
  Update on risks in semiconductor manufacturing (Rob Horn)
  Re: Epilepsy and video games (Edwin Culver)
  Don't Depend on makedepend (Dave Wortman)
  makedepend problem - a real world example (Michael Turok via Dave Wortman)
  Business Week: ADP Flubs (Bob Frankston)
  Mobile Phones and airbags (was: Opel Corsa ...) (Olivier MJ Crepin-Leblond)
  Re: Denning on NIST/NSA Revelations (Marc Rotenberg, Bill Murray)
  NIST Answers to Jim Bidzos' Questions (Ed Roback via Jim Bidzos) [longish]
The RISKS Forum is a moderated digest discussing risks; comp.risks is its
Usenet counterpart.  Undigestifiers are available throughout the Internet,
but not from RISKS.  Contributions should be relevant, sound, in good taste,
objective, cogent, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to [email protected], with appropriate, substantive
"Subject:" line.  Others may be ignored!  Contributions will not be ACKed.
The load is too great.  **PLEASE** INCLUDE YOUR NAME & INTERNET FROM: ADDRESS,
especially .UUCP folks.  REQUESTS please to [email protected].

Vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 14, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*<CR>" gives directory; "bye<CR>" logs out.
The COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.

For information regarding delivery of RISKS by FAX, phone 310-455-9300
(or send FAX to RISKS at 310-455-2364, or EMail to [email protected]).

ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues
of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.
----------------------------------------------------------------------
Date: Mon, 17 May 93 09:51:07 -0700
From: Elizabeth Zwicky <[email protected]>
Subject: Cut and Paste risks
A few years ago, one of our main servers went down, and failed to reboot.
Investigation showed that it had been halted with the instruction to reboot
on a kernel named "imagen.df", which naturally had not worked well.  Further
investigation showed that only our most junior system administrator had been
logged into it at the time.  She was adamant, if not frantic, in her denial
of having done any such thing.  Fortunately, she had been logged into it from
another machine in an xterm with a scrollbar, and we were able to scroll it
back to the point before the machine went down, where a wide variety of
peculiar error messages became visible.  Messages like "dumpdates: permission
denied", and "dump: bad key 'i'" appeared before a final "fastboot" took
effect.  In horror, she said "All I did was drop my pencil and when I
straightened up it was gone!"  It seems that when she leaned over to pick up
the pencil, her elbow cut and pasted an ls of /etc at the prompt.  Fortunately
"clri" is very picky about its syntax, or we might have had a really
interesting mess.  As it happened, "fastboot" was the first command which was
willing to take the remainder of its line as valid arguments.

We wrote this off as a once-in-a-lifetime event until Friday, when another
server went down suddenly.  This one came back OK, and we didn't have to
search for the culprit, a relatively senior administrator who turned himself
in shamefacedly: "I hit the wrong mouse button.  It pasted a fastboot.  It's
all my fault."

I've never liked the X method of cutting and pasting much, but I'm beginning
to feel actively hostile towards it.
Elizabeth D. Zwicky
[email protected]
------------------------------
Date: Sat, 15 May 93 18:17:57 EDT
From: Vannevar Yu <[email protected]>
Subject: CHI & the Color-blind?
In the May 1993 issue of Credit Card Management (vol. 6, no. 2, New York,
NY), a special report on "Collections Technology" utilizing color graphic
displays appeared on p. 66:

   "A black icon facing left to right means a call was placed, but no
    contact was made.  A red icon facing left to right means a call was
    placed and contact was made, but a promise to pay was not received.
    A green icon facing left to right means a promise to pay was arranged.
    An icon facing right to left means the debtor called the collector."

Given that the use of computers with fancy color displays will become more
common with time, how do we ensure that color-blind people can use these
displays without any confusion?  Although one of my color-blind friends can
tell that there is a difference between, say, red and green, a simple change
in monitor contrast/brightness levels (or being assigned to a different
terminal on a particular day) would probably be enough to throw him off.
With disabilities-related legislation in the United States in place, such
issues regarding computer-human interaction will definitely get more
attention.
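
One common remedy is redundant coding: color should never be the only
carrier of a distinction.  The fragment below is my own minimal sketch (the
names and shapes are invented, not taken from the magazine article) of a
status table that pairs each color with a distinct shape and a text label,
so the display degrades gracefully on a monochrome or badly adjusted
monitor:

    /* Hypothetical status table: each state is distinguishable by shape
       and label even when the colors cannot be told apart.  Entries are
       in the same order as the enum. */
    enum call_status { NO_CONTACT, CONTACT_NO_PROMISE, PROMISE_TO_PAY,
                       DEBTOR_CALLED };

    struct icon {
        const char *color;   /* cue #1: color, as in the described system */
        const char *shape;   /* cue #2: distinct even without color */
        const char *label;   /* cue #3: text drawn next to the icon */
    };

    static const struct icon icons[] = {
        { "black", "right arrow, hollow",  "call placed, no contact" },
        { "red",   "right arrow, slashed", "contact, no promise"     },
        { "green", "right arrow, checked", "promise to pay"          },
        { "black", "left arrow",           "debtor called collector" }
    };

Any one of the three cues identifies the state on its own, so a change in
contrast, an unfamiliar terminal, or a color-blind operator costs nothing.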
------------------------------
Date: 14 May 1993 21:27:51 -0400 (EDT)
From: horn%[email protected] (rob horn)
Subject: Update on risks in semiconductor manufacturing
The miscarriage risk information on semiconductor manufacturing is based on
multiple epidemiological studies.  The largest of these was by UC Davis and
tracked 15,000 employees at 14 plants.  The studies examined effects from
microwave exposure, low-frequency magnetic fields, solvents, paints, CRT
usage, and electrostatic field exposure.  In one portion of the
manufacturing process a solvent-related problem was found.  No other
statistically significant effects were found.

The solvent risk is from the family of solvents known as glycol ethers,
with particular risk from the ethylene glycol ethers.  These solvents are
used in the photolithography process.  This is the stage where the
semiconductor is coated with a photoresist layer, the layer is exposed
through a mask, and then partially removed.  Most photolithography
processes involve solvents of various sorts.  The study isolated the effect
to sites using glycol ethers.  The other solvents may be dangerous, but the
safety precautions appear to be adequate.

The risk magnitude is an increase of between 40% and 100% in the
miscarriage rate for workers in photolithography where ethylene glycol
ethers are used.

Industry reaction is mixed.  There is relief that the other safety measures
are effective; the ethers are not the most dangerous chemicals used, and
there are far more toxic chemicals involved elsewhere.  There is
disappointment that the safety measures in place in photolithography are
inadequate.  There was also relief that suspected hazards from CRT usage,
EMF, and ESF did not cause an increase in the miscarriage rate.

The semiconductor industry is already several years into major overhauls
and redesigns to reduce health, safety, and environmental hazards.  All of
the glycol ethers were already targets for gradual elimination as health
hazards and significant environmental hazards.  The disposal cost for one
gallon of such waste now exceeds $50.  This study will accelerate their
elimination.  It will have a significant economic impact because the
alternative processes with low health and environmental impacts are very
different.  Often it is easier to close the old factory and build a new
one.  When a change this big is made, there are often associated site
relocations, job losses, etc.

Health hazards in the electronics manufacturing industry are not new.  The
toxic chemical hazards from printed circuit board manufacturing and
soldering stations are earlier examples of significant dangers.
Rob Horn
[email protected]
------------------------------
Date: Fri, 14 May 93 11:07:15 EDT
From: [email protected] (Edwin Culver)
Subject: Re: Epilepsy and video games
Larry Hunter and others were asking about seizures induced by video games.
Not being a neurologist, I wonder if these are similar to those caused by
"photic stimulation", which has been implicated in some aircraft accidents.

Some people, when exposed to flickering or flashing lights, will have
seizures quite similar to epileptic seizures.  In aircraft, this can be a
problem for general aviation pilots, who look through the propeller disk.
During most of the flight, the flicker frequency is too high to cause
problems, but during a landing, especially into the setting sun, the
propeller may cause flickering sufficient to induce a seizure.  There are,
apparently, degrees of severity: some people will seize from a single flash
of the right duration.
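
To see why the landing phase is the worrisome one, here is a
back-of-the-envelope sketch.  The RPM figures are ballpark numbers for a
two-blade, fixed-pitch propeller, and the "roughly 3-30 Hz" band is the
range commonly quoted for photosensitive reactions; treat all of them as
illustrative assumptions rather than data from Culver's note:

    /* Blade-passage flicker: flashes per second = (RPM / 60) * blades.
       All numbers here are illustrative, not authoritative. */
    #include <stdio.h>

    static double flicker_hz(double rpm, int blades)
    {
        return rpm / 60.0 * blades;
    }

    int main(void)
    {
        printf("cruise,   2400 RPM: %.0f Hz\n", flicker_hz(2400.0, 2));
        printf("approach,  900 RPM: %.0f Hz\n", flicker_hz(900.0, 2));
        /* Cruise gives about 80 Hz, well above the flicker rates usually
           associated with photosensitive reactions (very roughly 3-30 Hz);
           a low-RPM approach into a setting sun drops to about 30 Hz,
           right at the edge of that range. */
        return 0;
    }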
Edwin M. Culver
[email protected] (203) 741-8736
------------------------------
Date: Mon, 17 May 1993 14:09:28 -0400
From: Dave Wortman <[email protected]>
Subject: Don't Depend on makedepend
Context: makedepend is a program that processes a set of C program files
and determines interdependencies among those files.  It is frequently used
in large software systems to (attempt to) guarantee that a Makefile
correctly describes the dependencies in a piece of software.  There is often
a software building step called "make depend" that invokes the makedepend
program.  Many users of makedepend run it automatically as a part of system
building without much appreciation for what it is doing.

Makedepend has a documented undependability!!

In the man page for makedepend, under the heading "Bugs", we find the
remarkable statement:

   "If you do not have the source for cpp, the Berkeley Unix C preprocessor,
   then makedepend will be compiled in such a way that all #if directives
   will evaluate to "true" regardless of their actual value.  This may cause
   the wrong #include directives to be evaluated." ...

This appears to open a large hole for incorrectly building software.  If
the Makefile dependencies generated by makedepend are wrong then there is a
chance that the software built using the Makefile will also be wrong.

I think the authors of makedepend made a bad decision in allowing the
program to become functional in a situation where they knew it could get
the wrong answer.  To their credit, at least they documented the problem.
If the source for Berkeley cpp is unavailable then it shouldn't be possible
to compile makedepend.

Risks:
  - Wrong answers from makedepend could lead to wrong software.
  - The ordinary user of makedepend has no way of knowing whether the
    makedepend they are using will get the right answer or not.
  - Many users of makedepend are unaware that makedepend can get the
    wrong answer.
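
To make the failure mode concrete, here is a hedged illustration of my own;
the header names are hypothetical, and the real-world case in the next item
shows the same effect with Sun's Xos.h.  Given a source file containing:

    /* foo.c: hypothetical example; only one of the two headers exists
       on any given system. */
    #if defined(SYSV)
    # include "sysv_io.h"
    #else
    # include "bsd_io.h"
    #endif

a makedepend built without the cpp source treats both the #if and the #else
branch as taken, so foo.o is recorded in the Makefile as depending on both
sysv_io.h and bsd_io.h.  At best the phantom dependency makes the build fail
on a missing file; at worst makedepend's output simply does not match what
the compiler will actually include, and the Makefile quietly stops
guaranteeing correct rebuilds.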
------------------------------
Date: Mon, 17 May 1993 20:02:09 -0400
From: Dave Wortman <[email protected]>
Subject: makedepend problem - a real world example
Shortly after posting my message concerning problems with makedepend, I
came across an unsolicited example of the problem in a posting by Michael
Turok ([email protected]) to the comp.windows.x newsgroup:

  ...

  makedepend chokes on one of the X11 include files (as distributed
  by Sun) - namely Xos.h:

    #if !defined(SUNOS41) || defined(__STDC__)
    # include <string.h>
    # define index strchr
    # define rindex strrchr
    #else /* BSD && !__STDC__ */
    # include <strings.h>
    #endif /* !SUNOS41 || __STDC__ */

  Here 'makedepend' evaluates both the #if and the corresponding #else
  branches to 'true' and tries to open the file <strings.h>, which doesn't
  exist under Solaris 2.  ...
------------------------------
Date: Sun 16 May 1993 14:21 -0400
From: [email protected]
Subject: Business Week: ADP Flubs
There is an article in the May 24, 1993, issue of Business Week covering
errors in Automatic Data Processing's handling of key tabulations and
statistics, including undercounting the shareholder votes for a company
(the 72.4 million votes counted were not enough) and misreporting key
government labor statistics in the late 1980s, which exaggerated job losses
until 1991 by 540,000 jobs.

To my mind the issue is not so much the ADP errors as the dependence we
have on such calculations without effective fallback and reasonableness
checks.  Errors, while not forgivable, will occur, whether from a
programming error, a management oversight, or a systemic flaw.

As we build these statistics into policy, the best we can hope for is that
there is some effective feedback to reduce volatility.  My fear is that bad
statistics can lead to policies that only amplify the problem.  Can
hyperinflation be caused by misreporting CPI figures?  Perhaps the job
figures influenced the recent election.
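
As a small illustration of the kind of reasonableness check being asked for
(entirely my own sketch; the 5% threshold and the index values are invented,
not ADP's or anyone's actual practice), a reporting pipeline could refuse to
publish a figure that jumps implausibly far from its recent history without
human review:

    /* Flag a newly computed statistic whose change from the previous value
       exceeds a plausibility threshold, so it is reviewed before release.
       Threshold and data are invented for illustration. */
    #include <stdio.h>
    #include <math.h>

    static int needs_review(double previous, double current, double max_rel_change)
    {
        return fabs(current - previous) / fabs(previous) > max_rel_change;
    }

    int main(void)
    {
        double last_month = 109.5, this_month = 143.8;   /* invented values */

        if (needs_review(last_month, this_month, 0.05))
            printf("WITHHELD: %.1f -> %.1f exceeds the 5%% sanity bound\n",
                   last_month, this_month);
        else
            printf("published: %.1f\n", this_month);
        return 0;
    }

A crude bound like this catches only gross jumps; slow drift would still
require cross-checks against independent sources.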
------------------------------
Date: Sat, 15 May 1993 19:22:15 +0100
From: Olivier MJ Crepin-Leblond <[email protected]>
Subject: Mobile Phones and airbags (was: Opel Corsa Stops...)
In Risks-Digest 14.61, Øystein Gulbrandsen relays a description of the
effect of a mobile phone on the electronics of an Opel (or Vauxhall here in
the UK :-) ) Corsa engine.

By coincidence, a similar subject came up in a conversation I had with a
colleague last week.  This is second-hand information, but it appears that
the electronics for some AIRBAG impact safety systems were also affected by
mobile telephones, and that in a few cases the airbags inflated
spontaneously while the car was being driven, and of course without any
impact.

It would be interesting if anyone could confirm this.  Before hearing the
Opel Corsa story I was most skeptical about it, but now...

Olivier M.J. Crepin-Leblond, Digital Comms. Section, Elec. Eng. Department
Imperial College of Science, Technology and Medicine, London SW7 2BT, UK
------------------------------
Date: Sun, 16 May 1993 11:30:25 EST
From: Marc Rotenberg <[email protected]>
Subject: Re: Denning on NIST/NSA Revelations (Sobel, RISKS-14.59)
David Sobel, CPSR Legal Counsel, wrote in RISKS DIGEST 14.59:

  >> The proposed DSS was widely criticized within the computer
  >> industry for its perceived weak security and inferiority to an
  >> existing authentication technology known as the RSA algorithm.
  >> Many observers have speculated that the RSA technique was
  >> disfavored by NSA because it was, in fact, more secure than the
  >> NSA-proposed algorithm and because the RSA technique could also
  >> be used to encrypt data very securely.

Dorothy Denning responded in RISKS Digest 14.60:

  > This is terribly misleading.  NIST issued the DSS proposal along with a
  > public call for comments as part of their normal practice with proposed
  > standards.  The community responded, and NIST promptly addressed the
  > security concerns.  Among other things, the DSS now accommodates longer
  > keys (up to 1024 bits).  As a result of the revisions, the DSS is now
  > considered to be just as strong as RSA.

Denning has to be kidding.  The comments on the proposed DSS were uniformly
critical.  Both Marty Hellman and Ron Rivest questioned the desirability of
the proposed standard.

One of the reasons for the concern was the secrecy surrounding the
development of the standard.  The documents disclosed by NIST and NSA to
CPSR make clear that NSA used its classification authority to frustrate the
attempts of even NIST's scientists to assess the candidate algorithm.

This is not part of "normal practice."  In fact, NSA's efforts to blindfold
NIST and the secrecy surrounding the process violated the central intent of
the Computer Security Act, the very law that governs the relationship
between NIST and NSA.
Marc Rotenberg, CPSR Washington office
------------------------------
Date: Sat, 15 May 93 22:56 EDT
From: [email protected]
Subject: Dr. D. Denning on DSS v. RSA (Sobel, RISKS-14.59)
The point is that NSA dislikes RSA "because (it) could also be used to
encrypt data very securely."  While it may be true that DSS with a 1024-bit
modulus is as secure for digital signatures as RSA, it does not meet either
the congressional mandate or the requirement.

The congressional mandate was for a public-key standard, not for a digital
signature standard.  The requirement is for a mechanism for key exchange.
The DSS is a ruse; it is an attempt to appear to meet the congressional
mandate without meeting the requirement.

I think that the CPSR statement is both accurate and to the point.

William Hugh Murray, Executive Consultant, Information System Security
49 Locust Avenue, Suite 104; New Canaan, CT 06840   1-0-ATT-0-700-WMURRAY
------------------------------
Date: Mon, 17 May 93 14:05:18 PDT
From: [email protected] (Jim Bidzos)
Subject: NIST Answers to Jim Bidzos' Questions
Date: Mon, 17 May 1993 16:44:28 -0400 (EDT)
From: [email protected]
Subject: Answers to Your Questions
To: [email protected]

To: Mr. Jim Bidzos, RSA Data Security, Inc.
From: Ed Roback, NIST

Mr. Ray Kammer asked me to forward to you our answers to the questions you
raised in your e-mail of 4/27.  We've inserted our answers in your original
message.
------------------------------------------------------

From: SMTP%"[email protected]" 27-APR-1993 03:13:12.75
To: [email protected]
CC:
Subj: Clipper questions

  ...

Date: Tue, 27 Apr 93 00:11:50 PDT
From: [email protected] (Jim Bidzos)
Here are some questions about the Clipper program I would like to submit.

Much has been said about Clipper and Capstone (the term Clipper will be
used to describe both) recently.  Essentially, Clipper is a
government-sponsored tamper-resistant chip that employs a classified
algorithm and a key escrow facility that allows law enforcement, with the
cooperation of two other parties, to decipher Clipper-encrypted traffic.
The stated purpose of the program is to offer telecommunications privacy to
individuals, businesses, and government, while protecting the ability of
law enforcement to conduct court-authorized wiretapping.
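
[The two-party escrow idea is easy to illustrate.  The sketch below is a
minimal illustration of two-component key splitting, using an 80-bit key to
match the announced Skipjack key size; it is not the actual classified
implementation, and rand() is merely a stand-in for a real key source.]

    /* Split a unit key into two escrow components: either component alone
       reveals nothing about the key, and XORing the two restores it.
       Illustration only, not the real Clipper internals. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define KEYBYTES 10                       /* 80-bit unit key */

    int main(void)
    {
        unsigned char key[KEYBYTES], share1[KEYBYTES], share2[KEYBYTES];
        int i;

        srand((unsigned)time(NULL));
        for (i = 0; i < KEYBYTES; i++) {
            key[i]    = (unsigned char)(rand() & 0xff);   /* the unit key */
            share1[i] = (unsigned char)(rand() & 0xff);   /* escrow agent 1 */
            share2[i] = key[i] ^ share1[i];               /* escrow agent 2 */
        }

        /* Only the combination of both escrowed components recovers the key. */
        for (i = 0; i < KEYBYTES; i++)
            if ((share1[i] ^ share2[i]) != key[i])
                return 1;
        printf("both escrow components combined: unit key recovered\n");
        return 0;
    }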
The announcement said, among other things, that there is currently no plan
to attempt to legislate Clipper as the only legal means to protect
telecommunications.  Many have speculated that Clipper, since it is only
effective in achieving its stated objectives if everyone uses it, will be
followed by legislative attempts to make it the only legal
telecommunications protection allowed.  This remains to be seen.
>>>> NIST: There are no current plans to legislate the use of Clipper.
           Clipper will be a government standard, which can be - and likely
           will be - used voluntarily by the private sector.  The option
           for legislation may be examined during the policy review ordered
           by the President.
The proposal, taken at face value, still raises a number of serious
questions.  What is the smallest number of people who are in a position to
compromise the security of the system?  This would include people employed
at a number of places such as Mykotronx, VLSI, NSA, FBI, and at the trustee
facilities.  Is there an available study on the cost and security risks of
the escrow process?
>>>> NIST: It will not be possible for anyone from Mykotronx, VLSI, NIST,
           NSA, FBI (or any other non-escrow holder) to compromise the
           system.  Under current plans, it would be necessary for three
           persons, one from each of the escrow trustees and one who knows
           the serial number of the Clipper Chip which is the subject of
           the court authorized electronic intercept by the outside law
           enforcement agency, to conspire in order to compromise escrowed
           keys.  To prevent this, it is envisioned that every time a law
           enforcement agency is provided access to the escrowed keys there
           will be a record of same referencing the specific lawful
           intercept authorization (court order).  Audits will be performed
           to assure strict compliance.  This duplicates the protection
           afforded nuclear release codes.  If additional escrow agents are
           added, one additional person from each would be required to
           compromise the system.  NSA's analysis on the security risks of
           the escrow system is not available for public dissemination.
How were the vendors participating in the program chosen?  Was the process
open?

>>>> NIST: The services of the current chip vendors were obtained in
           accordance with U.S. Government rules for sole source
           procurement, based on unique capabilities they presented.
           Criteria for selecting additional sources will be forthcoming
           over the next few months.

           AT&T worked with the government on a voluntary basis to use the
           "Clipper Chip" in their Telephone Security Device.  Any vendors
           of equipment who would like to use the chips in their equipment
           may do so, provided they meet proper government security
           requirements.
A significant percentage of US companies are or have been the subject of an
investigation by the FBI, IRS, SEC, EPA, FTC, and other government
agencies.  Since records are routinely subpoenaed, shouldn't these
companies now assume that all their communications are likely compromised
if they find themselves the subject of an investigation by a government
agency?  If not, why not?
>>>> NIST: No.  First of all, there is strict and limited use of
           subpoenaed material under the Federal Rules of Criminal
           Procedure and sanctions for violation.  There has been no
           evidence to date of Governmental abuse of subpoenaed material,
           be it encrypted or not.  Beyond this, other Federal criminal and
           civil statutes protect and restrict the disclosure of
           proprietary business information, trade secrets, etc.  Finally,
           of all the Federal agencies cited, only the FBI has statutory
           authority to conduct authorized electronic surveillance.
           Electronic surveillance is conducted by the FBI only after a
           Federal judge agrees that there is probable cause indicating
           that a specific individual or individuals are using
           communications in furtherance of serious criminal activity and
           issues a court order to the FBI authorizing the interception of
           the communications.
What companies or individuals in industry were consulted (as stated in the
announcement) on this program prior to its announcement?  (This question
seeks to identify those who may have been involved at the policy level;
certainly AT&T, Mykotronx and VLSI are part of industry, and surely they
were involved in some way.)
>>>> NIST: To the best of our knowledge: AT&T, Mykotronx, VLSI, and
           Motorola.  Other firms were briefed on the project, but not
           "consulted," per se.
Is there a study available that estimates the cost to the US government of
the Clipper program?
>>>> NIST: No studies have been conducted on a government-wide basis to
           estimate the costs of telecommunications security technologies.
           The needs for such protection are changing all the time.
There are a number of companies that employ non-escrowed cryptography in
their products today.  These products range from secure voice, data, and
fax to secure email, electronic forms, and software distribution, to name
but a few.  With over a million such products in use today, what does the
Clipper program envision for the future of these products and the many
corporations and individuals that have invested in and use them?  Will the
investment made by the vendors in encryption-enhanced products be
protected?  If so, how?  Is it envisioned that they will add escrow
features to their products or be asked to employ Clipper?
>>>> NIST: Again, the Clipper Chip is a government standard which can be
           used voluntarily by those in the private sector.  We also point
           out that the President's directive on "Public Encryption
           Management" stated: "In making this decision, I do not intend to
           prevent the private sector from developing, or the government
           from approving, other microcircuits or algorithms that are
           equally effective in assuring both privacy and a secure
           key-escrow system."  You will have to consult directly with
           private firms as to whether they will add escrow features to
           their products.
Since Clipper, as currently defined, cannot be implemented in software,
what options are available to those who can benefit from cryptography in
software?  Was a study of the impact on these vendors or of the potential
cost to the software industry conducted?  (Much of the use of cryptography
by software companies, particularly those in the entertainment industry, is
for the protection of their intellectual property.)
>>>> NIST: You are correct that, currently, Clipper Chip functionality can
           only be implemented in hardware.  We are not aware of a solution
           to allow lawfully authorized government access when the key
           escrow features and encryption algorithm are implemented in
           software.  We would welcome the participation of the software
           industry in a cooperative effort to meet this technical
           challenge.  Existing software encryption use can, of course,
           continue.
Banking and finance (as well as general commerce) are truly global today.
Most European financial institutions use technology described in standards
such as ISO 9796.  Many innovative new financial products and services will
employ the reversible cryptography described in these standards.  Clipper
does not comply with these standards.  Will US financial institutions be
able to export Clipper?  If so, will their overseas customers find Clipper
acceptable?  Was a study of the potential impact of Clipper on US
competitiveness conducted?  If so, is it available?  If not, why not?
>>>> NIST: Consistent with current export regulations applied to the
           export of the DES, we expect U.S. financial institutions will be
           able to export the Clipper Chip on a case by case basis for
           their use.  It is probably too early to ascertain how desirable
           their overseas customers will find the Clipper Chip.  No formal
           study of the impact of the Clipper Chip has been conducted since
           it was, until recently, a classified technology; however, we are
           well aware of the threats from economic espionage from foreign
           firms and governments and we are making the Clipper Chip
           available to provide excellent protection against these threats.
           As noted below, we would be interested in such input from
           potential users and others affected by the announcement.  Use of
           other encryption techniques and standards, including ISO 9796
           and the ISO 8730 series, by non-U.S. Government entities (such
           as European financial institutions) is expected to continue.
I realize they are probably still trying to assess the impact of Clipper,
but it would be interesting to hear from some major US financial
institutions on this issue.
>>>> NIST: We too would be interested in hearing any reaction from these
           institutions, particularly if such input can be received by the
           end of May, to be used in the Presidentially-directed review of
           government cryptographic policy.
Did the administration ask these questions (and get acceptable answers)
before supporting this program?  If so, can they share the answers with us?
If not, can we seek answers before the program is launched?
>>>> NIST: These and many, many others were discussed during the
           development of the Clipper Chip key escrow technology and the
           decision-making process.  The decisions reflect those
           discussions and offer a balance among the various needs of
           corporations and citizens for improved security and privacy and
           of the law enforcement community for continued legal access to
           the communications of criminals.
------------------------------
End of RISKS-FORUM Digest 14.62