Subject: RISKS DIGEST 15.44
REPLY-TO: [email protected]

RISKS-LIST: RISKS-FORUM Digest  Weds 2 February 1994  Volume 15 : Issue 44

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

 Contents:
Canada to monitor phone calls, fax, etc.? (Walter C. Daugherity)
Risks of cliche collisions on the information superhighway (Phil Agre)
Irrational risk research (Phil Agre)
Czech computer fraud (Mich Kabay)
Clipper Petition (Dave Banisar)
"Digital Woes" by Lauren Wiener (Rob Slade)

The RISKS Forum is a moderated digest.  Its USENET equivalent is comp.risks.
Undigestifiers are available throughout the Internet, but not from RISKS.
Contributions should be relevant, sound, in good taste, objective, cogent,
coherent, concise, and nonrepetitious.  Diversity is welcome, but not
personal attacks.  CONTRIBUTIONS to [email protected], with appropriate,
substantive "Subject:" line; others may be ignored!  Contributions will not
be ACKed; the load is too great.  **PLEASE** include your name & legitimate
Internet FROM: address, especially .UUCP folks.  If you cannot read RISKS
locally as a newsgroup (e.g., comp.risks), or you need help, send requests
to [email protected] (not automated).  BITNET users may subscribe
via your favorite LISTSERV: "SUBSCRIBE RISKS".

Vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>YourName<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 15, j always TWO digits).
Vol i summaries in j=00; "dir risks-*.*<CR>" gives directory; "bye<CR>"
logs out. The COLON in "CD RISKS:" is vital. CRVAX.SRI.COM = [128.18.30.65];
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.
WAIS and [email protected] are alternative repositories.

 IF YOU CANNOT GET RISKS ON-LINE, you may be interested in receiving it
 via fax; phone +1 (818) 225-2800, or fax +1 (818) 225-7203 for info
 regarding fax delivery.  PLEASE DO NOT USE THOSE NUMBERS FOR GENERAL
 RISKS COMMUNICATIONS; as a last resort you may try phone PGN at
 +1 (415) 859-2375 if you cannot E-mail [email protected] .

ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues
of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Wed, 2 Feb 94 09:04:52 -0600
From: [email protected] (Walter C. Daugherity)
Subject: Canada to monitor phone calls, fax, etc.?

From EDUPAGE:

HIGH-TECH SNOOP GADGET.  A super-secret branch of the Canadian Security
Intelligence Service has awarded three contracts to a Montreal firm to make
equipment that can quickly isolate key words and phrases from millions of
airborne phone, fax, radio signals and other transmissions. The hardware has
the "Orwellian potential to sweep through ... and keep records of all
conversations," said one CSIS critic.  (CTV National News, 01/31/94 11:00 pm).

Walter C. Daugherity, Texas A & M University, College Station, TX 77843-3112
Internet, NeXTmail: [email protected]      uucp: uunet!cs.tamu.edu!daugher

------------------------------

Date: Tue, 1 Feb 1994 12:06:51 -0800
From: Phil Agre <[email protected]>
Subject: Risks of cliche collisions on the information superhighway

The 1 Feb 1994 Wall Street Journal's front page center column is about the
metaphors generated by the phrase "information superhighway", which (all are
reported to agree) Al Gore coined by analogy to the US interstate highway
system.  What the article, like the vast majority of recent articles on the
topic, misses is the whole point of the "highway" metaphor: the proposition
that long-distance vehicle/information transfer may well be a natural
monopoly, thus calling for the creation of some kind of public utility.  This
is a fairly spectacular example of the organized forgetting that goes on in
the "agenda setting" process in US politics (and those, no doubt, of other
countries, in their own ways).  The risk here is that these semantical magic
tricks may in the end deprive the public of the information infrastructure
they deserve.

Evidence on this count is readily available, it so happens, in the 2/1/94 New
York Times (business section, page C6), where we are told that investors are
terrified that the Bell Atlantic-TCI merger might actually "lead to a
destructively competitive 'two-wire world', where phone and cable companies"
would construct competing networks.  Although the analysts are alarmed, Bell
Atlantic has reassured its investors that it will be doing its best to avoid
that scenario by focusing first on relatively high-profit (i.e.,
uncompetitive) markets.

Heads up.

Phil Agre, UCSD

  [We are going to see all sorts of metaphors springing up on the
  InfoSuperhighway, such as speeding (illicit acts), speedtraps (network
  monitoring to detect misuse), parking lots (traffic congestion),
  StopAndShop (overload from 300 channels of home shopping), deprogramming
  services (for the compulsive shopper), and designated drivers (the
  device drivers you can trust).  Maybe even BurmaShave signs scattering
  doggerel poetry along the way for the oldtimers.  PGN]

------------------------------

Date: Tue, 1 Feb 1994 20:09:14 -0800
From: Phil Agre <[email protected]>
Subject: Irrational risk research

The 1 Feb 1994 New York Times (science section) includes an article by Daniel
Goleman entitled "Hidden rules often distort ideas of risk".  It's about a set
of social psychological ideas about "perceptions" of risk that become
newsworthy about once a year despite never seeming to change.  These include
the following:

* Risks that are imposed loom larger than those that are voluntary.
* Risks that seem unfairly shared are also seen as more hazardous.
* Risks that people can take steps to control are more acceptable
  than those they feel are beyond their means to control.
* Natural risks are less threatening than man-made ones.
* Risks that are associated with catastrophes are especially frightening.
* Risks from exotic technologies cause more dread than do those
  involving familiar ones.

The article reports a spectrum of views about the best explanation of these
results and the best policies to deal with them.  This spectrum might be
categorized as follows:

 Conservative:  People are irrational, so forget 'em.
 Moderate:      People are irrational, but we can persuade them.
 Liberal:       People are irrational, but hey, everyone has faults,
                so let's humor them a little.

The common element, of course, is a view of ordinary people as
irrational because their rankings of the risks from various technologies are
considerably different from those of the experts.

What somehow never ceases to amaze me is that all three approaches neglect a
perfectly obvious explanation, which is that people distrust the institutions
that seek to reassure them about unfamiliar technologies, having been
repeatedly and egregiously lied to in the past by many of those same
institutions (they were feeding plutonium to *whom*?), and they resent living
in a world dominated by such institutions, so they refuse to acknowledge the
claims of any technological project that has not been organized and evaluated
in a democratic way.  (The article does remark that people don't trust the
numbers, but that's apparently because people irrationally fail to weigh the
nuclear power plants that blow up against all the ones that don't.)  Probably
that's too simple, but it explains the data much more straightforwardly than
the known alternatives.

The interesting sociological question is how this feat of conceptual
constriction actually *works*.  Does this explanation literally never occur to
the people who do this research and write these articles?  How can that be?
Is it a conscious PR thing?  That would be disappointing in a way (too
straightforward), but it's certainly true enough with numerous other issues.

Clearly, as articles on the science pages so often conclude, further research
is needed.

Phil Agre, UCSD

------------------------------

Date: 30 Jan 94 14:53:20 EST
From: "Mich Kabay / JINBU Corp." <[email protected]>
Subject: Czech computer fraud (More on RISKS-15.22)

From the Associated Press newswire via Executive News Service (GO ENS) on
CompuServe:

Czech-Computer Fraud

   PRAGUE, Czech Republic (AP, 19 Jan 1994) -- A bank employee was sentenced
 to eight years in prison for stealing nearly $1.2 million in the Czech
 Republic's first major computer fraud, a newspaper reported Wednesday.
 Martin Janku, an employee of the Czech Savings Bank in Sokolov, transferred
 money to his own account in the bank with the help of his own computer
 program between September 1991 and April 1992, the daily Mlada Fronta Dnes
 said.

The article continues with a few details:

o    Janku arrested when he tried to withdraw money from a branch where a
    teller recognized him as a programmer she'd met during training;

o    sentenced to 8 years in jail;

o    claims he was testing bank security;

o    returned about $1 million of the money he stole; the rest, he says, was
    stolen from his car.

    [Moral: never test someone's security systems without written
    authorization from the right people.]

Michel E. Kabay, Ph.D., Director of Education, National Computer Security Assn

------------------------------

Date: Mon, 31 Jan 1994 15:59:20 EST
From: Dave Banisar <[email protected]>
Subject: Clipper Petition

               Electronic Petition to Oppose Clipper
                     Please Distribute Widely

On January 24, many of the nation's leading experts in cryptography
and computer security wrote President Clinton and asked him to
withdraw the Clipper proposal.

The public response to the letter has been extremely favorable,
including coverage in the New York Times and numerous computer and
security trade magazines.

Many people have expressed interest in adding their names to the
letter.  In  response to these requests, CPSR is organizing an
Internet petition drive to oppose the Clipper proposal.  We will
deliver the signed petition to the White House, complete with the
names of all the people who oppose Clipper.

To sign on to the letter, send a message to:

    [email protected]

with the message "I oppose Clipper" (no quotes)

You will receive a return message confirming your vote.
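
For those who would rather script it, the following is a minimal Python sketch
of sending that message, assuming a reachable SMTP relay.  The recipient
address and message body are exactly those given above; the sender address,
subject line, and relay host are placeholders you would supply yourself.

import smtplib
from email.message import EmailMessage

def sign_petition(sender, relay="localhost"):
    """Send the one-line petition message to CPSR via an SMTP relay."""
    msg = EmailMessage()
    msg["From"] = sender                       # replace with your own address
    msg["To"] = "[email protected]"
    msg["Subject"] = "Clipper petition"        # subject text is a placeholder
    msg.set_content("I oppose Clipper")        # the exact body requested above
    with smtplib.SMTP(relay) as smtp:
        smtp.send_message(msg)

# Example: sign_petition("[email protected]")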

Please distribute this announcement so that others may also express
their opposition to the Clipper proposal.

CPSR is a membership-based public interest organization.  For
membership information, please email [email protected].  For more
information about Clipper, please consult the CPSR Internet Library -
FTP/WAIS/Gopher CPSR.ORG /cpsr/privacy/crypto/clipper

=====================================================================

The President
The White House
Washington, DC  20500

Dear Mr. President:

    We are writing to you regarding the "Clipper" escrowed encryption
proposal now under consideration by the White House.  We wish to express our
concern about this plan and similar technical standards that may be proposed
for the nation's communications infrastructure.

    The current proposal was developed in secret by federal agencies
primarily concerned about electronic surveillance, not privacy protection.
Critical aspects of the plan remain classified and thus beyond public review.

    The private sector and the public have expressed nearly unanimous
opposition to Clipper.  In the formal request for comments conducted by the
Department of Commerce last year, less than a handful of respondents supported
the plan.  Several hundred opposed it.

    If the plan goes forward, commercial firms that hope to develop new
products will face extensive government obstacles. Cryptographers who wish to
develop new privacy enhancing technologies will be discouraged.  Citizens who
anticipate that the progress of technology will enhance personal privacy will
find their expectations unfulfilled.

    Some have proposed that Clipper be adopted on a voluntary basis and
suggest that other technical approaches will remain viable.  The government,
however, exerts enormous influence in the marketplace, and the likelihood that
competing standards would survive is small.  Few in the user community believe
that the proposal would be truly voluntary.

    The Clipper proposal should not be adopted.  We believe that if this
proposal and the associated standards go forward, even on a voluntary basis,
privacy protection will be diminished, innovation will be slowed, government
accountability will be lessened, and the openness necessary to ensure the
successful development of the nation's communications infrastructure will be
threatened.

    We respectfully ask the White House to withdraw the Clipper proposal.
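
As background on the term "escrowed encryption": under the Clipper scheme as
publicly described, each chip's unit key is split into two components held by
separate escrow agents and recombined only under legal authorization.  The
Python sketch below illustrates only the splitting idea (two XOR shares), not
the actual Skipjack/LEAF mechanism.

import os

def split_key(unit_key):
    """Split a key into two XOR shares; either share alone reveals nothing."""
    share_a = os.urandom(len(unit_key))
    share_b = bytes(x ^ y for x, y in zip(unit_key, share_a))
    return share_a, share_b

def recombine(share_a, share_b):
    """XOR the two escrowed shares together to recover the unit key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

# Example:
# key = os.urandom(10)            # an 80-bit unit key, as in Clipper
# a, b = split_key(key)           # one share to each escrow agent
# assert recombine(a, b) == key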

------------------------------

Date: Mon, 31 Jan 1994 15:09:40 -0600 (MDT)
From: "Rob Slade, Ed. DECrypt & ComNet, VARUG rep, 604-984-4067"
<[email protected]>
Subject: "Digital Woes" by Lauren Wiener

Lauren Wiener, "Digital Woes", Addison-Wesley Publishing Co., 1993,
0-201-62609-8, U$22.95/C$29.95

When reviewing books on technical topics, one quickly learns to dread the work
of those who do not actually practice in the field.  (Yes, we are told that
Wiener is a technical writer.  They may very well be professionals, but the
overwhelming majority are not technical professionals.)  With this prejudice
firmly in place, it came as a delightful surprise to find that "Digital Woes"
is an accurate, well-researched, and thoroughly engaging treatment of the
subject of software risks.

Chapter one is a list of specific examples of software failures, large and
small.  The stories are thoroughly documented and well told.  The choice of
examples is careful, and useful as well, covering a variety of problems.  One
could, of course, add to the list.  In the virus field programs are extremely
limited in function and rarely exceed 3000 bytes in length, yet almost every
viral strain shows some programming pathology; most of the damage seems to be
done by mistake.  The user interfaces of antivirals are subject to hot debate,
perhaps more importantly than in other systems because of the risks involved in
misunderstanding.  In regard to decision support, I recall the assumption, on
the part of Excel, that everyone wants to use linear forecasting.  Everyone
involved in technical fields will be able to add other specific examples.  For
those uninvolved, Wiener's work is quite sufficient and convincing.

Chapter two is an explanation of why software contains bugs, and why software
errors are so deadly.  Techies will feel somewhat uncomfortable with the lack
of jargon, but persevere.  Initially, I thought she had missed the point of the
difference between analogue and digital systems--until I realized I was in the
middle of a complete and clear explanation that never had to use the word
"analog".  (Technopeasants will, of course, appreciate the lack of jargon.
Rest assured that the same ease of reading and clarity of language holds
throughout the book.)

Chapter three examines the various means used to try to ensure the reliability
of software--usually with a depressing lack of success.  As with all who have
worked in the field, I can relate to the comments regarding the difficulty of
testing.  At one point I uncovered a bug in the third minor variant of the
fourth major release of the fifth generation of a communications program.
Apparently I was the first person on staff who had ever wanted to keep a
running log between sessions--and the functions I used combined to completely
lock up the computer.

Most RISKS-FORUM readers will by now be nodding and muttering, "So what else
is new".  However, Wiener here proves herself capable of some valuable and
original contributions beyond the pronouncements of those working in the
field.  Noting that she is familiar with programmers who have never, in twenty
years of work, had their code incorporated into a delivered product, she
raises the issue of what this type of work environment does to the psyche of
the worker.  My grandfather carved the wooden decorations in our church, and,
fifty years after his death, I can still point that out.  However, in a career
of analysis, training and support, I can point to little beyond an amount of
Internet bandwidth consumed.  (Many would say "wasted".)  To the ephemeral
nature of the craft, though, one must add the legacy of constant failure.
Martin Seligman's "Learned Helplessness" points out the danger quite clearly.
A similar thought was voiced some years ago over the impact on developing
youth of the then new video games, and the fact that you could advance through
levels but never, ultimately, win.  These children are grown now.  You may
know them as "Generation X".

Chapter four deals with means to prevent failure.  Actually most of the
material discusses recovery--assuming that the system will eventually fail, how
to ensure that the failure causes the least damage.

Chapter five is entitled "Big Plans" and looks at various proposed new
technologies and the risks inherent in them.  In this discussion Wiener warns
against those who are overly thrilled with the promises of the new technology.
I agree, but I would caution that public debate is also dominated by those
strident with fear.  The arguments of both sides tend to become entrenched in
the effort to defeat the opposition, while the public itself sits bemused in
the middle without
knowing whom to believe.  It is a major strength of Wiener's work that the
field is explored thoroughly and in an unbiased manner.

Many books which try to present an objective view of a controversial problem
tend to trail off into meaningless weasel-words, but the final chapter here
concerns "The Wise Use of Smart Stuff."  Wiener lists a good set of criteria to
use in evaluating a proposed system.  The one item I would recommend be toned
down is the axiom that personal care be excluded.  I keep an old Berke Breathed
"Bloom County" cartoon in my office wherein Opus, the Penguin, berates a
computer for depriving him of his humanity until the bemused machine attempts
to confirm that Opus is human.  The perceived coldness of our institutions is
often illusory.  I once worked in a geriatric hospital and thought it a shame
that our culture did not keep aging parents at home.  Until, that is, I lived
in a culture that did, and found that the "technology" of our hospitals
provided more human contact to the old folks than did the "organic" home care.
I also note that the belittled ELIZA is the only program to have passed the
Turing test so far.  A limited, unexpected, and hilarious pass, perhaps, but a
pass nonetheless.

I note, as I am reviewing this book, a press release by a headhunting agency
claiming that half of all executives are computer illiterate.  The survey
method is extremely suspect, and I assume the figures are charitable to the
point of being ridiculous.
I would heartily recommend this work to technical and non-technical workers
alike.  Particularly, though, I recommend it to those executives who are the
ones to make the ultimate decisions on major projects.  Please re-read it after
the next vendor demo you attend.

copyright Robert M. Slade, 1993   BKDGTLWO.RVW  931223

Postscriptum - my wife agrees with Peter Denning that I tend to editorialize in
my reviews.  This is likely true.  "Digital Woes", however, deals with a topic
which has prompted many editorials--and deals with it well.
Permission granted to distribute with unedited copies of the Digest
======================
DECUS Canada Communications, Desktop, Education and Security group newsletters
Editor and/or reviewer [email protected], [email protected], Rob Slade at 1:153/733
DECUS Symposium '94, Vancouver, BC, Mar 1-3, 1994, contact: [email protected]

------------------------------

End of RISKS-FORUM Digest 15.44
************************