Subject: RISKS DIGEST 14.40
REPLY-TO: [email protected]

RISKS-LIST: RISKS-FORUM Digest  Tuesday 16 March 1993  Volume 14 : Issue 40

       FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

 Contents:
Garage door burglaries (Chuck Payne)
MCI 800 problem (Andrew Marchant-Shapiro)
System Dynamics of Risks (Dan Yurman via Bill Park)
Facing the Challenge of Risk and Vulnerability in Information Society
 (Klaus Brunnstein)

The RISKS Forum is a moderated digest discussing risks; comp.risks is its
Usenet counterpart.  Undigestifiers are available throughout the Internet,
but not from RISKS.  Contributions should be relevant, sound, in good taste,
objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to [email protected], with appropriate, substantive
"Subject:" line.  Others may be ignored!  Contributions will not be ACKed.
The load is too great.  **PLEASE** INCLUDE YOUR NAME & INTERNET FROM: ADDRESS,
especially .UUCP folks.  REQUESTS please to [email protected].

Vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 14, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*<CR>" gives directory; "bye<CR>" logs out.
The COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.

For information regarding delivery of RISKS by FAX, phone 310-455-9300
(or send FAX to RISKS at 310-455-2364, or EMail to [email protected]).

ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues
of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: 11 Mar 1993 14:07:21 -0500 (CDT)
From: "Chuck Payne -- Quad/Tech S&R -- Ext. 7976" <[email protected]>
Subject: Garage door burglaries

The newspapers in Milwaukee reported an interesting case this morning.

An installer of automatic garage door openers has been arrested and is
expected to be formally charged with burglary.  He is accused of recording the
electronic code settings on the automatic garage doors he installed, and then
returning some time later and opening the doors electronically.  He has been
accused of an entire string of burglaries, and stolen goods were apparently
found in his home.  In some cases, the garage door was opened and the inside
door to the rest of the house was unlocked; in other cases it was pried open.

Sounds like it might be a good idea to change the code settings on your garage
door opener if it was installed by someone else, or even serviced recently.
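
For perspective, a back-of-envelope sketch in Python of why a recorded code
is such a durable key.  The switch counts are assumptions about typical units
of that era, not details from the news report:

    # Code space of a DIP-switch garage door opener.  Switch counts
    # are assumed (roughly 8 to 12 two-position switches on typical
    # early-1990s units); check a unit's manual for the real figure.
    for switches in (8, 10, 12):
        print(f"{switches} switches -> {2 ** switches} possible codes")
    # 8 -> 256, 10 -> 1024, 12 -> 4096: few enough to note down at
    # installation time, and the code stays valid until it is changed.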

Charles D. Payne, Safety Engineer, Quad/Tech International, Div. of
Quad/Graphics Inc., Sussex, Wisconsin 414-246-7976 [email protected]

------------------------------

Date: 10 Mar 93 13:28:00 EST
From: "MARCHANT-SHAPIRO, ANDREW" <[email protected]>
Subject: MCI 800 problem

Some time ago, my parents (who live in another state) decided that, if they
were going to hear their grandchildren's voices, they needed to get a personal
800 number from MCI.  The personal 800 scheme works like this: each household
is assigned a unique 800 number (I'm told), and an access code (4 digits).  As
a precaution against abuse, when you dial the 800 number you get a message
telling you to enter your code.  Only callers who enter the correct code get
connected, so no mass-dialing scheme advertising holiday resorts (etc.) can
exploit the users' willingness to pay for incoming calls.
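
As a sketch of that gating logic (the number, code, and retry policy below
are invented for illustration; MCI's actual implementation is not public):

    # Hypothetical model of the personal-800 access-code gate.
    ASSIGNED_CODES = {"1-800-555-0100": "4321"}  # number -> 4-digit code
    MAX_ATTEMPTS = 3

    def handle_call(number: str, code: str, attempt: int) -> str:
        expected = ASSIGNED_CODES.get(number)
        if expected is None:
            return "number not in service"
        if code == expected:
            return "connect"      # the subscriber pays for the call
        if attempt >= MAX_ATTEMPTS:
            return "disconnect"   # frustrates mass-dialing abuse
        return "reprompt"

    print(handle_call("1-800-555-0100", "4321", 1))  # -> connect

The failure described below apparently sat in front of this check: after a
software upgrade, even correct codes were routed to a live operator instead
of being verified automatically.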

I promptly programmed my parents' number and the code on adjacent buttons of
my phone and left it at that.  I would just hit the first button, wait for the
announcement (voice-mail style) and hit the second button.  This worked, until
a little over a month ago.  At that point, after I hit the second button I was
asked to wait, and an operator came on the line and asked for my code.  The
first time this happened, I refused to give the code (since I had forgotten it
(!)).  A moment later, it apparently showed on the operator's console, and I
was put through.

I thought this was an aberration, but at no time after the first event was I
able to get directly through, without talking to an operator.  I thought their
equipment might not be able to handle the high speed dialer, so I relearned
the code and punched it in myself.  Still no go.  I tried from my office.
Same thing.

Finally, last week, I managed to get the operator to switch me to a technical
representative.  This individual and I discussed what was happening, and the
rep told me that he knew of another case where much the same thing had
happened.  I then asked if they had changed or upgraded their system software
lately.  Long pause.  "Why yes, we did, just about a month ago."

I suggested they check things out, and was promised a report.  Well, a couple
of days later the system WORKED!  And it has not failed again since.  I have
not received a report (nor a consulting fee from MCI), but I suspect that
MCI's upgrade of their personal 800 system included some, uh, 'features' of
which they weren't aware.  They may have gone back to the old software, or
they may have just fixed MY problem.  I don't know which.  But I am certain
that the origin of the problem had to do with a programming error in MCI's
hardware/software, and this raises the issue of other errors that might be out
there.

Should MCI employ beta testers?  That would be my suggestion.  They could pay
people like me to make trial calls at, say, 3:00 AM CST, just to make sure the
system worked as advertised.  Hey, in a world where most people can't program
an MS-DOS .BAT file, you need to check!

Andrew Marchant-Shapiro, Sociology and Political Science Depts., Union College
Schenectady NY 12308 518-370-6225 [email protected] [email protected]

------------------------------

Date: Sun, 14 Mar 93 13:20:34 -0800
From: [email protected] (Bill Park)
Subject: System Dynamics of Risks

From: [email protected]
Subject: System Dynamics of Risks
Newsgroups: sci.systems

System Dynamics of Risks: Risk Perceptions, Mental Models, Circuit Breakers

There have been a number of postings about risk and public acceptance of risks
from various technologies, e.g. nuclear, chemical, etc.  I think it's worth
reviewing some of the basics about risk perceptions.  This posting is based on
the following references for those who wish to develop their own conclusions.

"Perceptions of Risk," Slovic, Paul, _Science_, 4/17/87. Vol 236,
pp. 280-285.

"The Fifth Discipline," Senge, Peter, Doubleday, 1990.

"Technological Risk," Lewis, H.W, Norton, 1990.

This posting is done in "bullet" form so that I can show attribution to source
by concept.  Almost all of the material in this post comes from one or more of
the sources noted above.  I have merely condensed some of the key ideas.

Senge's work on system dynamics does not discuss risk
perceptions, per se, in any great detail.  I have applied
his tools for thinking about system dynamics to risk perceptions.

Finally, if I have made any errors in representing the work of
these authors, they are unintentional.  I would appreciate
clarifications where necessary.


OBJECTIVES - Slovic

*    Provide a basis for understanding and anticipating public
    perceptions of hazards.  [Note: Senge - risk perceptions are
    mental models.]

*    Improve communication of risk information among technical
    experts, lay people, and decision makers.

BACKGROUND - Slovic

*    The development of chemical and nuclear technologies has
    been accompanied by the potential to cause catastrophic and
    long-lasting damage to the earth and to the life forms that
    inhabit it.

*    The mechanisms underlying these complex technologies are
    unfamiliar and incomprehensible to most citizens.  The most
    harmful consequences of these technologies are such that
    learning to mitigate or control them is not well suited to
    management by trial-and-error.  The public has developed
    increasing levels of dread of the unknown consequences of
    complex technologies.

*    The public is well aware that economic and political
    pressures during the design process in complex systems may
    lead to systems being built and operated near the edge of
    the safety envelope. [Senge - Eroding goals]

*    Some systems, once built, represent such significant
    investments that it is nearly impossible to walk away from
    them regardless of risks. [Senge - Yesterday's solutions are
    today's problems.]  For example, nuclear waste resulting from
    the balance of terror associated with nuclear weapons.

*    Those who are responsible for human health and safety need
    to understand the ways people think about and respond to
    risk.  Perception and acceptance of risks have their roots
    in social and cultural factors and not in science.

*    The result is that some risk communication efforts may be
    irrelevant for the publics for which they are intended
    because the "publics" have hidden agendas.  Also, the public
    may be raising the issue of risk to human health and the
    environment as a surrogate for other social, economic, or
    political concerns.

*    Risk perceptions are mental maps composed of attitudes,
    beliefs, assumptions, and judgements.  Following is an
    example of the "Not in my back yard," or NIMBY mental map.

    [Senge - reinforcing, vicious loops.]

    -    Attitude:      government science is not trustworthy

    -    Belief:        government serves special interests, not
                        the public

    -    Assumption:    you can't fight city hall

    -    Judgement:     whatever it is the government is
                        proposing to do, get it out of my back
                        yard.

*    Disagreements about risk perceptions do not change as a
    result of better data becoming available and being
    disseminated to the public.   People have a hard time
    changing their opinions because of the strong influence
    initial impressions, or pre-existing biases, have on the
    interpretation of new information.  Also, the method of
    presenting the new data, e.g. as mortality or as survival
    rates, can alter perceptions of risk.

*    Generally, the gap between perceived and desired risk levels
    suggests that people are not satisfied with the ways the
    market or regulatory agencies have balanced risks and
    benefits.  Generally, people are more tolerant of risks from
    activities seen as highly beneficial, but this is not a
    systematic relationship.

*    The key factor regarding acceptance of exposure to risk
    appears to be the degree to which a person chooses that
    exposure in return for a perceived level of benefits.  The
    relationships between perceived levels of benefits and
    acceptance of risks are mediated by factors such as
    familiarity, control, potential for catastrophic
    consequences, and equity.

*    In the case of nuclear power, people's deep anxieties are
    linked to the history of negative media coverage.  Also,
    there is a strong association between public attitudes about
    nuclear power and anxieties about the proliferation of
    nuclear weapons.


Accidents as Signals - Slovic

*    The impact of accidents can extend far beyond direct harm.
    An entire industry can be affected regardless of which firm
    was responsible for the mishap.

*    Some mishaps cannot be judged solely by damage to property,
    injuries, or death.  Some events, like Three Mile Island
    (TMI), can have ripple effects on public perceptions of
    risks leading to a more hostile view of complex technologies
    in general.

*    The signal potential of an event like TMI, and thus its
    social impact, appears to be related to how well risks
    associated with the event are understood.  The difference in
    perceptions between a train wreck and a nuclear reactor
    accident is that the wreck is seen as a discrete event in
    time while the reactor problem is regarded as a harbinger of
    further catastrophic mishaps.  The relationship is between
    the degree of dread of the unknown consequences of the
    accident and the degree of subsequent irrational fear of
    future catastrophes.


Risks & Benefits - Slovic

*    Firms conducting risk assessments within the framework of
    cost-benefit analyses often fail to see the "ripple"
    effects of worst case scenarios.

*    For example, Ford Motor Co. failed to correct a design
    problem with the gas tank of its Pinto compact car.  A
    cost-benefit analysis indicated that correction costs greatly
    exceeded expected benefits from increased safety.

*    Had Ford looked at public risk perceptions of auto fires in
    crashes, the analysis might have highlighted this defect
    differently.

    -    Public perceptions of auto crashes regarded the risk of
         fire as a very high order problem involving
         considerable dread.

    -    Ford ignored potential higher order costs such as
         damage claims from lawsuits, damaged public reputation,
         lost future sales, and diminished "good will" from
         regulatory agencies.



Risk Perception & Mental Models - Senge

The logic of mental models with regard to risk perceptions is
illustrated by the following notes:

1.   Senge - Structure influences system performance

    IF:       structure influences system performance, and;

    IF:       mental models - attitudes, beliefs, assumptions,
              judgements - are part of the structure;

    THEN:          Mental models influence system performance.
                   Risk perceptions are mental models because
                   they are based on social and cultural factors
                   such as attitudes, beliefs, assumptions, and
                   judgements.


2.   Senge - The easy way out usually leads back in.

    IF:       culture is the dominant collection of shared
              mental models operating in society, and;

    IF:       risk perceptions, which are mental models, have
              their roots in social and cultural factors, and
              not in science;

    THEN:          some risk communication efforts based solely
                   on scientific data will fail since they do
                   not address mental models which are the basis
                   for risk perception.


3.   Senge - The harder you push the harder the system pushes back.

    IF:       both our private and shared mental models are
              always flawed and can get us into trouble when
              they are taken for granted, and;

    IF:       levels of dread, in terms of perceived risk of
              complex technology, are reinforced by irrational
              fears caused by the unknown but potentially
              catastrophic effects of new technologies;

    THEN:          inappropriate mental models about complex
                   technologies may be reinforced, rather than
                   mitigated, by additional "marketing" efforts
                   to promote new technologies.


Charting Mental Models About Risk - Senge

Variables are defined as elements in a system that may act or be
acted upon.  A variable can move up or down in terms of
intensity, duration, absolute or relative values, etc., but its
movement is measurable.
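
To illustrate such variables in motion, here is a minimal sketch of one of
Senge's reinforcing ("vicious") loops applied to risk perception.  The
coefficients and update rule are invented for illustration; only the loop
structure comes from the sources above.

    # A reinforcing loop: dread feeds negative coverage, which feeds
    # more dread.  Both variables are measurable and move over time.
    dread = 0.10      # perceived dread of a technology (0..1)
    coverage = 0.10   # intensity of negative media coverage (0..1)

    for step in range(5):
        dread = min(1.0, dread + 0.5 * coverage)
        coverage = min(1.0, coverage + 0.3 * dread)
        print(f"step {step}: dread={dread:.2f}  coverage={coverage:.2f}")
    # Each pass amplifies both variables -- the loop is "vicious"
    # because nothing in the structure damps it.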

Slovic - There are four areas in which variables are defined for
mental models at work in shaping risk perceptions.  Following
each variable definition is a list of factors which further
define it; a schematic sketch of the four dimensions follows the
list.

*    The degree of voluntary acceptance of the risk, e.g.
    drinking coffee (caffeine) v. second-hand smoke. (who makes
    the decision for exposure to the risk)

    -    Controllable?

    -    Consequences not fatal for individuals or groups?

    -    Equity in choice, degree of exposure?

    -    Low risk to future generations?

    -    Risks easily reduced or mitigated by individual
         choices?

    -    Risk decreases over time as more knowledge becomes
         available?

*    The level of dread of the unknown the person has about the
    risk, e.g. thermonuclear war v. car accident. (obliteration
    of the collective v. individual survival)

    -    Totally uncontrollable, e.g., Pandora's box?

    -    Catastrophic results?

    -    Consequences fatal?

    -    No equity or choice, random exposures to risks?

    -    High risks to future generations?

    -    Risk increases over time regardless of what is known
         about it?

*    The amount of knowledge the person has about the risk and
    especially its consequences, e.g. inhaling pesticide residue
    v. drinking alcoholic beverages. (imprecise science v.
    known, quantifiable data)

    -    Risks / consequences observable by trial and error,
         experimentation, or measurement?

    -    Those exposed realize the dangers?

    -    Effects / consequences separated in time and space,
         e.g., harm to future generations?

    -    Risks known to science, or exist in realm of
         "folklore?"

*    The degree of control the person has to prevent the
    consequences of system failure, e.g., riding on a snowmobile
    v. working in a coal mine. (individual control v. collective
    control)

    -    Consequences known, capable of quantification?

    -    Effects immediate?

    -    Risk well known and understood by the public and
         science?

    -    Solutions to mitigate risks work?
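
As noted above, here is a schematic encoding of the four dimensions as a
simple record, with invented 0-to-1 ratings for two everyday examples (the
ratings are illustrative guesses, not Slovic's data):

    from dataclasses import dataclass

    @dataclass
    class RiskPerception:
        voluntary: float   # degree of voluntary acceptance
        dread: float       # level of dread of the unknown
        knowledge: float   # knowledge of the risk and consequences
        control: float     # control over preventing consequences

    # Ratings are (voluntary, dread, knowledge, control), all guesses.
    coffee = RiskPerception(0.9, 0.1, 0.8, 0.9)
    nuclear = RiskPerception(0.1, 0.9, 0.2, 0.1)

    for name, r in (("coffee", coffee), ("nuclear power", nuclear)):
        print(name, r)

High dread combined with low voluntariness, knowledge, and control is the
profile the public resists most; the factors listed under each dimension
above are the questions one would ask to assign such ratings.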


General Notes on Risks and Human Factors -- the Latent Failure Syndrome -
Lewis

*    Numerous functions and services in large, complex systems
    may be dependent on unrelated events.  Large,
    technologically complex systems have "latent" failures
    within them.  These are failures which are only apparent
    under a specific set of often obscure triggering conditions.
    Examples include:

    Nuclear        Three Mile Island, Chernobyl
    Space          Challenger shuttle explosion
    Industry       Bhopal
    Environment    Exxon Valdez oil spill

*    While these disasters all have apparent triggers, in fact,
    these failures are virtually never the result of a single
    fault.

*    The risks of large system failures, with accompanying
    catastrophic consequences, accrue to the system as a whole
    rather than to individual components.

*    Pressures during the design phase [ eroding goals ] may lead
    to systems being built to operate near the edge of the
    safety envelope.

*    Logical redundancy is compromised by a lack of physical
    redundancy.  For example, separate communication channels
    are carried in the same conduit.
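
A numeric sketch of that last point (the probabilities are invented for
illustration, not taken from Lewis):

    # Two "redundant" channels: separate routes vs. a shared conduit.
    p_channel = 1e-3   # assumed failure probability of one channel
    p_conduit = 1e-4   # assumed probability the shared conduit is cut

    both_separate = p_channel ** 2                  # independent faults
    both_shared = p_conduit + (1 - p_conduit) * p_channel ** 2

    print(f"separate routes: {both_separate:.1e}")  # 1.0e-06
    print(f"shared conduit:  {both_shared:.1e}")    # ~1.0e-04

On paper the logical redundancy is intact, but the shared conduit makes the
two channels fail together roughly a hundred times more often.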


Application of the "Latent Failure" Syndrome -- nuclear/chemical
waste cleanup

1.   Senge - Today's problems come from yesterday's solutions

    IF:       public anxieties [mental models] about nuclear
              technology are linked to dread of thermonuclear
              war, and;

    IF:       existing nuclear wastes are the by-products of
              weapons' production processes;

    THEN:          the public will extend its original
                   perceptions [mental models] to cover
                   processes involving the management of the
                   wastes even though the cleanup is designed to
                   neutralize them.

2.   Senge - The cure can be worse than the disease

    IF:       the public has an intuitive grasp of the "latent
              failure syndrone" with regard to complex
              technologies, e.g., nuclear weapons production,
              and;

    IF:       the public's mental map includes a paradigm that
              "things blow up";

    THEN:          the public will assume that the perceived
                   risks of cleaning up waste from nuclear
                   weapons production are no different from those
                   of the activities that created the bombs in
                   the first place.

Comments welcome, especially on ways to make distinctions between risk
perceptions about nuclear weapons v. risk perceptions about management of
nuclear wastes.  Are there any?

Dan Yurman, PO Box 1569, Idaho Falls, ID 83403
  [email protected]  [email protected]

------------------------------

Date:  Sat, 13 Mar 1993 15:54:46 +0100
From: [email protected]
Subject: Facing the Challenge of Risk and Vulnerability in Information Society

      INTERNATIONAL FEDERATION FOR INFORMATION PROCESSING
     Working Group 9.2 - Social Accountability of Computing

             Announcement of Working Conference:
          "Facing the Challenge of Risk and Vulnerability
                  in an Information Society"
           to be held at Namur, Belgium, 20-22 May 1993

The event is jointly organised by IFIP-WG9.2 and the "Cellule Interfacultaire
de Technology Assessment" (CITA), F.U.N.D.P., Namur and sponsored by the
Belgian National Scientific Research Fund (FNRS) and the Ministry of the
French Community.


1) Background

There has been much work done on the technical aspects of risk and
vulnerability within computer systems, and on what can be done to reduce risk
and alleviate the consequences. Less attention has been paid to the risks to
which society is exposed and its vulnerability in the age of information
technology.

The problems of risk and vulnerability are not new (and aspects of risk may
even sometimes be considered to be necessary for society) but the size,
complexity, and global reach of computer systems mean that the issues raised
have acquired a much greater urgency.

The conference is an important opportunity to bring together many specialists
to address specific problems.  The scope of the conference is quite specific:
careful analysis of the concepts of risk and vulnerability, particular
experiences of both individuals and organisations, as well as professions
and other institutions of society, and responses to find new ways of meeting
these challenges.


2) Main themes of the Conference

  - Analysis of Vulnerability and Risk: Theoretical papers which seek
    to analyse the nature and types of risk in society and the ways in
    which society is vulnerable.

  - Vulnerability of the Employee and Citizen

  - Vulnerability of the Manager and Organisation: Papers that are based
    on case studies which increase our understanding of the risks faced by
    people and organisations.

  - Professional Responses

  - Societal Responses: Papers which address the question "What can be
    done?" through such means as legislation and the legal system, insurance,
    codes of ethics, codes of practice, education, etc.


3) Structure and Organization of the Working Conference

Day#1: Thursday May 20, 1993: 9.30 a.m., Plenary
BERLEUR Jacques (University of Namur, B), on behalf of IFIP-WG9.2:
       Risk and Vulnerability in an Information and Artificial Society
BEARDON Colin & HALES Mike (Brighton University, UK):
       Whose Risk?  Whose challenge? Questions of Power and
       Vulnerability in a Designed World
VAN LIESHOUT Marc & MASSINK Mieke (University of Nijmegen, NL):
       Constructing A Vulnerable Society


Thursday May 20 p.m.: Workshops on Concepts/Health Care/Access Capabilities

       Workshop on Concepts:
LAUFER Romain (HEC Graduate School of Management, F):
       The Social Construction of "Major Risks"
NAULLEAU Daniel (Equipe Informatique et Societe, F):
       The New Vulnerability. Some Ideas to Face it.
YOUNG Lawrence F. (University of Cincinnati, USA):
       A Jurisprudential View of Information Technology (IT)

       Workshop on Health Care:
BAKKER Albert R. (BAZIS Foundation, NL):
       Dependency of Healthcare Organisations on their Information Systems
LOUW Gail (University of Brighton, UK):
       The Use of Web Analysis in the Introduction of Nursing
       Information Systems

       Workshop on Access Capabilities:
DUTTON William (Annenberg School for Communication, USA):
       Electronic Service Delivery and the Inner City:
       Community Workshop Summary
WHITEHOUSE Diane (University of Toronto, CDN):
       I.T. and Disability

Thursday May 20  Late p.m.: Participants are invited to write their
       two best ideas on "sand plates".


Day#2: Friday May 21, 1993  9.30 a.m.: Plenary
BRUNNSTEIN Klaus (University Hamburg, D):
       Paradigms of IT and Inherent Risks
LOBET-MARIS Claire, in collaboration with KUSTERS Benoit, (CITA, University
       of Namur, B):
       Risks and Vulnerability in New Inter-Organizational Systems
OWEN Jenny, BLOOMFIELD Brian & COOMBS Rod (CROMTEC, University of
       Manchester, UK):
       Information Technology in Health Care: Tension and Change
       in the UK National Health Service


Friday May 21 p.m.: Workshops on Health Care/Organisations/Tentative Responses

       Workshop on Health Care:
NGUYEN NAM Tien, PRINTZ Yves, SAADAOUI Sanae & NICOLAY A. (CITA,
       University of Namur, B):
       Benefits and Risks Assessment of Computerized Health Cards:
       A Case Study
SCHOPMAN Joop (University of Innsbruck, A):
       Information Technology's Ideology Makes its Use Risky

       Workshop on Organizations:
NILSSON Peter (Swedish National Audit Bureau, SW):
       How to Reduce IS Risk in the Public Sector? A Survey
ZETTERQVIST S (Church of Sweden Education Centre, SW):
       The Need of Education on Managerial Level for an
       Ideological and Member-Based Organization Due to the
       Change in Legal Requirements and to the I.T.
       Implementation

       Workshop on Tentative Responses:
UNDERWOOD Alan (School of Information Systems, Brisbane, AUS):
       Certification in the Australian I.T. Profession
VAN HOUTTE Paul (CRID, University of Namur, B):
       People Risks Related with Informatic Services
       Professions and Professional Liability Insurance

Friday 21: p.m. during Workshops: Selection, amongst individual ideas
(see "Sand Plates" of Thursday p.m.), of the best "group ideas": groups are
invited to write "Silver Plates".

Friday May 21, 1993: 6.00 p.m. Plenary
The 2nd IFIP-WG9.2 NAMUR AWARD will be granted to Riccardo PETRELLA, Head of
the FAST Programme (CEC, DGXII), for his outstanding contribution with
international impact to the awareness of the social implications of
information technology.
technology.

Friday May 21, 1993 Evening: Conference dinner


Day#3: Saturday May 22, 1993:

Saturday May 22: 9.30 a.m. Plenary
COUMOU C.J. (Computer Security Consultants, NL):
       Using Risk-Analysis as a Tool for Decision Making.
       Experiences from Real Life
HOLVAST Jan (Stichting Waakzaamheid Persoonsregistratie, NL):
       Vulnerability and Privacy: Are We on the Way to a
       Riskless Society?

Saturday a.m., during Workshops: Selection, amongst group ideas (see "Silver
Plates" of Friday p.m.), of the "GOLDEN IDEA"

Saturday May 22 p.m. Plenary: "AGORA":
Presentation and selection of the "GOLDEN PLATE" on "How to face the Challenge
of Risk and Vulnerability in an Information Society?" Recommendations by the
Workshops.


4) Date and Place
The Working Conference will start on Thursday May 20th, 1993 at 9.30 a.m.
(Welcome at 9.00) and end on Saturday 22nd, at 4.00 p.m.

The Conference will take place on the premises of the Facultes Universitaires
Notre-Dame de la Paix, Namur - Belgium.

Participants arriving on Wednesday p.m. will be welcomed at the "Centre de
Rencontres", 53 Rue de Bruxelles, B-5.000 NAMUR (five minutes from the Namur
Railway Station), from 6.00 to 8.00 p.m.


5) Registration
A registration form is included.  Participants are kindly requested to
complete and return it before April 15th at the latest.

The Registration Fee is BEF 5.500: it includes attendance at all conference
sessions, abstracts, coffee-breaks, lunches, cocktails and the conference
dinner. It is to be paid into the account 350-0000001-23 (Banque Bruxelles
Lambert) of the Facultes Universitaires Notre-Dame de la Paix, Namur with
the mention "cpo 9202- IFIP May Conf." The amount must be in Belgian francs,
all bank charges excluded.  Eurocheques and American Express are also
accepted if you prefer those means of payment.  There will be no refund for
cancellations
not received before May 10th, 1993.


6) Accommodation
You may receive a list of hotels from the Conference address (below). As hotel
rooms in Namur are limited, you are well advised to book your hotel room as
soon as possible. The Organizing Committee cannot be held responsible for
difficulties encountered in case of late booking, although we shall do our
best to help you.


7) Conference Address:
For all further information, please contact:
      Jacques BERLEUR, B,
      IFIP-WG9.2 Chairman
      FUNDP, Rue de Bruxelles 61, 5000 Namur, Belgium
      Tel:    +32.81.72.40.00
      Fax:    +32.81.72.40.03
      Email/UUCP:   [email protected]
           /Bitnet: jberleur@bnandp51


8) Programme Committee:
         Jacques BERLEUR, B, Chair
         Colin BEARDON, UK
         Paula GOOSSENS, NL
         Romain LAUFER, F
         Peter NILSSON, Sw
         Ton WESTERDUIN, NL
         Luc WILKIN, B


9) REGISTRATION FORM
    (Please use capital letters):

  First Name:
  Surname:
  Company, Organization:
  Mailing Address:
  City:
  Postal Code:
  Country:
  Phone:                   Fax:
  Email:

  I will attend the Conference
       - date and hour of arrival:
       - date and hour of departure:
  Registration fee
       - paid at FUNDP (cpo 9202)
       - International cheque (to the name of J.BERLEUR)

       Signed:
       Date:

------------------------------

End of RISKS-FORUM Digest 14.40
************************