Subject: RISKS DIGEST 12.66
REPLY-TO: [email protected]

RISKS-LIST: RISKS-FORUM Digest  Tuesday 26 November 1991  Volume 12 : Issue 66

       FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

 Contents:
Pentagon computers vulnerable
DNA Dog Tags (David States)
Risks of hardcoded hexadecimal instead of symbolic constants? (Tom Blinn)
Re: Leaves cause railway signal failure (Geraint Jones)
Re: Termination (David Lamb, anonymous)
Proposed Antivirus Certification (Klaus Brunnstein)
Call for Papers: IFIP World Congress'92/Vulnerability (Klaus Brunnstein)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to [email protected], with relevant, substantive
"Subject:" line.  Others may be ignored!  Contributions will not be ACKed.
The load is too great.  REQUESTS please to [email protected].  For
vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 12, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*<CR>" gives directory; "bye<CR>" logs out.
The COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.
ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues
of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Mon, 25 Nov 91 12:03:44 PST
From: "Peter G. Neumann" <[email protected]>
Subject: Pentagon computers vulnerable

    Pentagon Computers Vulnerable
  DELFT, Netherlands (AP, 21 Nov 91)
  A leading Dutch computer security expert Friday said any computer whiz
around the world "who is a bit clever" can break into a Pentagon computer and
cover his tracks.  Prof. Bob Herschberg, who teaches hacking at the Delft
University of Technology, said the teen-age hackers who allegedly penetrated
U.S. military computers during the Gulf War most likely represent only the tip
of the iceberg of such intrusions.  And he questioned a U.S. congressional
investigation's finding that the hackers that penetrated the Pentagon systems
were Dutch.  "Anyone who is a bit clever can do it using detours such that
their number is untraceable," said Herschberg. "They could have been from
anywhere in the world including the United States itself."  Camouflaging a
hacker's trail is so easy via interlinked global computer networks that an
adept hacker would have to be "naive" not to escape detection, Herschberg said.
  U.S. congressional investigators told a Senate subcommittee this week that a
group of Dutch teen-age hackers broke into U.S. military computers at 34 sites
over about a one-year period ending last May.  The information the hackers
retrieved was described as crucial, but not secret.
  Herschberg acknowledged that there have been instances of Dutch computer
operators breaking into American computer mainframes.  But he called the
allegations of Dutch break-ins in this case "fishy," suggesting it was an
attempt to use the Dutch as a scapegoat since hacking has not been outlawed
here.  Herschberg suggested that American investigators may be trying to cover
up what may be a far more serious problem.  "Why else would they make all this
fuss?" he said.
  Herschberg, a professor of computer science at this nation's top engineering
school, teaches his students hacking techniques as part of a course on computer
security.  He regularly assigns students to break into corporate computer
systems, with prior authorization, to identify security gaps.  "It's a good
practical exercise," he said.
  Initial reports surfaced last April that Dutch hackers had broken into U.S.
defense systems computers via a worldwide computer research retrieval system.
In the wake of those disclosures, an official at Utrecht University, who was
told by students of the intrusion, defended it as a legitimate learning
exercise and said it was up to the U.S. military to take precautions.

------------------------------

Date: Wed, 20 Nov 91 22:39:01 GMT
Apparently-To: [email protected]
Subject: DNA Dog Tags

Army May Issue "DNA Dog Tags"  (Federal Computer Week, 19 Nov 91)

In a world without computers this would be a nice use of biotechnology to
unambiguously identify casualties in the event of disfiguring injury.  In a
world with databases and computers it represents a tremendous potential threat
to personal privacy.

Background:

Although all humans share a common set of genes, if you look closely there are
many small variations (polymorphisms) in our genes.  As a result, we are each
unique.  By taking a sample of DNA and analyzing a set of sites likely to be
polymorphic, it is possible to "fingerprint" an individual and determine with
very good reliability whether another sample of DNA did or did not come from the
same person.  These polymorphisms can also be used to infer familial
relationships (you inherit half of each parent's polymorphisms), and to
map and trace genetic disease genes like cystic fibrosis and sickle cell
anemia.
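
As an illustrative sketch only (the marker panel and allele values below are
invented for illustration, not real forensic data), comparing two samples on a
set of polymorphic markers amounts to counting agreements:

```python
# Illustrative only: compare two DNA samples on a panel of polymorphic
# markers.  Marker names and allele pairs here are hypothetical examples.
def match_fraction(sample_a: dict, sample_b: dict) -> float:
    """Fraction of markers typed in both samples on which they agree."""
    shared = set(sample_a) & set(sample_b)
    if not shared:
        return 0.0
    agree = sum(1 for m in shared if sample_a[m] == sample_b[m])
    return agree / len(shared)

crime_scene = {"D1S80": (18, 24), "TH01": (6, 9.3), "TPOX": (8, 11)}
suspect     = {"D1S80": (18, 24), "TH01": (6, 9.3), "TPOX": (8, 12)}
print(match_fraction(crime_scene, suspect))  # 2 of 3 markers agree
```

A database search is this comparison repeated against every stored genotype,
which is exactly why the error modes of such a search matter.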

When you have given a sample of your DNA, you have no control over how it will
be analyzed.  It could be used to define a set of polymorphic markers which are
otherwise anonymous (unlinked to any genes of known function).  The same sample
could also be used to see if you have or carry genetic diseases.  If the
military builds a database of soldiers' genotypes, there is nothing to prevent
them from including medically important markers as well as identification
information.  On the contrary, there is every reason to expect that they would
want to include as much medical information as possible, because many medical
conditions do impact your ability to function as a soldier.

The risks:

Genetic privacy - Would you be forced to provide your military genotype data
when you applied for health insurance after discharge?  Would the local police
have the right to search the military genotype database every time a DNA sample
(spot of blood, hair follicle, etc.) was found at a crime scene?  How are you
going to protect innocent soldiers against computer errors in that kind of a
search?

It affects people other than the soldier - because your relatives share your
genes, if you find out that you carry a genetic disease, everyone in your
family faces the questions of whether they also carry the gene, should they be
tested, should they screen their children etc.

Inferred paternity - for about 5% of births the father of record is not the
biological father.  As a database of genotypes grew, cases would inevitably
arise where the genotype data demonstrated that the biographical information
being provided was wrong.  How would the military handle this?

We all carry genetic diseases - there is a concept called "genetic load" which
is the number of heterozygous genes (differences in the copies of a gene
inherited from your mother and father) where one of the copies would be lethal
if you got it from both your mother and father.  An average human carries about
6 such genes.  This is why incest is such a universal taboo; if close relatives
have a child, there is a greatly increased risk of its getting two copies of
such a lethal or nearly lethal gene.  As medical science progresses and we
enumerate more and more such genes, the insurance companies will have the
"justification" to demand anyone's genotype as a precondition for health
insurance.  Would
insurance companies or the military have the right to screen and veto
prospective marriage partners?

The ethical implications of genotype databases are complex and potentially
threatening.  It would be a terrible mistake to proceed blindly into this area
without considering the numerous implications.
                                                               David States
National Center for Biotechnology Information / National Library of Medicine

------------------------------

Date: Wed, 27 Nov 91 11:52:26 PST
From: "Dr. Tom @MKO, CMG S/W Mktg, DTN 264-4865" <[email protected]>
Subject: Risks of hardcoded hexadecimal instead of symbolic constants?

Re: "Phone outages expected to be tied to typing mistake" (from The Wall Street
Journal, 25Nov91, p.B4) in RISKS-12.65 (Tuesday 26 November 1991):

When you put together 'DSC officials admitted that three bits of information
in a huge computer program were incorrect' with 'a "6" in a line of computer
code should actually have been a "D"', you reach the inevitable conclusion
that someone was coding in hexadecimal, unless the difference between a "6"
and "D" in some symbolic names just happened, coincidentally, to result in a
binary difference of three bits.
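
The arithmetic is easy to check: hex 6 is binary 0110 and hex D is binary 1101,
and the exclusive-or of the two isolates exactly the flipped bits.  A quick
sketch:

```python
# Verify that the hex digits 6 and D differ in exactly three bit positions.
def bit_difference(a: int, b: int) -> int:
    """Hamming distance: count the bit positions where a and b differ."""
    return bin(a ^ b).count("1")

print(bit_difference(0x6, 0xD))  # 0110 vs 1101 -> 3 bits flipped
```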

It seems highly likely that the use of suitably named symbolic constants in
place of cryptic hexadecimal constants would reduce the likelihood of such
errors.  Of course, many modern languages still make it easy to encode data
using hexadecimal constants, not that using decimal or binary or octal would
likely have avoided this error.
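
As a hedged illustration (the names and values below are invented, not DSC's
actual code), the underlying point is that a one-character slip inside a
literal is silently valid, while a one-character slip in a symbolic name fails
loudly before deployment:

```python
# Hypothetical illustration only -- names and values are invented.
SIGNALING_LINK_CODE = 0xD   # the intended value, defined once, by name

def message_type(code: int = SIGNALING_LINK_CODE) -> int:
    """Callers refer to the name; the hex literal appears exactly once."""
    return code

# Typing 0x6 where 0xD belongs is legal code that fails only in the field;
# misspelling the symbolic name is caught immediately:
try:
    SIGNALING_LINK_C0DE   # misspelled name
except NameError:
    print("typo caught before deployment")
```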
                                Dr. Thomas P. Blinn, Digital Equipment Corp.
Digital Drive -- MKO2-2/F10, Merrimack, New Hampshire 03054   (603) 884-4865

------------------------------

Date: Wed, 13 Nov 91 14:58:59 GMT
From: Geraint Jones <[email protected]>
Subject: Re: Leaves cause railway signal failure (RISKS-12.62)

British Rail's problem with wet fallen leaves and electronic train detection is
not caused by the lightness of the new Networker trains (and so is not fixed by
the /weight/ of older heavier trains).

The problem with the newer units is that they use disc brakes.  That means that
the running surfaces of the wheels only ever touch the rails and the insulating
paste of crushed leaf builds up on the /wheels/.   The problem is therefore not
cured by running track-clearing vehicles.   The (clever) fix employed is to
attach a single clutch-braked vehicle to each of the new trains (in many cases,
this would be a heavier clutch-braked multiple unit, but just a carriage will
do).  That car has clean wheels, makes good electrical contact with the rails
and so makes the train visible.

Modifications to clean the running surfaces of the wheels will probably be the
longer-term fix.

It is a classic systems problem:  who would have thought that changing from
external clutch brakes to better-protected disc brakes would undermine the
signalling system?

------------------------------

Date: 27 Nov 91 23:27:07 GMT
From: [email protected] (David Lamb)
Subject: Re: Termination (Bartelt, RISKS-12.65)

I don't see that the "computer system" makes things significantly different.
I've known of companies whose method of laying someone off was essentially a
Friday pink slip saying "Hand in your badge now, here's the contents of your
desk in this box, don't come back Monday, and here's K (>=2) weeks' pay in lieu
of notice" - which in some jurisdictions at least is considered to satisfy
statutory requirements about due notice.  Computer security concerns might make
such practices more widespread - but if you're going to get paid anyway, why is
it important to be allowed to continue to have access to the company's
property?  I suppose you might have personal files on the company computers,
which complicates things a bit.

 [The COMPUTER-RELATED RISK from the company's viewpoint is that ANY access
 whatever could lead to retributional acts.  On one hand, assuming an employee
 is reliable and responsible, there might be a lot of benefits to allowing
 computer accounts to be cleaned up by the individual in question.  On the
 other hand, ``friendly termination'' may be oxymoronic in many situations...
 PGN]

------------------------------

Date: Wed, 27 Nov 91 12:22:53 XST
From: [anonymous]
Subject: Re: Termination (Bartelt, RISKS-12.65)

A certain medium-sized software vendor recently went through the second set of
layoffs in six months.  How did people find out that they were laid off?  They
came into work Wednesday morning, and found all of the machines shut off.  So,
they waited, and talked to each other, and talked in the halls, until managers
came by and picked people off one by one.  Their accounts had been disabled the
night before, you see, and management didn't want people finding out by not
being able to log in.  (I never said this company was intelligent.)

For the previous set of layoffs, the dial-in modems were shut down for a day
or so, because some of the system administrators were fired.

Is there a RISK in all of this?  I'm not sure.  The firings (excuse me,
``layoffs'') were not done in a friendly manner, as near as I can find out.
Since there were indications a couple of weeks in advance (some of the people I
knew were positive they would be going before they were told to leave), the
precautions were pretty useless in my opinion -- and the treatment of the
employees in question did not seem designed to inspire goodwill.

------------------------------

Date: 22 Nov 91 14:49 +0100
From: Klaus Brunnstein <[email protected]>
Subject: Proposed Antivirus Certification

        Computer-Anti-MalWare Certification. A Proposal

                       Vesselin Bontchev
                     Dr. Klaus Brunnstein
                        Morton Swimmer
          Faculty for Informatics, Virus Test Center,
                     University of Hamburg
            Submitted to: NCSA Antivirus Conference
             Washington, D.C., November 25-26, 1991

Abstract: To assure and enhance the quality of antiviral products, academic,
user and industry organisations (e.g., EICAR, NCSA) should initiate a process
of cooperation and standardization to lead to a process in which a
"certification" service is offered by a volunteer cooperative of interested
parties and organisations (here described as Anti-MalWare Certification
Institutions, AMCI).  It is hoped that this certificate may become an accepted,
respected and expected indicator of quality and function for software and
hardware.  Evaluation shall be based on published methodology and a collection
of malware (short for: malicious software) both known to exist or to be
feasible.

The tasks of AMCIs are described.  Virus Test Center at the University of
Hamburg is undertaking a pilot project to evaluate and describe the
capabilities of existing antiviral products.  Future research will try to
advance the development and understanding of the methodology of antiviral
products, including detection, prevention, repair of damages as well as
side-effects.

1) Foreword:

As problems of malicious software (malware) continue to spread worldwide at a
fast pace (presently more than 10 new viruses per week on IBM-compatible PCs),
enterprises, institutions and organisations find themselves in ever greater
danger of becoming victims of "computer accidents".  Users must rely ever more
on the quality of anti-malware measures, whose producers depend on up-to-date
knowledge of new threats.  With growing numbers and new virus methods, the
"anti-viral gap" (understood as the time gap between detection of a new virus
and the availability of an antiviral product recognising it) will inevitably
also grow (as long as inherently secure and safe architectures are not
available).

To improve the likelihood of success and reduce the potential for damage, we
identify two possible efforts that deserve our increased attention:

    * secured and fast distribution of new malware knowledge
      to all parties with interest in anti-virus production,

    * evaluation and description of the capabilities of available anti-malware
      products by "credible" (and possibly "authoritative") individuals or
      organisations.

Concerns have been raised, to which we intend to give due consideration:

  (1) making (dangerous) knowledge about viral methods available only to
      trusted parties (both in regard to secure communications as in judging
      the intentions and likely actions of the intended recipient);

  (2) ensuring that decisions restricting the flow of  knowledge of  details
      of malware do not result in undesirable  side-effects.

Speedy and effective improvement of anti-malware products and the benefit of
free-market competition are recognized as directly influenced by decisions as to
what information is made available.

2) Mission of "Anti-Malware Certification":

    - To  develop  a process  of  "Anti-Malware  Certification",
      several  independent institutions or individuals shall  be
      asked  (and suitably funded) to perform regular tests  and
      evaluations of anti-malware products or updates.

    - To  inaugurate and assist in such a development,  user  or
      industry organisations with knowledge on malware  problems
      and anti-malware software (e.g., NCSA/USA or EICAR/Europe)
      may  charge  institutions  or  individuals  with  assessed
      knowledge   to perform specific assessments to  assure
      the quality of anti-malware products.

    - Institutions  charged  with  "Anti-Malware  Certification"
      should  not  have commercial interests  in  production  or
      distribution of anti-malware measures.

    - The  test  basis shall be a collection  of  known  malware
      based  upon precise knowledge about any essential  detail,
      the  contents  of which must  be  suitably  published.  To
      minimize the dangers of such a  collection,  state-of-the-art
      security and safety measures shall be applied.

    - Each  submitted  anti-virus product is tested for  its  detection,
      elimination  or  prevention capacity against  the  malware
      databank  under  a published  methodology.  The  test  for
      detection  shall  indicate,  in a form  understandable  to
      users, correct, false and missing diagnoses.

    - To guarantee the quality of the test methods applied and of the secure
      malware collection, "Anti-Malware Certification Institutions" will
      discuss their methods in critical scientific discourse.  Where feasible
      and possible without undue bureaucratization, they may also seek some
      form of certification  by legally established  institutions (e.g.,
      NIST/USA, German Information Security Agency).

    - Generally, test results (protocol, remarks) shall be published as some
      sort of "Anti-Malware User Report"; the organisations supporting the
      certification institutions may publish statistical surveys.  Only in
      the case of individual tests requested by an anti-malware producer are
      results confidential, unless published by the submitter.

    - As independent individuals and academic institutions cannot develop and
      maintain such quality assurance mechanisms (including hardware,
      software, personnel and management), some adequate method of funding
      must be established.  One suggestion is that "Anti-Malware Certification
      Institutions" may charge a fee to cover personal,  managerial and
      machine costs; other suggestions may adapt established consumer report
      and product test procedures.  The adequacy of the financial arrangements
      shall be controlled by public discussion with users, academia and
      industry (possibly via related organisations).

3) Initialisation of the Anti-Malware Certification Process:

Based on the current work of Computer Anti-Virus Research Organisation (CARO),
a collection of annotated trojans and viruses in IBM- and compatible PCs has
been established at the Virus Test Center, University of Hamburg.  A test
methodology is being developed and currently tested, to run antiviral products
against the databank and to diagnose which malware (virus, trojan) is correctly
or incorrectly recognized.

The collection's content will be published periodically (Index of Established
Malware (IBM-PCs); next edition: December 1991). The test methodology (in the
first phase, with a multiplicity of files infected with known file viruses)
will be published when validated with some experience.

A first draft of this document has been initially discussed with the European
Institute for Computer Antivirus Research (EICAR) at its meeting of
chairpersons, on November 18, 1991 in Hamburg.  Following suggestions from this
meeting, Virus Test Center will perform experimental tests and evaluations of
available anti-malware software and report on the results in spring 1992.
After the EICAR meeting, the document was refined; the authors wish
especially to thank Werner Uhrig (Austin, Texas, a major contributor to
Macintosh antiviral activities) for his highly constructive contributions,
which helped to improve this paper.

The authors submit this document to the user and academic public, and to
interested organisations.  In particular, this paper is submitted to the
National Computer Security Association (NCSA/USA) at its first Antivirus
Developers Conference, November 25-26, 1991, in Washington, D.C., for
discussion.  Moreover,
legal aspects of the proposed quality assurance procedure shall also be
discussed with adequate institutions (e.g., NIST/USA, German Information
Security Agency).

4) Future developments:

The next scientific steps will be to assess the reliability of eradication
(especially after multiple infections) as well as of preventive methods such as
checksumming and integrity tools.  Present experience with the shortcomings of
antiviral software shows that basic methods for assessing such eradication and
prevention capabilities are lacking.  To certify deletion and prevention
methods as well, basic research will be needed.

------------------------------

Date: 22 Nov 91 17:11 +0100
From: Klaus Brunnstein <[email protected]>
Subject: Call for Papers: IFIP World Congress'92/Vulnerability

                       Call for Papers
                12th World Computer Congress
         IFIP Congress 92: From Research to Practice
              Madrid/Spain: September 7-11, 1992

              especially for the Congress Stream:
      Diminishing the Vulnerability of Information Society

Overview of the Congress:
This IFIP Congress is composed of topical and interrelated conferences each
organized by a separate subcommittee of the International Program Committee.

  Five parallel streams                     Stream Committee chairman
  ------------------------------------      -------------------------
  Software Development and Maintenance       A.N.Habermann,Pittsburgh
  Algorithms and Efficient Computation        Jan van Leeuwen,Utrecht
  From Architectures to Chips                   Gerald L.Reijns,Delft
  Informatics and Education              Peter Bollerslev, Copenhagen
  Diminishing the Vulnerability of
              the Information Society        Klaus Brunnstein,Hamburg

  Two subconferences:                Subconference Committee chairman
  -------------------------------    --------------------------------
  Expanding the Power of the Personal Computer Friedrich Vogt,Hamburg
  Enhancing the Intelligence
                in Information Systems       Gordon Davis,Minneapolis

The Congress will also include one day workshops, tutorials and an exhibition.
IFIP Congress 92 papers will be published in the conference proceedings
(Elsevier's "Transactions in Informatics" series).

International Program Committee:
Chair: Wilfried Brauer,          Technical University, Munich, Germany
ViceChair:Carlos Delgado Kloos Universidad Politecnica de Madrid,Spain
PastChair: Herve Gallaire                           gsi, Paris, France

Organizing Committee:
Chair: Rosa Alonso           Alcatel Standard Electrica, Madrid, Spain
ViceChair: Jaume Argila                                          Spain
ViceChair: Jose Ignacio Boixo                                    Spain
ViceChair:Fernando Saez Vacas  Universidad Politecnica de Madrid,Spain

                   Special Call for Papers
    Stream: Diminishing the Vulnerability of Information Society

With  worldwide use of Information Technology (IT), new  opportunities
arise but,  likewise,  new risks emerge through growing dependence  on
that same technology.  This means all users become more vulnerable  to
attacks  on and misuse of IT.  New types of computer based crime  have
been reported while the efficient operation of both public and private
enterprises  has  become susceptible  to  malfunction,  deliberate  or
accidental, in the information technology itself.

New  concerns  have arisen and older ones  have  been  enhanced.  Such
concerns include both human and civil rights,  privacy and freedom  of
the individual,  leisure and education,  the roles and design of work,
quality and reliability of the technology, etc. The very existence and
competitiveness of enterprises has  become,  in  many  cases,  totally
dependent  upon the efficiency and reliability of  IT.  Moreover,  the
problem of complexity in contemporary system design may mean that some
systems  are  uncontrollable  by their users and  even  unfamiliar  to
systems experts. At the same time, the overall quality and reliability
of  the  technology plays an important role in  system  selection  and
design.

The Stream "Diminishing the Vulnerability of Information Society" will
attempt to assess the degree of vulnerability to IT that has developed
since the first discussions in the early 1980s.  Moreover, this stream
aims at identifying the ways and means by which this vulnerability may
be  reduced  and  how emerging problems may be solved  in  advance  by
anticipatory action.

Specific areas of interest which may be addressed in submitted papers include:

    - Opportunities and risks in the adoption of Information
           Technology, particularly at international levels, with
           special emphasis on developments in Latin America
    - Social Vulnerability and major Risks
    - Legal Aspects: Reducing Vulnerability through the Law
    - Enhancing IT to meet demands for Reliability and Security,
           with particular emphasis on Personal Computers and
           Local Area Networks
    - Hardware and software systems for identification and
           authentication of users and attached systems
    - Reliability and security in Personal Computers
           and Local Area Networks (LANs)
    - Computer Supported Work: Impact of the Vulnerability of IT
           on groups and organisations in an enterprise
    - Human centered strategies to cope with Vulnerability:
           the role of participation, education, and task design
    - The Electronic Cottage: Delivering Information and Communic-
           ation Technologies at Home: For Better or Worse?
    - Women, Computers and Work
    - Computer Ethics and Professional Responsibility.

Moreover,  short  presentations (posters) describing ongoing  research
projects  are  suggested  esp.  for the following  topics  (or  others
related to the topic):

    - The Electronic Cottage
    - Vulnerability of and through AI Systems
    - Enhancing the Security and Safety of IT, with special focus
           on Electronic Data Interchange (EDI) and Electronic
           Funds Transfer Systems (EFTS)

Invited speakers in the stream:
      Professor Harold Highland       New York/USA
      Professor Lance Hoffman         Washington/USA
      Professor Herbert Kubicek       Bremen/Germany
      Professor Bryan Niblett         Abington/England

Panel sessions on:
      Informatics and development
      Identification and authentication of users and systems
      The Electronic Cottage: How will daily life be affected
      Human, Man, Woman
      Ethics of Computing: Information Technology and
                           professional responsibility

Stream Program Committee:
    Klaus Brunnstein (chair)                   University of Hamburg
    William Caelli     Queensland University of Technology, Brisbane
    Robert R.Moeller                         Sears, Roebuck, Chicago
    Jose Pino                 University of Chile, Santiago de Chile
    Fernando Saez-Vacas               Polytechnic University, Madrid

Information for Authors:
Six (6) copies of a full paper in English (no longer than 4500  words
or 12 double-spaced pages, including figures, with  150 word abstract,
full title, name and affiliation of author(s) as well  as  postal  and
electronic mail addresses, and telephone and fax numbers)  should   be
submitted not later than 10 January 1992 to the Stream's chairman:

              Professor Klaus Brunnstein
              Faculty for Informatics
              University of Hamburg
              Vogt-Koelln-Str.30
              2000 Hamburg 54
              Germany
              email: [email protected]

All papers will be reviewed by at least three referees; relevance, originality
and clarity will be considered.  Accepted papers will be published in full in
the Conference Proceedings.

How to Submit a Poster: Three (3) copies of a one page abstract for a 10 minute
presentation should be sent to the appropriate subcommittee chairman so as to
arrive by April 15, 1992. The poster proposal will be judged for relevance and
clarity. Acceptance/rejection will be notified by May 15, 1992.  The final
version of the abstract has to be sent to the organizing committee for
inclusion into the poster brochure so as to arrive by June 20, 1992.

Key Dates:
   January 10,1992: Deadline for submission of papers
      March 9,1992: Notification of acceptance/rejection of papers
     April 15,1992: Deadline for submission of posters
     April 24,1992: Camera ready paper at Program Committee
       May 15,1992: Notification of acceptance/rejection of posters
      June 20,1992: Camera ready poster at Organizing Committee
September 7-11,1992: World Computer Congress, Madrid

For more details, please contact:
        FESI (Federacion Espanola de Sociedades de Informatica)
        IFIP Congress '92
        Hortaleza 104
        E-28004 Madrid, Spain
        Fax: (+34-1) 2431003
        E-mail: [email protected]

------------------------------

End of RISKS-FORUM Digest 12.66
************************