RISKS-LIST: RISKS-FORUM Digest  Weds 18 December 1991  Volume 12 : Issue BOWEN

       FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

----------------------------------------------------------------------

Date: Wed, 4 Dec 91 16:54:37 GMT
From: [email protected]
Subject: Software safety, formal methods and standards

Peter,

Below is an edited version of the messages I received back from my request
in this area. I have posted this to the comp.software-eng and
comp.specification newsgroups. This is probably too long for comp.risks,
but perhaps it would be worth including a pointer to these two newsgroups.

I hope you had a successful SIGSOFT conference. Will you be summarising
what happened in comp.risks? I think this would be a useful service if
you have the time.

Jonathan Bowen, PRG, Oxford.

Newsgroups: comp.software-eng,comp.specification
Subject: Software safety, formal methods and standards

Here is an edited (!) version of the messages I received back from my
request for information in this area. I have only included messages I
think are most relevant and have removed some sections of included
messages (indicated by "..." below).

Thank you for your responses. They have certainly been useful, and
acknowledgements will be included where appropriate.

Jonathan Bowen, Oxford University Computing Laboratory, UK.

------------------------------

Original message (RISKS-12.60, 6 November 1991):

From: [email protected] (Jonathan Bowen)
Subject: Software safety, formal methods and standards
Date: 6 Nov 91 12:29:45 GMT
Reply-To: [email protected] (Jonathan Bowen)
Organization: Programming Research Group, Oxford University, UK

I am writing a review paper on software safety, formal methods and
standards. In particular, I am looking at the recommendations for the
use of formal methods in software safety standards (and draft
standards). So far I have considered the standards (some of them drafts) listed
below. If anyone has any recommendations of others to consider, please
send me details of how to obtain them (or the document itself if
possible!).

I am also interested in general references in this area. I have
quite a few already, but I may have missed some. If they are
obscure and you can send me the paper itself, so much the better.

If there is enough interest, I will summarize the responses.

--
Jonathan Bowen
Oxford University Computing Laboratory, Programming Research Group
11 Keble Road, Oxford OX1 3QD, England.
Tel:     +44-865-272574 (direct) or 273840 (secretary)
FAX:     +44-865-272582 (direct) or 273839 (general)
Email:   [email protected]

------------------------------

ESA software engineering standards,
European Space Agency, 8-10 rue Mario-Nikis, 75738 Paris Cedex, France,
ESA PSS-05-0 Issue 2, February 1991.

Programmable Electronic Systems in Safety Related Applications:
1. An Introductory Guide,
Health and Safety Executive,
HMSO, Publications Centre, PO Box 276, London SW8 5DT, UK, 1987.

Programmable Electronic Systems in Safety Related Applications:
2. General Technical Guidelines,
Health and Safety Executive,
HMSO, Publications Centre, PO Box 276, London SW8 5DT, UK, 1987.

Software for computers in the application of
industrial safety related systems,
International Electrotechnical Commission,
Technical Committee no. 65, 1989. (BS89/33006DC)

Functional safety of programmable electronic systems: Generic aspects,
International Electrotechnical Commission,
Technical Committee no. 65, 1989. (BS89/33005DC)

Standard for software safety plans,
Preliminary - subject to revision,
P1228, Software Safety Plans Working Group,
Software Engineering Standards Subcommittee,
IEEE Computer Society, USA, July 1991.

The Procurement of Safety Critical Software in Defence Equipment
(Part 1: Requirements, Part 2: Guidance),
Interim Defence Standard 00-55, Issue 1,
Ministry of Defence, Directorate of Standardization,
Kentigern House, 65 Brown Street, Glasgow G2 8EX, UK, 5 April 1991.

Hazard Analysis and Safety Classification of the Computer and
Programmable Electronic System Elements of Defence Equipment,
Interim Defence Standard 00-56, Issue 1,
Ministry of Defence, Directorate of Standardization,
Kentigern House, 65 Brown Street, Glasgow G2 8EX, UK, 5 April 1991.

Safety related software for railway signalling,
BRB/LU Ltd/RIA technical specification no. 23,
Consultative Document, Railway Industry Association,
6 Buckingham Gate, London SW1E 6JP, UK, 1991.

--
Jonathan Bowen, <[email protected]>
Oxford University Computing Laboratory.

>From [email protected] Wed Nov  6 18:41:43 1991
Message-Id: <[email protected]>
Date: Wed, 6 Nov 91 10:45:28 -0500
From: "Ben L. Di Vito" <[email protected]>
To: Jonathan Bowen <Jonathan.Bowen@prg>
In-Reply-To: [email protected]'s message of 6 Nov 91 12:29:45 GMT
Subject: Software safety, formal methods and standards
Reply-To: [email protected]

In the US, software for civilian aircraft is certified in accordance
with the DO-178A "standard" published by a professional organization
known as RTCA:

 Software Considerations in Airborne Systems and Equipment Certification
 DO-178A     March, 1985

 Radio Technical Commission for Aeronautics
 One McPherson Square
 1425 K Street N.W., Suite 500
 Washington, D.C. 20005

Technically, this is only a "guideline" rather than a standard, but in
practice the FAA tends to view it as having the weight of a standard.
It states various requirements that should be met for software to be
certified at three degrees of criticality.

DO-178B is a revision currently being worked out in committee.  It will
probably hit the streets in about a year.  George Finelli of NASA Langley
Research Center is on the RTCA committee.  He and I have been trying to
insert some material on formal methods into the revised document (DO-178A
does not explicitly recognize formal methods as part of accepted practice).
Right now it looks like we will be able to get in one page of very
general guidelines on using formal methods in a section on alternate
means of compliance.  Basically it suggests how one might be able to
trade off formal methods against traditional kinds of verification to
earn "certification credit."  Unfortunately, it will not go nearly as
far as 00-55 in embracing formal methods overall.

If you would like to include something on DO-178B in your paper, check
back with us in a few weeks.  I'm not sure what can be said right now
since this material I mentioned has not yet been approved by the full
committee.

Other standards in the US may also be of interest to you.  You may have
heard of the Trusted Computer System Evaluation Criteria published by
the DoD.  It is primarily concerned with computer security, but that
community has been the leading proponent of applied formal methods
here for over ten years.  I have also heard of some proposed standards
to be issued by the National Institute of Standards and Technology
(NIST), formerly National Bureau of Standards (NBS).  I think these
are still too embryonic to talk about, though.

..

   Ben Di Vito
   Vigyan, Inc.                       [email protected]
   MS 130
   NASA Langley Research Center       (804) 864-4883
   Hampton, VA 23665   USA


From steph%[email protected] Wed Nov  6 21:29:54 1991
To: Jonathan.Bowen@prg
Subject: Re: Software safety, formal methods and standards
References: <[email protected]>
Date: Wed, 06 Nov 91 11:27:05 -0800
Message-Id: <[email protected]>
From: [email protected]
Sender: steph%[email protected]

..

You might want to look at these:

ISO 9000 (sorry, forgot the title)

I know the FDA (US Food and Drug Administration) has come up with some
new guidelines recently too.

-- Good luck,
  Stephanie Aha   [email protected]
  Information and Computer Science
  U.C. Irvine,  Irvine, CA 92717 USA

>From [email protected] Thu Nov  7 01:20:25 1991
         Thu, 7 Nov 1991 00:08:06 +0000
Date: Wed, 6 Nov 91 16:07:19 PST
From: [email protected] (Douglas L. Bryan)
Message-Id: <[email protected]>
To: Jonathan.Bowen@prg
Subject: Re: Software safety, formal methods and standards
Sender: [email protected]

``Proposed Standard for Software for Computers in the Safety Systems
of Nuclear Power Stations'', (based on IEC Standard 880),
Final Report for contract 2.117.1 for the Atomic Energy Control Board,
March 1991.

by David L. Parnas, TRIO, Computer and Information Science, Queen's
University, Kingston, Ontario K7L 3N6 Canada.

..

I don't know what 2.117.1 or IEC mean, and I don't know if it's
been published; I received a copy from a friend months ago.

doug

From barriost%[email protected] Thu Nov  7 23:21:32 1991
Message-Id: <[email protected]>
Subject: Re: Software safety, formal methods and standards
To: Jonathan.Bowen@prg
Date: Thu, 7 Nov 91 8:56:54 MST
Cc: [email protected]
Sender: barriost%[email protected]

..

A good friend of mine, Scott Sheppard, wrote his master's thesis on
the subject of Software Safety (in 1989?).  It would be available from
the Arizona State University library.  Or, if you'd like to contact
Scott, his email address is:
 uunet!ithaca!scott

--
Tim Barrios, AG Communication Systems, Phoenix, AZ
UUCP: ...!{ncar!noao!asuvax | uunet!samsung!romed!asuvax | att}!gtephx!barriost
Internet: [email protected]
voice: (602) 582-7101        fax:   (602) 581-4022

From ramu%[email protected] Fri Nov  8 03:24:40 1991
From: [email protected]
Date: Wed, 6 Nov 91 16:30:10 CST
Message-Id: <[email protected]>
To: Jonathan.Bowen@prg
Cc: [email protected]
Subject: Re: Software safety, formal methods and standards
References: <[email protected]>
Sender: ramu%[email protected]

..

Hi!  You may want to look at the NIST document on High Integrity Software.
Details about obtaining it are at the end of this message.  This is a very
comprehensive report that discusses software safety and hazard analysis,
and it includes some references.  The report is available via an
archive-server.

I think you've compiled a very useful list of software safety references.
Could you send me more information about obtaining the following
document?  An email address should suffice.

>> Standard for software safety plans,
>> Preliminary - subject to revision,
>> P1228, Software Safety Plans Working Group,
>> Software Engineering Standards Subcommittee,
>> IEEE Computer Society, USA, July 1991.

Another possibly useful reference would be in a journal called
"Journal of Safety and Reliability"  -- I remember that one of the
issues, concentrating on software safety, was published as a book.
I don't remember the exact year, but a trip to the library should
solve the problem via browsing.

Here's the information about the NIST document:

From: [email protected] (Rick Kuhn)
Subject: Report on Assurance of High Integrity Software
Message-ID: <[email protected]>
Date: 4 Oct 91 18:09:42 GMT
Sender: [email protected]
Followup-To: comp.software-eng
Organization: NIST
Lines: 37

Assurance of High Integrity Software - report available

The need for dependable software has resulted in the production of a
variety of standards:  the Trusted Computer System Evaluation
Criteria ("Orange Book"), the British MoD 00-55, the DO-178A standard
for civil aviation, the IEC 880 standard for the nuclear industry, and
others.  Because of technical, economic, and political considerations,
these standards approach the question of assurance from a variety of
viewpoints.  There is much disagreement over how dependable software
can be produced.  The controversy over MoD 00-55, with its requirement
for formal methods and deprecated programming practices, is a recent
example.

To address the question of assuring the trustworthiness and integrity
of software, and what assurances should be required in standards, the
National Institute of Standards and Technology brought together experts
from industry, academia, and government in a Workshop on the Assurance
of High Integrity Software in January.  The report is now available for
electronic distribution.  (It will soon be available from the Govt.
Printing Office in paper form.) The report can be obtained from our
mail server.  Both Postscript and troff formats are available.  Send a
message containing ONE of the following requests to [email protected]:


       send ahisrptp               /* for Postscript */

       send ahisrptt               /* for troff */


The report will be delivered as three (troff) or 16 (postscript) email
messages.  Remove the headers and concatenate the files, then unpack
them using either 'unshar' or the UNIX shell 'sh'.  (Instructions
included in the files.)
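For those reassembling the parts by hand, the "remove the headers and
concatenate" step can be sketched in a few lines of Python.  This is an
illustrative sketch only: the message texts and addresses below are made
up, and the real parts would be the 3 (troff) or 16 (Postscript) messages
from the mail server.

```python
# Sketch of the "remove the headers and concatenate the files" step.
# The saved message parts here are invented for illustration.

def strip_header(message: str) -> str:
    # An RFC-822 message header ends at the first blank line; keep only
    # the body that follows it.
    _header, _sep, body = message.partition("\n\n")
    return body

parts = [
    "From: [email protected]\nSubject: part 1\n\nline A\n",
    "From: [email protected]\nSubject: part 2\n\nline B\n",
]

# Concatenating the stripped bodies yields the shell archive, ready to
# be unpacked with 'unshar' or the UNIX shell 'sh'.
archive = "".join(strip_header(p) for p in parts)
print(archive, end="")
```

The resulting text is the shell archive the instructions refer to; feeding
it to 'sh' recreates the report files.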

Hope this helps.

--Ramu Iyer

[email protected]

>From [email protected] Fri Nov  8 05:15:41 1991
Date: Thu, 7 Nov 1991 21:48:32 -0700
From: Steve Emmerson <[email protected]>
Message-Id: <[email protected]>
To: Jonathan.Bowen@prg
Subject: Re: Software safety, formal methods and standards [via Jim Horning]
References: <[email protected]>
Organization: UCAR/Unidata

..

(From my files:)

From comp.risks Tue Sep  4 20:05:34 1990
Date: Fri, 31 Aug 90 18:08:40 -0700
From: Nancy Leveson <[email protected]>
Subject: What is "safety-critical"?
Reply-To: [email protected]

RISKS-10.28 was certainly provocative.  I do not want to enter into the basic
argument, but I would like to comment on a couple of statements that appear
incorrect to me.

Peter Mellor writes:
  >Safety-*related*, I would say, but not safety-*critical*. The usual
  >definition of safety-critical is that a system failure results in
  >catastrophe.

The system safety engineering textbooks I have read do not define
"safety-critical" in terms of catastrophic failure.  They use "hazards"
and the ability to contribute to a hazard.  A safety-critical system or
subsystem is one that can contribute to the system getting into a hazardous
state.  There are very good reasons for this, which I will attempt to explain.

Accidents do not usually have single causes -- most are multi-factorial.
We usually eliminate the simpler accident potentials from our systems which
only leaves multi-failure accidents.   The basic system safety goal
is to eliminate all single-point failures that could lead to unacceptable
consequences and minimize the probability of accidents caused by multi-point
failures.  Using Pete's definition, there are almost NO systems that are
safety-critical, including nuclear power plant and air traffic control systems,
because we rarely build these systems so that a single failure of a single
component can lead to an accident.  That the failure of the whole nuclear power
plant or air traffic "system" is an accident is tautological -- what we are
really talking about are failures of components or subsystems of these larger
"systems" like the control components or the simulators (in this case).

The term "safety-related" seems unfortunate to me because it is too vague
to be defined.  Some U.S. standards talk about "first-level interfaces"
(actions can directly contribute to a system hazard) or "second-level"
(actions can adversely affect the operation of another component that can
directly contribute to a hazard).  Another way of handling this is to talk
about likelihood of contributing to a hazard -- with the longer chains having
a lower likelihood.  But I would consider them all safety-critical if their
behavior can contribute to a hazard -- it is only the magnitude of the risk
that differs.  I have a feeling that the term "safety-related" is often used
to abdicate or lessen responsibility.

There is a very practical reason for using hazards (states which in
combination with particular environmental conditions have an unacceptable risk
of leading to an accident).  In most complex systems, the designer has no
control over many of the conditions that could lead to an accident.  For
example, air traffic control hazards are stated in terms of minimum separation
between aircraft (or near misses) rather than in terms of eliminating
collisions (which, of course, is the ultimate goal).  The reason is that
whether a collision occurs depends not only on close proximity of the
aircraft but also partly on pilot alertness, luck, weather, and a lot of
other things that are not under the control of the designer of the system.
So what can the designer do to reduce risk?  She can control only the
proximity hazard and that is what she does, i.e., assumes that the environment
is in the worst plausible conditions (bad weather, pilot daydreaming) and
attempts to keep the planes separated by at least 500 feet (or whatever, the
exact requirements depend on the circumstances).

When assessing the risk of the air traffic control system, the likelihood
of the hazard, i.e., violation of minimum separation standards, is assessed,
not the likelihood of an ATC-caused accident.  When one later does a complete
system safety assessment, the risk involved in this hazard along with the
risk of the other contributory factors are combined into the risk of a
collision.

Why does this push my hot button?  Well, I cannot tell you how many times
I have been told that X system is not safety-critical because its failure
will not cause an accident.  For example, a man from MITRE and the FAA argued
with me that the U.S. air traffic control software is not safety critical (and
therefore does not require any special treatment) because there is no possible
failure of the software that could cause an accident.  They argued that the
software merely gives information to the controller who is the one who gives
the pilot directions.  If the software provides the wrong information to
the controller, well there was always the chance that the controller
or the pilot could have determined that it was wrong somehow.  But an ATC
software failure does not necessarily result in a catastrophe so it is not
safety critical (as defined above).  Perhaps this argument appeals to you,
but as a person who flies a lot, I am not thrilled by it.  As I said,
this argument can be applied to EVERY system and thus, by the above
definition, NO systems are safety-critical.

Again, there may be disagreement, but I think the argument has been pushed
to its absurd limit in commercial avionic software.  The argument has been
made by some to the FAA (and it is in DO-178A and will probably be in
DO-178B) that reduction in criticality of software be allowed on the
basis of the use of a protective device.  That is, if you put in a backup
system for autoland software, then the autoland system itself is not safety
critical and need not be as thoroughly tested.  (One could argue similarly
that the backup system is also not safety-critical since its failure alone
will not cause an accident -- both of them must fail -- and therefore neither
needs to be tested thoroughly.  This argument is fine when you can accurately
assess reliability and thus can accurately combine the probabilities.  But
we have no measures for software reliability that provide adequate confidence
at the levels required, as Pete says).  The major reason for some to support
allowing this reduction is that they want to argue that the use of n-version
software reduces the criticality of the function provided by that software
(none of the versions is safety critical because a failure in one alone will
not lead to an accident) and therefore required testing and other quality
assurance procedures for the software can be reduced.
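The probability reasoning in the parenthetical above can be made concrete
with a small numerical sketch.  Every failure rate below is an invented
number, used only to illustrate the point: the "neither channel is
safety-critical" argument rests on an independence assumption that cannot
be validated for software.

```python
# Illustrative sketch only: every probability below is an invented number.

p_primary = 1e-4   # hypothetical chance the autoland software fails on a demand
p_backup  = 1e-4   # hypothetical chance the backup fails on the same demand

# The reduced-criticality argument implicitly assumes independent failures,
# so the chance that both fail is the product -- which looks very safe:
p_both_independent = p_primary * p_backup          # about 1e-8

# But a common-cause failure mode (a shared specification error, the same
# unanticipated input) adds a correlated term that dominates the product:
p_common_cause = 1e-5                              # hypothetical correlated rate
p_both_correlated = p_common_cause + p_primary * p_backup   # about 1e-5

# Under even modest correlation the pair is roughly a thousand times less
# reliable than the independence assumption suggests -- and we have no
# software reliability measures trustworthy enough to rule correlation out.
ratio = p_both_correlated / p_both_independent
```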

  >With a safety-critical airborne system, we are certifying that it has
  >a certain maximum probability of crashing the aircraft.

Actually, you are more likely to be certifying that it will not get into
(or has a maximum probability of getting into) a hazardous state from which
the pilot or a backup system cannot recover or has an unacceptably low
probability of recovering.  The distinction is subtle, but important as I
argued above.  Few airborne systems have the capacity to crash the aircraft
by themselves (although we are heading in this direction and some do
exist -- which violates basic system safety design principles).

  >RTCA/DO-178A ('Software Considerations in Airborne Systems and Equipment
  >Certification') does *not* define any reliability figures. It is merely a
  >set of guidelines defining quality assurance practices for software.
  >Regulatory approval depends upon the developer supplying certain documents
  >(e.g., software test plan, procedures and results) to show that the
  >guidelines have been followed.

There is currently an effort to produce a DO-178B.  This will go farther than
178A does.  For one thing, it is likely to include the use of formal methods.
Second, it will almost certainly require hazard-analysis procedures.  If anyone
is interested in attending these meetings and participating, they are open
to the public and to all nationalities.

From Pete Mellor's description of the Forrester and Morrison article:
   >This is likely to change following the publication of the draft defence
   >standard 00-55 by the MoD in the UK in May 1989. The DoD in the US are
   >not doing anything similar, though the International Civil Aviation
   >Organisation is planning to go to formal methods.

Depends on what you mean by similar.  The DoD, in Mil-Std-882B (System Safety)
has required formal analysis of software safety since the early 80s.  MoD draft
defense standard 00-56 requires less than the equivalent DoD standard.  There
has been a DoD standard called "Software Nuclear Safety" that has been in force
for nuclear systems for about 4-5 years.  And there are other standards
requiring software safety analysis (e.g., for Air Force missile systems) that
were supplanted by 882B.  882B is likely to be replaced by 882C soon -- and the
accompanying handbooks, etc. do mention the use of formal methods to accomplish
the software safety analysis tasks.
                                                      nancy

>From [email protected] Sat Nov  9 02:40:00 1991
Date: Thu, 7 Nov 91 09:28:27 MST
From: Al Stavely <[email protected]>
Message-Id: <[email protected]>
To: Jonathan.Bowen@prg
Subject: Re: Software safety, formal methods and standards
In-Reply-To: <[email protected]>
Organization: New Mexico Tech, USA

This was posted some time ago.  I haven't been able to find descriptions
of the standards mentioned in any literature available to me -- if you
find any, I'd like to hear about them!

------------------------------

>From: [email protected] (Chris Holt)
>Newsgroups: comp.software-eng
>Subject: Re: Formal Specifications Required?
>Message-ID: <[email protected]>
>Date: 25 Sep 91 17:57:28 GMT
>
>[email protected] (Iain Woolf) writes:
>>In article <[email protected]> [email protected] (Ken Collier) writes:
>>>
>>>I recently heard an unsubstantiated rumor that Britain is requiring that
>>>formal methods be used in the specification of all software developed for use
>>>by its government agencies.  Can anyone substantiate (clarify, etc.) or refute
>>>this rumor.  It seems to me that it has some very interesting and serious
>>>ramifications.  Anyone?
>
>>Most work done for Government and MOD will only be contracted to companies
>>who have BS5750 - whether this is official or just an official (or even
>>unofficial) recommendation, I don't know.
>
>Also, 0055 and 0056 look as though they'll want some pretty
>heavy formal methods (though there is a clause that "rigorous
>argument" will be good enough, whatever that means).
>
> [email protected]      Computing Lab, U of Newcastle upon Tyne, UK
> "Where lies the land to which the ship would go?  Far, far ahead..." - AHC
--

       - Allan Stavely, New Mexico Tech, USA

>From [email protected] Sun Nov 10 01:11:01 1991
         Sun, 10 Nov 1991 01:10:26 +0000
Date: Sat 9 Nov 91 11:28:18-PST
From: John Rushby <[email protected]>
Subject: Safety standards
To: Jonathan.Bowen@prg
Cc: [email protected]
Message-Id: <[email protected]>
Mail-System-Version: <SUN-MM(229)+TOPSLIB(128)@csl.sri.com>
Sender: [email protected]

Jonathan,

There are two other standards you might want to check out.  One is a
recent draft Canadian standard for safety-critical nuclear (power station)
systems drawn up by David Parnas.  I have a copy of that somewhere.

The other, and by far the most important, is the revision to DO178A
(to be called DO178B), which is the standard governing computerized
control systems in (US) commercial aircraft.  There is nothing
published on this, and the discussion process is on-going, so you'll
need to talk to an insider to find out what's up.  Mike DeWalt (FAA)
and Jim McWha (chief of flight controls for the 777--next big Boeing:
FBW twin) will be at Sigsoft91, so that would be a good excuse for
attending.

John

------------------------------

>From [email protected] Tue Nov 12 22:19:47 1991
Date: Tue, 12 Nov 91 09:41:55 EST
From: David Parnas <[email protected]>
Subject: Re:  Software safety, formal methods and standards
To: Jonathan.Bowen@prg
Message-Id: <[email protected]>

..  The AECB draft standard can only be obtained from the AECB.

You might write to Dr. Kurt Asmis.
His email is [email protected]

Prof. David Lorge Parnas
Communications Research Laboratory
Department of Electrical and Computer Engineering
McMaster University,
Hamilton, Ontario  Canada L8S 4K1

Telephone: 416 525 9140 Ext. 7353
Telefax:   416 521 2922

email: [email protected]

>From [email protected] Tue Nov 12 22:25:23 1991
Message-Id: <[email protected]>
To: Jim Horning <[email protected]>
Cc: Jonathan.Bowen@prg
Subject: Re: I'll bet RISKS readers could help here...
In-Reply-To: Your message of "Mon, 11 Nov 91 10:34:35 -0800."
Date: Mon, 11 Nov 91 10:55:15 -0800
From: [email protected]

I was involved with DO178B in the beginning, i.e., attended some meetings
and wrote some minority views up (I was usually in the minority :-)).
However, I got busy with TCAS and attending all those meetings and have
not had time to participate.  Frankly, I think the whole aircraft
certification attitude toward safety is pretty bad -- but historically
the commercial aircraft industry has never been interested in much of
anything other than reliability.

I am currently writing a book on system/software safety while on sabbatical
and will be including a chapter (and maybe more) on regulation and standards
issues.  Personally I find them very dangerous and try to stay away from
them.  Ethically such efforts bother me as they tend to be used by manufacturers
to limit their liability and their own efforts.  This is particularly true
with DO178A and B.  Standards that specify exactly what a manufacturer must
do also provide them with a list of things that they will do and no more.
And how well they do even the things specified is usually not included.
I much prefer standards that tell manufacturers what their goals should be
and then provide mechanisms to approve their plans, procedures, and success.

In general, I believe that standards that take away responsibility and
accountability from the manufacturer will lead to decreased safety.
Specifying particular techniques to use takes away that responsibility
and accountability.

nancy

From martinc%[email protected] Wed Nov 13 02:49:02 1991
Date: Mon, 11 Nov 91 11:44:26 -0500
From: "Charles R. Martin" <[email protected]>
Message-Id: <[email protected]>
To: Jonathan.Bowen@prg
In-Reply-To: <[email protected]>
Subject: Re: Software safety, formal methods and standards
Sender: martinc%[email protected]

[email protected] writes:
> Charles,
>
> Thanks for the contacts. I've read quite a lot of Nancy Leveson's papers
> (and know that she is a leading light in this area). What is John McHugh's
> contribution to the field?
>

He has been dealing with safety to some extent as part of a more
general definition of trust; he's also been doing a safety
tutorial.  I'm not keeping up with him on that, because we've
been working together on security modeling stuff, but I know he's
doing some.

..

From Debra_Sparkman.ADD%[email protected] Thu Nov 14 05:19:18 1991
Message-Id: <[email protected]>
Date: 12 Nov 91 12:48:32
From: Debra Sparkman <[email protected]>
Subject: Software Safety, Formal Met
To: Jonathan.Bowen@prg
Cc: [email protected]
Sender: Debra_Sparkman.ADD%[email protected]

                      Subject:                               Time:12:31 PM
 OFFICE MEMO          Software Safety, Formal Methods,_      Date:11/12/91
Jonathan,

My project is also reviewing standards and the state-of-the-practice in
software safety, reliability, formal methods and life cycle processes as part
of a larger project.

Your list seemed almost identical to mine.  However, I have ISO 8807, a
standard dealing with the formal method LOTOS.  If you are interested, the
complete title
is: "Information processing systems - Open Systems Interconnection - LOTOS - A
formal description technique based on the temporal ordering of observational
behaviour".

..
I've also attached a list of articles and publications I have (or am getting)
on software safety and formal methods.  I apologize for the format: I dumped
it from a database and the field separators didn't survive.  If you cannot
find any of them, please let me know and I will send them to you.

..

Debra Sparkman
Lawrence Livermore National Laboratory
P. O. Box 808 L-307
Livermore, CA 94550
USA
Tele: (510) 422-1855
FAX: (510) 422-3194

Author  Title   Issued  Issue or Publisher
Bennett Forwards to safety standards    Mar 1991        Software Eng. Journal
Bjorner, Jones  Formal Specification & Software Development     1984    Prentice-Hall
Bjorner, Jones  VDM - A Formal Method at Work - Experience with VDM in Norsk
Data    Mar 1987        Springer-Verlag
Clements, Parnas        A Rational Design Process: How And Why To Fake It       Feb 1986
IEEE Trans. on Software Engineering
Craigen FM89: Assessment of Formal Methods for Trustworthy Computer Systems
1990    Proceedings of the 12th International Conference on Software Engineering
Craigen Tool Support for Formal Methods 1991    IEEE
Craigen, Ryan   FM91: Formal Methods Workshop   1991    IEEE
Craigen, Summerskill    Formal Methods for Trustworthy Computer Systems - FM89
1990    Springer-Verlag
Fraser, Kumar, Vaishnavi        Informal and Formal Requirements Specification
Languages: Bridging the Gap     May 1991        IEEE Trans. on Software Engineering
Gabrielian, Franklin    Multilevel Specification of Real-Time Systems   May 1991
Communications of the ACM
Gerhart Applications of Formal Methods: Developing Virtuoso Software    Sep 1990
IEEE Software
Gerhart Formal Methods: An International Perspective    1991    IEEE
Gerhart Preliminary Summary: FM89 Assessment of Formal Methods for Trustworthy
Computer Systems        Dec 1989        ACM SIGSOFT
Gong, Wing      Raw Code, Specification, and Proof of the Avalon Queue Example  Aug
1989    Carnegie Mellon University - Computer Science #CMU-CS-89-172
Guttag, Horning Introduction to LCL, A Larch/C Interface Language       Jul 1991
Digital Systems Research Center
Guttag, Horning, Wing   Larch in Five Easy Pieces       Jul 1985        Techincal Report 5,
DEC Systems Research Center
Guttag, Horning, Wing   The Larch Family of Specification Languages     Sep 1985        IEEE
Software
Hall    Seven Myths of Formal Methods   Sep 1990        IEEE Software
Harel   Statecharts: A Visual Formalism for Complex Systems     Aug 1987        Sciece of
Computer Programming
Heninger        Specifying Software Requirements for Complex Systems:  New Techniques
and their Application   Jan 1980        IEEE Trans. on Software Engineering
Hill, Robinson, Stokes  Safety-Critical Software in Control Systems     Nov 1989        IEC
Proceedings of Computers and Safety, Nov 8-10, 1989
Jones   LM3: A Larch Interface Language for Modula-3, A Definition and
Introduction, Version 1.0       Jun 1991        Digital Systems Research Center
Jones   Systematic Software Development Using VDM       1986    Prentice-Hall
Jones, Bjorner  Formal Specification & Software Development     1982    Prentice-Hall
Lafontaine, Ledru, Schobbens    An Experiment in Formal Software Development:
Using the B Theorem Prover on a VDM Case Study  May 1991        Communications of the
ACM
Leveson Evaluation of Software Safety   1990    IEEE
McDermid, Thewlis       Safety-critical systems - Editorial     Mar 1991        Software Eng.
Journal
Milner  A Calculus of Communicating Systems     1986
Nix, Collins    The Use of Software Engineering, Including the Z Notation, in
Development of CICS     1988    Quality Assurance pg 102-110
O'Neill VDM development with Ada as the target language 1988    Lecture Notes in
Computer Science
O'Neill, Clutterbuck, Farrow, Summers, Dolman   The Formal Verification of
Safety-Critical Assembly Code   1988    Proceedings of IFAC SAFECOMP Conference,
Pergamon Press
Parnas, Clements, Weiss The Modular Structure of Complex Systems        Mar 1984
Proceedings on the Seventh International Conference on Software
Engineering/IEEE Trans. on Software Engineering Mar 1985
Pedersen, Klein Using the Vienna Development Method (VDM) To Formalize a
Communication Protocol  Nov 1988        Carnegie-Mellon/Software Engineering
Institute/Defense Technical Information Center
Rollins, Wing   Specifications as Search Keys for Software Libraries    Jun 1990
Proceedings of the International Conference on Logic Programming - Paris,
France
Spivey  Introducing Z: A Specification Language and its Formal Semantics
1988    Cambridge University Press
Spivey  The Z Notation: A Reference Manual      1987    Prentice Hall
Turner, various Formal Description Techniques   1989    North-Holland, Amsterdam
unknown Software Safety: Two Approaches unknown via Dennis Lawrence
various Formal Description Techniques-Proceedings for the First International
Conference on FDT       Sep 1988        North-Holland
Wing    A Larch Specification of the Library Problem    Dec 1986        Carnegie Mellon
University - Computer Science #CMU-CS-86-168
Wing    A specifier's introduction to formal methods    Sep 1990        IEEE Computer
Wing    Specifying Avalon Objects in Larch      Dec 1988        Carnegie Mellon University -
Computer Science #CMU-CS-88-208
Wing    Using Larch to Specify Avalon/C++ Objects       Sep 1990        IEEE Transactions on
Software Engineering
Wing    Writing Larch Interface Language Specifications Jan 1987        ACM Transactions
on Programming Languages and Systems
Wing, Gong      Experience with the Larch Prover        Sep 1989        Jeannette Wing


>From [email protected] Thu Nov 14 22:03:15 1991
Date: Thu, 14 Nov 91 12:11:56 EST
From: (Jim Pyra) jpyra <[email protected]>
Message-Id: <[email protected]>
To: Jonathan.Bowen@prg
Subject: Safety Standards

Jonathan:

    You may be interested in the proposed standard for
"SOFTWARE FOR COMPUTERS IN THE SAFETY SYSTEMS OF NUCLEAR POWER STATIONS"
based on IEC Standard 880 for the Atomic Energy Control Board of Canada
and written by Dr. David Parnas.

You can reach Dave Parnas at <[email protected]>

Helpful Jim
Standard disclaimers...
Jim Pyra                            email:[email protected]
PRIOR Data Sciences Ltd.            Phone:(902)423-1331
5475 Spring Garden Rd., Suite 601   FAX: (902)425-3664
Halifax N.S.  B3J 3T2
Canada

>From heiner%b11.dnet%[email protected] Fri Nov 15 14:03:48 1991
From: heiner <[email protected]>
Message-Id: <[email protected]>
Date: Fri, 15 Nov 91 14:10:07 +0100
To: "[email protected]"@dnet.B21
Sender: "b11.dnet!heiner" <heiner%b11.dnet%[email protected]>
Subject: Your request for software safety related standards via Risks Forum

Dear Mr. Bowen,

There are some more standards or draft standards on this subject that we
know of, but they are rather general and not oriented towards formal methods:

"Software for computers in the safety system of nuclear power stations";
IEC 45A(Central Office)88, 1984.
  Comment: The scope of this standard is much broader than the title suggests.
           The rules of this standard have been derived from preliminary
           guidelines produced by the European Workshop on Industrial Computer
           Systems (EWICS) TC7. As you probably know these guidelines have been
           edited in three volumes "Dependability of Critical Computer Systems"
           by Felix Redmill, Elsevier Applied Science, since 1988.
           EWICS TC7 is still active. The chairman is Robin Bloomfield with
           Adelard in London, phone: 081 983 1708. There is a subgroup on
           formal methods which is chaired by Robin Bloomfield too, I think.

"Software Considerations in Airborne Systems and Equipment Certification";
RTCA/DO-178A, March 1985.
  Comment: This is a guideline being widely applied in civil aviation.

"Principles for computers in safety-related systems";
preliminary standard DIN VDE 0801, January 1990.
  Comment: Though preliminary this standard is actually the most important
           non-application-specific standard in Germany for safety related
           systems. Its English translation has been submitted to IEC.

I would be glad if this helps you a little, and even more glad if you
would send me your review paper.
Please don't hesitate to mail again if you need more information (but please
don't use the reply mechanism; we have often had problems receiving replies).

Best regards

Guenter Heiner
Systems and Software Technology Research
Daimler-Benz AG
Alt-Moabit 91b
D-1000 Berlin 21
phone: ++49-30-39982237
fax:          -39982107
email: [email protected]
      unido!b21!heiner

>From JZ01%swt.decnet%[email protected] Sat Nov 23 22:54:44 1991
Date: Sat, 23 Nov 1991 09:44 CST
From: JZ01 <JZ01%[email protected]>
Subject: Re: Software safety, formal methods and standards
To: jonathan.bowen@prg
Message-Id: <[email protected]>
Sender: JZ01 <JZ01%swt.decnet%[email protected]>

Jon,

I think you should look into the work of EWICS (European Workshop
on Industrial Computer Systems), TC7 on Reliability and Safety.
The related publications are:
* Felix Redmill (Ed.), Dependability of Critical Computer Systems,
Volumes I and II, Elsevier Applied Science, London, 1988 and 1989
(Crown House, Linton Road, Barking, Essex, IG11 8JU)
* A huge number of Working Papers, some of them related to your topic.

To get the Working Papers you may contact the current Chairman, Robin
Bloomfield, of Adelard, or the former Chairman, Barry Daniels, of NCC,
Manchester.

Robin Bloomfield
EWICS Chairman
Adelard
28, Rhondda Grove
London E3 5AP
Phone: 44-0-13187579
Fax:   44-0-18524738

Since safety is directly related to reliability, you may also try the
following two items:
* A Specification Language for Reliable Real-Time Systems, pp. 273-289,
 by H. Wupper, J. Vytopil, in J.Zalewski, W.Ehrenberger, Hardware and
 Software for Real-Time Process Control, North-Holland, 1989
 The same book contains three papers on safety, one of them titled
 "The Impact of Standardization on Software Quality, Systems Safety
 and Reliability".  ISBN 0-444-87127-6
* Study of the Feasibility of Using a Formalized Language for the
 Description of Reliability Problems, N.D. Deans, A.J. Miller.
 Report EUR 11753EN, Comm. of the European Communities, 1988
 I got it from Tony Miller, who is with Robert Gordon's Institute of
 Technology at Aberdeen, Scotland.
Of course, I have the above book in my office too, so if you have
problems finding it, let me know.

Please also note that I would be interested in getting a copy of your
paper.  I may also review it before publication, if it makes any sense
to you.

Regards,
Janusz Zalewski
Dept. of Computer Science
Southwest Texas State University
San Marcos, TX 78666-4616
E-mail: [email protected]
Phone:  512-245-3873
Fax:    512-245-3040

------------------------------

End of RISKS-FORUM Digest 12.BOWEN
************************