Path: usenet.cis.ufl.edu!usenet.eel.ufl.edu!psgrain!nntp.teleport.com!usenet
From: [email protected] (Tuomas J Lukka)
Newsgroups: comp.lang.perl.announce,comp.lang.perl.misc
Subject: ANNOUNCE: Gann 0.01 Perl generic neural networks module released
Followup-To: comp.lang.perl.misc
Date: 1 Mar 1996 23:33:20 GMT
Organization: University of Helsinki
Lines: 66
Approved: [email protected] (comp.lang.perl.announce)
Message-ID: <[email protected]>
NNTP-Posting-Host: linda.teleport.com
X-Disclaimer: The "Approved" header verifies header information for article transmission and does not imply approval of content.
Xref: usenet.cis.ufl.edu comp.lang.perl.announce:277 comp.lang.perl.misc:22425

The purpose of this message is to announce the availability of
Gann version 0.01, a copylefted artificial neural network simulator.

Unlike Version 0.00, Version 0.01 is almost usable, so Gann can
be considered alpha software from now on. The documentation is in
place, but many things are still missing.

The purpose of announcing Gann at this early stage is to solicit
comments on the programming and user interfaces to the simulator.
Currently, Gann contains only routines for back-propagation with
plain gradient descent or gradient descent with momentum. The
interfaces are generic, however, and adding new algorithms is easy.
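
For the curious, the core update rule behind both descent modes is the
classic momentum step. The following is a generic plain-Perl sketch, not
Gann's actual code; the sub name and the $eta/$alpha parameters are my own:

    # One update: step = momentum * previous step - learning rate * gradient.
    # $weights, $gradient and $velocity are parallel array references.
    sub momentum_step {
        my ($weights, $gradient, $velocity, $eta, $alpha) = @_;
        for my $i (0 .. $#$weights) {
            $velocity->[$i] = $alpha * $velocity->[$i] - $eta * $gradient->[$i];
            $weights->[$i] += $velocity->[$i];
        }
    }

With $alpha set to zero this reduces to plain gradient descent.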

Gann is copylefted; see the file COPYING in the distribution for details.

What's new?
 - Graphical user interface; requires Tk-b9.01 (available from CPAN).
   See http://www.helsinki.fi/~lukka/gann.html for screen shots.
 - Nomenclature: Changed all package names to be under Math::Neural.
 - Convergence detection in minimization.
 - Temporal difference algorithm (TD(lambda)); see the sketch after this list.
 - Bug fixes.
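
The tabular form of the TD(lambda) update fits in a few lines of Perl.
This is only an illustrative sketch of the algorithm, with names of my
own choosing; it is not the Math::Neural interface:

    # One TD(lambda) step with accumulating eligibility traces.
    # %$V holds the value estimate per state, %$e the traces.
    sub td_lambda_step {
        my ($V, $e, $s, $r, $s_next, $alpha, $gamma, $lambda) = @_;
        my $delta = $r + $gamma * ($V->{$s_next} || 0) - ($V->{$s} || 0);
        $_ *= $gamma * $lambda for values %$e;   # decay all traces
        $e->{$s}++;                              # bump the state just visited
        # Every state with a nonzero trace shares in the TD error.
        $V->{$_} = ($V->{$_} || 0) + $alpha * $delta * $e->{$_} for keys %$e;
    }

With a network instead of a table, the same TD error is back-propagated
into the weights.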

Gann is implemented as a Perl module using C++ for the speed-critical
parts and Perl for everything else, for maximum flexibility.
You need perl version 5.002b2 or higher (perl 5.001 is rumoured to work, but...).

The package contains an example program demonstrating the learning
of the 'xor' function.
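
To give a feel for the task, here is a rough, self-contained rendition of
the same problem in plain Perl. It was written from scratch for this
announcement, is not the bundled example, and does not use Gann at all:

    #!/usr/bin/perl
    # A from-scratch 2-2-1 sigmoid net trained on 'xor' by back-propagation.
    use strict;

    my @cases = ([0,0,0], [0,1,1], [1,0,1], [1,1,0]);   # in1, in2, target
    my @w = map { [ map { rand() - 0.5 } 0..2 ] } 0..1; # hidden: 2 x (2 in + bias)
    my @v = map { rand() - 0.5 } 0..2;                  # output: 2 hidden + bias

    sub sig { 1 / (1 + exp(-$_[0])) }

    sub forward {
        my ($x1, $x2) = @_;
        my @h = map { sig($w[$_][0]*$x1 + $w[$_][1]*$x2 + $w[$_][2]) } 0..1;
        return (sig($v[0]*$h[0] + $v[1]*$h[1] + $v[2]), @h);
    }

    my $eta = 0.5;                                      # learning rate
    for (1 .. 20000) {
        for my $c (@cases) {
            my ($x1, $x2, $t) = @$c;
            my ($o, @h) = forward($x1, $x2);
            my $do = ($t - $o) * $o * (1 - $o);         # output delta
            my @dh = map { $do * $v[$_] * $h[$_] * (1 - $h[$_]) } 0..1;
            $v[$_] += $eta * $do * $h[$_] for 0..1;     # output weights
            $v[2]  += $eta * $do;                       # output bias
            for my $j (0..1) {                          # hidden weights
                $w[$j][0] += $eta * $dh[$j] * $x1;
                $w[$j][1] += $eta * $dh[$j] * $x2;
                $w[$j][2] += $eta * $dh[$j];
            }
        }
    }
    for my $c (@cases) {
        my ($o) = forward($c->[0], $c->[1]);
        printf "%d xor %d -> %.3f\n", $c->[0], $c->[1], $o;
    }

A few thousand passes over the four cases usually push the outputs close
to 0 and 1; an unlucky random start can leave the net in a local minimum,
which momentum can help with.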

If you are interested in seeing the package developed in a certain
direction, please send me email. I'm especially interested in comments
about the following issues:
- What is good and what is bad about the current interfaces in Gann?
- What network types other than backprop should I include?
  (For example, Kohonen maps. Which types of nets are currently
  'hot', and which are not so interesting?)
- What minimization algorithms should I include? Which algorithms
  do you have good and/or bad experiences with?

If possible, please include references to publications or to code
available on the net.

I'd like to make Gann a general grab-bag of neural network algorithms
containing well-documented code and examples for any algorithm
one might want to use, implemented in an object-oriented fashion to encourage
reuse and interesting multi-network experiments.

The next revision will probably happen in a few weeks, after
I've had time to consider the shape of the interface based on the
comments received on this version.

In the future, I intend to keep the release rate high in order to make
new code and network types accessible as fast as possible.

       Tuomas J. Lukka, [email protected]

P.S.

Gann is available from
ftp://www.funet.fi/pub/languages/perl/CPAN/authors/id/LUKKA/Math_Neural-0.01.tar.gz
or from any other CPAN site near you. See
ftp://www.funet.fi/pub/languages/perl/CPAN/CPAN.html
for information about mirrors.