Path: usenet.cis.ufl.edu!usenet.eel.ufl.edu!psgrain!nntp.teleport.com!usenet
From: [email protected] (Tuomas J Lukka)
Newsgroups: comp.lang.perl.announce,comp.lang.perl.misc
Subject: ANNOUNCE: Gann-0.00 neural network simulator perl module
Followup-To: comp.lang.perl.misc
Date: 12 Feb 1996 14:43:01 GMT
Organization: University of Helsinki
Lines: 54
Approved: [email protected] (comp.lang.perl.announce)
Message-ID: <[email protected]>
NNTP-Posting-Host: julie.teleport.com
X-Disclaimer: The "Approved" header verifies header information for article transmission and does not imply approval of content.
Xref: usenet.cis.ufl.edu comp.lang.perl.announce:251 comp.lang.perl.misc:20285
The purpose of this message is to announce the availability of
Gann-0.00, a copylefted artificial neural network simulator.
Version 0.00 is not intended for any use, not even alpha testing.
The purpose of announcing Gann at this early stage is to solicit
comments on the programming interface to the simulator.
Currently, Gann contains only routines for back-propagation with
gradient descent or momentum descent. However, the interfaces are
generic, so adding new algorithms is straightforward.
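For concreteness, the two descent modes differ only in whether a
velocity term is carried between steps. Here is a minimal plain-Perl
sketch of the two update rules; the names are illustrative only and
are not Gann's actual interface:

    # Illustrative weight-update rules; not Gann's API.
    my $eta   = 0.5;    # learning rate
    my $alpha = 0.9;    # momentum coefficient

    # Plain gradient descent: step straight against the gradient.
    sub gd_update {
        my ($w, $grad) = @_;
        return $w - $eta * $grad;
    }

    # Momentum descent: a running velocity smooths successive steps.
    sub momentum_update {
        my ($w, $grad, $velocity) = @_;
        my $v = $alpha * $velocity - $eta * $grad;
        return ($w + $v, $v);
    }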
Gann is copylefted, see the file COPYING in the distribution for details.
Gann is implemented as a Perl module using C++ for the speed-critical
parts and Perl for everything else, for maximum flexibility.
You need perl version 5.002b2 or higher.
The package contains an example program demonstrating the learning
of the 'xor' function.
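As background on what that example has to solve (this sketch is plain
Perl, does not use Gann's interface, and all names in it are mine), a
self-contained 2-2-1 backprop network that learns 'xor' could look
like this:

    #!/usr/bin/perl -w
    use strict;

    sub sigmoid { 1.0 / (1.0 + exp(-$_[0])) }

    srand;

    my @cases = ([0,0,0], [0,1,1], [1,0,1], [1,1,0]);  # x1, x2, target
    my $eta = 0.5;

    # 2-2-1 network, small random initial weights.
    my @w = map { [ rand() - 0.5, rand() - 0.5 ] } 0 .. 1;  # hidden weights
    my @b = map { rand() - 0.5 } 0 .. 1;                    # hidden biases
    my @v = map { rand() - 0.5 } 0 .. 1;                    # output weights
    my $c = rand() - 0.5;                                   # output bias

    for my $epoch (1 .. 20000) {
        for my $case (@cases) {
            my ($x1, $x2, $t) = @$case;
            my @x = ($x1, $x2);

            # Forward pass.
            my @h = map { sigmoid($w[$_][0]*$x[0] + $w[$_][1]*$x[1] + $b[$_]) } 0 .. 1;
            my $o = sigmoid($v[0]*$h[0] + $v[1]*$h[1] + $c);

            # Backward pass: gradients of the error 0.5 * ($t - $o)**2.
            my $do = ($o - $t) * $o * (1 - $o);
            my @dh = map { $do * $v[$_] * $h[$_] * (1 - $h[$_]) } 0 .. 1;

            # Plain gradient-descent updates.
            for my $j (0 .. 1) {
                $v[$j] -= $eta * $do * $h[$j];
                $b[$j] -= $eta * $dh[$j];
                $w[$j][$_] -= $eta * $dh[$j] * $x[$_] for 0 .. 1;
            }
            $c -= $eta * $do;
        }
    }

    for my $case (@cases) {
        my ($x1, $x2, $t) = @$case;
        my @h = map { sigmoid($w[$_][0]*$x1 + $w[$_][1]*$x2 + $b[$_]) } 0 .. 1;
        my $o = sigmoid($v[0]*$h[0] + $v[1]*$h[1] + $c);
        printf "%d xor %d -> %.3f (target %d)\n", $x1, $x2, $o, $t;
    }

With plain gradient descent this usually converges within a few
thousand epochs, though backprop on 'xor' can occasionally stall in a
local minimum; that is exactly the kind of case where the momentum
variant helps.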
If you are interested in seeing the package developed in a certain
direction, please send me email. I'm especially interested in comments
about the following issues:
- What is good and what is bad about the current interfaces in Gann
- What net types other than backprop should I include?
  (for example, Kohonen maps; which types of nets are currently
  'hot' and which are less interesting?)
- What minimization algorithms should I include? Which algorithms
  do you have good and/or bad experiences with?
If possible, please include references to publications or to code
on the net.
I'd like to make Gann a general grab-bag of neural network algorithms:
well-documented code and examples for any algorithm one might want
to use, implemented in an object-oriented fashion to encourage reuse
and interesting multi-network experiments.
The next revision will probably happen in a few weeks, once I've had
time to reconsider the shape of the interface based on the comments
received on this first version.
In the future, I intend to keep the release rate high in order to make
new code and network types accessible as fast as possible.
Tuomas J. Lukka,
[email protected]
P.S.
Gann is available from
ftp://www.funet.fi/pub/languages/perl/CPAN/authors/id/LUKKA/Gann-0.00.tar.gz
or from any other CPAN site. See
ftp://www.funet.fi/pub/languages/perl/CPAN/CPAN.html
for information about mirrors near you.