This page covers a lightweight yet functional library to simulate, train and
test neural networks. The networks are of the type known as multilayer
feedforward perceptron networks. Although this type of neural network is
mathematically limited in its capabilities, it provides a solid foundation on
which a large number of common categorization and recognition tasks can be
solved.
As the neural network type used in the library can approximate a large class
of functions (any smooth function, in fact), we can train a network to behave
like the sine function. The training is done by showing the network 100
correct value pairs of (x, sin(x)), with x spread equally over the range
(-3.14, 3.14). After the training phase, the network should be able to closely
reproduce the results of the canonical sine function. This example is produced
by the libtest.cpp example program that can be found in the library sources.
The configuration used is 1:5:5:1. As you can see, the neural network's output
(red dots) is quite close to the correct sine function (shown in green). In
principle, this learning method could be used to emulate arbitrary functions
by exposing the network to correct pairs from a function many times
(training). However, this is just a very simple example of what a neural
network of this type is able to do.
Example: Classification of functions
Another small example, showing the classification of three types of functions
(sin, cos and abs), is available in the file libtest-classify.cpp in the
sources.
Example: Visualization of a neural network
The library provides the ability to write a neural network graph visualization
file in a graph description language, from which pretty graph displays can be
created, such as the one below, showing a 1:3:2 network. Each box represents
one neuron. The legend is shown in yellow.
It should compile in any Unix environment that has a proper make (such as
GNU make) and a C++ compiler (such as g++ 2.95 or later). The library makes
use of the C++ Standard Template Library (STL), so make sure it is installed,
too.
The source is annotated using the wonderful
Doxygen annotation system, which
provides an automatically generated programmer's reference. The documentation
can be generated from the source by typing:
from the root directory.
The current HTML documentation is also available online.
For German-speaking users, additional documentation and an introduction to
neural networks is available in German through the YAVA
project page (direct: pdf (215kb)).
The entire library is released under the terms of the
GNU Lesser General Public License (LGPL).
To comply with this license, you must give prominent notice that you use the
library, and that it is included under the terms of the LGPL license. You must
include a copy of the LGPL license. You must also do one of the following:
1. Include the source code for the version of the library that you link with,
as well as the full source or object code of your application, so that the
user can relink your application;
2. Include a written offer, valid for at least three years, to provide the
materials listed in option 1, charging no more than the cost of providing this
distribution; or
3. Make the materials listed in option 1 available from the same place that
your application is available.
The most common way to comply with the license is to dynamically link with
the library, and then include the library source code and appropriate notices
with your application.
As the library was developed in the course of a university project, it may
be the case that the university
implicitly holds an additional license. (I doubt they will use the library,
though.)
If you have any trouble getting the library or the examples to work, please
contact me at