
PerceptronNetwork Class Reference

#include <PerceptronNetwork.h>

List of all members.

Public Member Functions

 PerceptronNetwork (vector< unsigned int > desc_layers, const char *network_name="unnamed", const ActivationFunction *fact=&ActivationFunctions::fact_tanh)
 ~PerceptronNetwork (void)
 PerceptronNetwork (PerceptronNetwork &source)
void setActivationFunction (PerceptronLayerType type, const ActivationFunction *fact)
void save (fstream &fs) const
bool load (fstream &fs)
void randomizeParameters (const RandomFunction *weight_func, const RandomFunction *theta_func)
void setInput (vector< double > &in)
void setOptimalOutput (vector< double > &optimal)
vector< double > getOutput (void) const
double errorTerm (void) const
void resetDiffs (void)
void propagate (void)
void backpropagate (void)
void postprocess (void)
void update (void)
void dumpNetworkGraph (const char *filename) const
double getLearningParameter (void) const
void setLearningParameter (double epsilon)
double getOptimalTolerance (void) const
void setOptimalTolerance (double tolerance)
double getWeightDecayParameter (void) const
void setWeightDecayParameter (double factor)
double getMomentumTermParameter (void) const
void setMomentumTermParameter (double factor)

Public Attributes

const char * name
vector< double > input
vector< double > output

Protected Attributes

vector< double > output_optimal
vector< PerceptronLayer * > layers
double epsilon
double opt_tolerance
double weight_decay
double momentum_term


class WeightMatrix

Detailed Description

One multilayer perceptron network.

Constructor & Destructor Documentation

PerceptronNetwork::PerceptronNetwork (vector< unsigned int > desc_layers, const char * network_name = "unnamed", const ActivationFunction * fact = &ActivationFunctions::fact_tanh)

PerceptronNetwork constructor

Construct a multilayer perceptron network from the layer description given through desc_layers. The vector size specifies the number of layers; each element gives the number of neurons within its layer.

desc_layers Number of neurons within each layer, given the input layer as first and the output layer as last. Hence, the size of the vector must be at least two.
network_name Symbolic name of the network. Can be NULL. The name is only pointer-copied, not duplicated, so the pointer must remain valid for the lifetime of the network.
fact Activation function used by default for all neurons within the network. By default, it is the tangens hyperbolicus (tanh) function. For individual layers, based on their type, the activation function can be changed using the setActivationFunction method.

PerceptronNetwork::~PerceptronNetwork (void)

PerceptronNetwork destructor

Remove all layers stored within the network.

PerceptronNetwork::PerceptronNetwork (PerceptronNetwork & source)

Copy constructor.

source Source object to be copied.

Member Function Documentation

void PerceptronNetwork::backpropagate (void)

Backpropagation algorithm for the entire network.

Calculate all delta error signals in every layer and their neurons. Every neuron must have a proper input/output signal assigned, and the output_optimal training target result is used for calculation.

void PerceptronNetwork::dumpNetworkGraph (const char * filename) const

Dump the whole neural network as a graph file in the GraphViz file format.

filename Name of the file to write the graph data to.

double PerceptronNetwork::errorTerm (void) const

Calculate the error term for the network.

The error value for the current output is calculated. The current optimal output must be given prior to calling this function.

The formula used is E = 1/2 * sum_i (output_optimal[i] - output[i])^2, the sum of squared differences between the optimal and the actual output.

double PerceptronNetwork::getLearningParameter (void) const

Getter for the epsilon parameter.

double PerceptronNetwork::getMomentumTermParameter (void) const

Getter for the momentum term parameter.

double PerceptronNetwork::getOptimalTolerance (void) const

Getter for the optimal tolerance parameter.

vector< double > PerceptronNetwork::getOutput (void) const

Getter for the entire network output.

double PerceptronNetwork::getWeightDecayParameter (void) const

Getter for the weight decay parameter.

bool PerceptronNetwork::load (fstream & fs)

Load the entire network from stream fs.

fs Stream (input) to read the network from.
Returns true on success, false on failure.

void PerceptronNetwork::postprocess (void)

Postprocess algorithm for the entire network.

void PerceptronNetwork::propagate (void)

Propagation algorithm for the entire network.

The input levels of the network have to be set using the setInput() method beforehand. Afterwards, the output of the network can be obtained using the getOutput() method.

void PerceptronNetwork::randomizeParameters (const RandomFunction * weight_func, const RandomFunction * theta_func)

Randomize the variable network parameters, weightings and theta values.

weight_func Function to randomize weighting values.
theta_func Function to randomize theta parameters.

void PerceptronNetwork::resetDiffs (void)

Reset all learned parameters.

Call after an update has been made.

void PerceptronNetwork::save (fstream & fs) const

Save the entire network as text into the stream fs.

fs Stream (output) to save the network to.

void PerceptronNetwork::setActivationFunction (PerceptronLayerType type, const ActivationFunction * fact)

Set the activation function for layers, based on their type.

type Type of layers that will use the new activation function.
fact Activation function to use.

void PerceptronNetwork::setInput (vector< double > & in)

Setter for the entire network input.

in Vector of input signal levels. The size must equal the input layer neuron count.

void PerceptronNetwork::setLearningParameter (double epsilon)

Setter for the epsilon parameter.

epsilon Learning parameter, in the range of 0.0 to 0.5.

void PerceptronNetwork::setMomentumTermParameter (double factor)

Setter for the momentum term parameter.

factor Value between 0.5 and 0.9.

void PerceptronNetwork::setOptimalOutput (vector< double > & optimal)

Setter for the optimal requested network output.

Must be used before any learning algorithm is called (backpropagate).

optimal Optimal network output for the current input.

void PerceptronNetwork::setOptimalTolerance (double tolerance)

Setter for the optimal tolerance parameter.

tolerance Value between 0.0 and 0.2.

void PerceptronNetwork::setWeightDecayParameter (double factor)

Setter for the weight decay parameter.

factor Value in the range of 0.00005 to 0.0001.

void PerceptronNetwork::update (void)

Update algorithm for the entire network.

Friends And Related Function Documentation

friend class WeightMatrix [friend]

The GUI visualization class is declared friend to allow access to individual layers.

Member Data Documentation

double PerceptronNetwork::epsilon [protected]

Generic learn parameter epsilon, should be between 0.05 and 0.5.

vector<double> PerceptronNetwork::input

Input vector, fed into the first layer of the network. The number of elements must equal the number of neurons within the input layer.

vector<PerceptronLayer *> PerceptronNetwork::layers [protected]

Left-to-right list of layers within the network. There must be at least two (one input and one output layer), but the number can be arbitrarily large. More than four layers does not achieve any improvement of the network capabilities, though.

double PerceptronNetwork::momentum_term [protected]

Momentum term parameter, should be between 0.5 and 0.9.

const char* PerceptronNetwork::name

Symbolic name of the network. Never used by the methods themselves, but convenient for identifying the network.

double PerceptronNetwork::opt_tolerance [protected]

Optimal tolerance parameter, should be between 0.0 and 0.2.

vector<double> PerceptronNetwork::output

Output vector, resulting from propagation through the entire network. The number of elements equals the number of output neurons within the network.

vector<double> PerceptronNetwork::output_optimal [protected]

Optimal expected output vector, used for the learning process. For simple propagation it is not needed.

double PerceptronNetwork::weight_decay [protected]

Weight decay parameter, should be between 0.005 and 0.03.

The documentation for this class was generated from the following file:
PerceptronNetwork.h
Generated on Mon Feb 24 19:37:45 2003 by doxygen1.3-rc3