
PerceptronLayer Class Reference

#include <PerceptronLayer.h>


Public Member Functions

 PerceptronLayer (unsigned int neuron_count, const ActivationFunction *fact)
 ~PerceptronLayer (void)
 PerceptronLayer (PerceptronLayer &source)
void randomizeParameters (PerceptronLayer *succ, const RandomFunction *weight_func, const RandomFunction *theta_func)
void resetDiffs (void)
void setActivationFunction (const ActivationFunction *fact)
const ActivationFunction * getActivationFunction (void) const
void propagate (PerceptronLayer *pred)
void backpropagate (PerceptronLayer *succ, vector< double > &output_optimal, double opt_tolerance)
void postprocess (PerceptronLayer *succ, double epsilon, double weight_decay, double momterm)
void update (void)

Public Attributes

PerceptronLayerType type
vector< PerceptronNeuron * > neurons

Protected Attributes

const ActivationFunction * fact


Detailed Description

One layer within the perceptron network.

Most algorithm related calculation is done at the layer level.


Constructor & Destructor Documentation

PerceptronLayer::PerceptronLayer (unsigned int neuron_count, const ActivationFunction * fact)

PerceptronLayer constructor, creating a layer with neuron_count number of neurons.

Parameters:
neuron_count Number of neurons within this layer.
fact Activation function to use for this layer.

PerceptronLayer::~PerceptronLayer (void)

PerceptronLayer destructor, mainly removing all the subneurons within this layer.

PerceptronLayer::PerceptronLayer (PerceptronLayer & source)

Copy constructor.

Parameters:
source Source object to be copied.


Member Function Documentation

void PerceptronLayer::backpropagate (PerceptronLayer * succ, vector< double > & output_optimal, double opt_tolerance)

Backpropagation algorithm.

Compute the delta value for all neurons within this layer.

Parameters:
succ Succeeding layer to this one, can be NULL.
output_optimal Optimal expected output for the last layer (output).
opt_tolerance The optimal difference tolerance parameter.
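The delta computation this method performs can be sketched as follows. This is the standard backpropagation delta rule under a sigmoid activation; the helper names are local to the sketch, and the library's exact formulas (in particular how opt_tolerance is applied) may differ in detail.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Derivative of the sigmoid, expressed in terms of the neuron's output.
double sigmoid_deriv(double out) { return out * (1.0 - out); }

// Output layer: delta_j = f'(net_j) * (target_j - out_j); the difference
// is treated as zero when it already lies within opt_tolerance.
double output_delta(double out, double target, double opt_tolerance) {
    double diff = target - out;
    if (std::fabs(diff) <= opt_tolerance)
        diff = 0.0;
    return sigmoid_deriv(out) * diff;
}

// Hidden layer: delta_j = f'(net_j) * sum_k (w_jk * delta_k), summed over
// the succeeding layer's neurons.
double hidden_delta(double out, const std::vector<double>& w_succ,
                    const std::vector<double>& delta_succ) {
    double sum = 0.0;
    for (std::size_t k = 0; k < w_succ.size(); ++k)
        sum += w_succ[k] * delta_succ[k];
    return sigmoid_deriv(out) * sum;
}
```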

const ActivationFunction * PerceptronLayer::getActivationFunction (void) const

Getter for the activation function.

void PerceptronLayer::postprocess (PerceptronLayer * succ, double epsilon, double weight_decay, double momterm)

Postprocess algorithm for a single layer.

Parameters:
succ Successor layer to this one, or NULL if it is the last.
epsilon Learning parameter.
weight_decay Weight decay factor.
momterm Momentum term factor.
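The per-weight step these parameters suggest is gradient descent with a weight-decay and a momentum term. The sketch below is a hypothetical reconstruction from the parameter names, not the library's actual code:

```cpp
#include <cassert>
#include <cmath>

// One weight difference, combining the three parameters postprocess takes:
//   epsilon      - learning rate for the plain gradient step
//   weight_decay - pulls the weight toward zero
//   momterm      - reuses part of the previous step (momentum)
double weight_delta(double epsilon, double delta, double out_pred,
                    double weight, double weight_decay,
                    double momterm, double prev_delta_w) {
    return epsilon * delta * out_pred   // gradient step
         - weight_decay * weight        // weight decay
         + momterm * prev_delta_w;      // momentum
}
```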

void PerceptronLayer::propagate (PerceptronLayer * pred)

Forward propagation algorithm. The input and output signals of this layer are calculated from the output signals of the pred layer.

Parameters:
pred The layer preceding this one; can be NULL.
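Per neuron, the forward step amounts to a weighted sum of the predecessor layer's outputs minus the neuron's theta (bias), passed through the activation function. A minimal self-contained sketch, assuming a sigmoid activation and the usual sign convention (the library's may differ):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// out = f(sum_i w_i * pred_out_i - theta)
double forward(const std::vector<double>& pred_out,
               const std::vector<double>& weights, double theta) {
    double net = -theta;
    for (std::size_t i = 0; i < pred_out.size(); ++i)
        net += weights[i] * pred_out[i];
    return sigmoid(net);
}
```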

void PerceptronLayer::randomizeParameters (PerceptronLayer * succ, const RandomFunction * weight_func, const RandomFunction * theta_func)

Network parameter randomization.

Randomize the variable network parameters: weights and theta values. Separate random functions for the weights and the theta values allow greater customizability.

Parameters:
succ Succeeding layer in network. Must be non-NULL.
weight_func Function to generate weighting parameters.
theta_func Function to generate theta values.
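A typical choice for such a random function is a small symmetric uniform range. The helper below illustrates what a weight_func or theta_func might produce; the name, range, and seeding are illustrative, not the library's defaults:

```cpp
#include <cassert>
#include <cstddef>
#include <random>
#include <vector>

// Draw n parameters uniformly from [lo, hi) with a fixed seed, e.g. as
// initial weights or theta values for one layer.
std::vector<double> random_params(std::size_t n, double lo, double hi,
                                  unsigned seed) {
    std::mt19937 gen(seed);
    std::uniform_real_distribution<double> dist(lo, hi);
    std::vector<double> params(n);
    for (double& p : params)
        p = dist(gen);
    return params;
}
```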

void PerceptronLayer::resetDiffs (void)

Reset function for learned differences.

This function resets all parameters learned by the backpropagation and postprocess algorithms. It has to be called after an update has been made; this way, both online and batch learning can be implemented.

void PerceptronLayer::setActivationFunction (const ActivationFunction * fact)

Setter for the activation function.

Parameters:
fact Activation function to use for this layer.

void PerceptronLayer::update (void)

Update algorithm for a single layer.
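The member functions documented above imply a per-pattern ordering of propagate, backpropagate, postprocess, update, and resetDiffs. The self-contained single-neuron sketch below illustrates that ordering; all names are local to the sketch and stand in for the library's layer-level API:

```cpp
#include <cassert>
#include <cmath>

// One neuron with a single input weight, mirroring the layer life cycle:
// forward pass, delta computation, difference accumulation, update, reset.
struct Neuron {
    double w = 0.5, theta = 0.0;   // parameters
    double out = 0.0, delta = 0.0; // signals
    double dw = 0.0;               // accumulated weight difference

    static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

    void propagate(double in)               { out = sigmoid(w * in - theta); }
    void backpropagate(double target)       { delta = out * (1.0 - out) * (target - out); }
    void postprocess(double in, double eps) { dw += eps * delta * in; }
    void update()                           { w += dw; }
    void resetDiffs()                       { dw = 0.0; }
};

// One online-learning iteration; returns the absolute output error.
double train_step(Neuron& n, double in, double target, double eps) {
    n.propagate(in);
    double err = std::fabs(target - n.out);
    n.backpropagate(target);
    n.postprocess(in, eps);
    n.update();
    n.resetDiffs();
    return err;
}
```

For batch learning, postprocess would be called once per pattern while update and resetDiffs run only once per batch.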


Member Data Documentation

const ActivationFunction* PerceptronLayer::fact [protected]
 

Activation function and its derivative. Used for both propagation and backpropagation. Must be set, and can be different for each layer.

vector<PerceptronNeuron *> PerceptronLayer::neurons
 

Every layer contains at least one neuron. This is the list of neurons within this layer.

PerceptronLayerType PerceptronLayer::type
 

Type of the layer within the network. The input layer is always the first, the output layer the last layer in the network. Every other layer must be a hidden layer.


The documentation for this class was generated from the following files:
Generated on Sun Mar 2 21:35:50 2003 for libperceptronnetwork by doxygen1.3-rc3