Feedforward Closedloop Learning
FeedforwardClosedloopLearning Class Reference

Main class of Feedforward Closed Loop Learning. More...

#include <fcl.h>

Inheritance diagram for FeedforwardClosedloopLearning:

Public Member Functions

 FeedforwardClosedloopLearning (int num_of_inputs, int *num_of_neurons_per_layer_array, int _num_layers)
 Constructor: FCL without any filters. More...
 
 ~FeedforwardClosedloopLearning ()
 Destructor: de-allocates any allocated memory. More...
 
void doStep (double *input, double *error)
 Performs the simulation step. More...
 
void doStep (double *input, int n1, double *error, int n2)
 Python wrapper function. More...
 
double getOutput (int index)
 Gets the output from one of the output neurons. More...
 
void setLearningRate (double learningRate)
 Sets the learning rate globally. More...
 
void setLearningRateDiscountFactor (double _learningRateDiscountFactor)
 Sets how the learning rate increases or decreases from layer to layer. More...
 
void setDecay (double decay)
 Sets a typical weight decay scaled with the learning rate. More...
 
void setMomentum (double momentum)
 Sets the global momentum for all layers. More...
 
void setActivationFunction (Neuron::ActivationFunction _activationFunction)
 Sets the activation function of the Neuron. More...
 
void initWeights (double max=0.001, int initBias=1, Neuron::WeightInitMethod weightInitMethod=Neuron::MAX_OUTPUT_RANDOM)
 Inits the weights in all layers. More...
 
void seedRandom (int s)
 Seeds the random number generator. More...
 
void setBias (double _bias)
 Sets the bias globally. More...
 
int getNumLayers ()
 Gets the total number of layers. More...
 
Layer * getLayer (int i)
 Gets a pointer to a layer. More...
 
Layer * getOutputLayer ()
 Gets the output layer. More...
 
int getNumInputs ()
 Gets the number of inputs. More...
 
Layer ** getLayers ()
 Returns all Layers. More...
 
bool saveModel (const char *name)
 Saves the whole network. More...
 
bool loadModel (const char *name)
 Loads the whole network. More...
 

Detailed Description

Main class of Feedforward Closed Loop Learning.

Create an instance of this class to do the learning. It creates the whole network with an input layer, hidden layers and an output layer. Learning is done iteratively by first setting the input values and the error signals and then calling doStep().

(C) 2017,2018-2022, Bernd Porr bernd.nosp@m.@gla.nosp@m.sgown.nosp@m.euro.nosp@m..tech (C) 2017,2018, Paul Miller paul@.nosp@m.glas.nosp@m.gowne.nosp@m.uro..nosp@m.tech

GNU GENERAL PUBLIC LICENSE Version 3, 29 June 2007
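
A minimal usage sketch is shown below. The class and method names follow this reference, but the layer sizes, the closed-loop signals and the parameter values are made up for illustration; the example needs to be compiled and linked against the FCL library to run.

```cpp
#include <fcl.h>

int main() {
    // Hypothetical network: 4 inputs, three layers of 10, 5 and 2 neurons.
    int neuronsPerLayer[] = {10, 5, 2};
    FeedforwardClosedloopLearning fcl(4, neuronsPerLayer, 3);
    fcl.initWeights(0.001);
    fcl.setLearningRate(0.0001);

    double input[4] = {0, 0, 0, 0};
    double error[4] = {0, 0, 0, 0};
    for (int t = 0; t < 1000; t++) {
        // ... fill input[] and error[] from the closed loop here ...
        fcl.doStep(input, error);
        double out = fcl.getOutput(0);  // read one output neuron
        (void)out;                      // e.g. drive an actuator with it
    }
    return 0;
}
```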

Constructor & Destructor Documentation

◆ FeedforwardClosedloopLearning()

FeedforwardClosedloopLearning::FeedforwardClosedloopLearning ( int  num_of_inputs,
int *  num_of_neurons_per_layer_array,
int  _num_layers 
)

Constructor: FCL without any filters.

Parameters
num_of_inputs: Number of inputs in the input layer
num_of_neurons_per_layer_array: Number of neurons in each layer
_num_layers: Number of layers (must match the length of the array above)

◆ ~FeedforwardClosedloopLearning()

FeedforwardClosedloopLearning::~FeedforwardClosedloopLearning ( )

Destructor: de-allocates any allocated memory.

Member Function Documentation

◆ doStep() [1/2]

void FeedforwardClosedloopLearning::doStep ( double *  input,
double *  error 
)

Performs the simulation step.

Parameters
input: Array with the input values
error: Array with the error signals

◆ doStep() [2/2]

void FeedforwardClosedloopLearning::doStep ( double *  input,
int  n1,
double *  error,
int  n2 
)

Python wrapper function.

Not public.

◆ getLayer()

Layer* FeedforwardClosedloopLearning::getLayer ( int  i)
inline

Gets a pointer to a layer.

Parameters
i: Index of the layer.
Returns
A pointer to a layer class.

◆ getLayers()

Layer** FeedforwardClosedloopLearning::getLayers ( )
inline

Returns all Layers.

Returns
An array of pointers to all layers.

◆ getNumInputs()

int FeedforwardClosedloopLearning::getNumInputs ( )
inline

Gets the number of inputs.

Returns
The number of inputs

◆ getNumLayers()

int FeedforwardClosedloopLearning::getNumLayers ( )
inline

Gets the total number of layers.

Returns
The total number of all layers.

◆ getOutput()

double FeedforwardClosedloopLearning::getOutput ( int  index)
inline

Gets the output from one of the output neurons.

Parameters
index: The index of the output neuron.
Returns
The output value of the output neuron.

◆ getOutputLayer()

Layer* FeedforwardClosedloopLearning::getOutputLayer ( )
inline

Gets the output layer.

Returns
A pointer to the output layer which is also a Layer class.

◆ initWeights()

void FeedforwardClosedloopLearning::initWeights ( double  max = 0.001,
int  initBias = 1,
Neuron::WeightInitMethod  weightInitMethod = Neuron::MAX_OUTPUT_RANDOM 
)

Inits the weights in all layers.

Parameters
max: Maximum value of the weights.
initBias: Whether the bias should also be initialised.
weightInitMethod: See Neuron::WeightInitMethod for the options.

◆ loadModel()

bool FeedforwardClosedloopLearning::loadModel ( const char *  name)

Loads the whole network.

Parameters
name: Filename to load the network from.

◆ saveModel()

bool FeedforwardClosedloopLearning::saveModel ( const char *  name)

Saves the whole network.

Parameters
name: Filename to save the network to.

◆ seedRandom()

void FeedforwardClosedloopLearning::seedRandom ( int  s)
inline

Seeds the random number generator.

Parameters
s: An arbitrary seed number.

◆ setActivationFunction()

void FeedforwardClosedloopLearning::setActivationFunction ( Neuron::ActivationFunction  _activationFunction)

Sets the activation function of the Neuron.

Parameters
_activationFunction: See Neuron::ActivationFunction for the different options.

◆ setBias()

void FeedforwardClosedloopLearning::setBias ( double  _bias)

Sets the bias globally.

Parameters
_bias: The bias input fed to all neurons.

◆ setDecay()

void FeedforwardClosedloopLearning::setDecay ( double  decay)

Sets a typical weight decay scaled with the learning rate.

Parameters
decay: The larger the value, the faster the weights decay.

◆ setLearningRate()

void FeedforwardClosedloopLearning::setLearningRate ( double  learningRate)

Sets the learning rate globally.

Parameters
learningRate: The learning rate applied to all layers and neurons.

◆ setLearningRateDiscountFactor()

void FeedforwardClosedloopLearning::setLearningRateDiscountFactor ( double  _learningRateDiscountFactor)
inline

Sets how the learning rate increases or decreases from layer to layer.

Parameters
_learningRateDiscountFactor: A factor > 1 means a higher learning rate in deeper layers.

◆ setMomentum()

void FeedforwardClosedloopLearning::setMomentum ( double  momentum)

Sets the global momentum for all layers.

Parameters
momentum: Defines the inertia of the weight change over time.

The documentation for this class was generated from the following file: fcl.h