Feedforward Closedloop Learning
Layer Class Reference

Layer which contains the neurons of one layer. More...

#include <layer.h>

Public Types

enum  WeightNormalisation {
  WEIGHT_NORM_NONE = 0, WEIGHT_NORM_LAYER_EUCLEDIAN = 1, WEIGHT_NORM_NEURON_EUCLEDIAN = 2, WEIGHT_NORM_LAYER_MANHATTAN = 3,
  WEIGHT_NORM_NEURON_MANHATTAN = 4, WEIGHT_NORM_LAYER_INFINITY = 5, WEIGHT_NORM_NEURON_INFINITY = 6
}
 Weight normalisation constants. Defines whether the weights are normalised layer-wide or for every neuron separately.
 

Public Member Functions

 Layer (int _nNeurons, int _nInputs)
 Constructor. More...
 
 ~Layer ()
 Destructor. Frees all memory. More...
 
void calcOutputs ()
 Calculates the output values in all neurons. More...
 
void doLearning ()
 Adjusts the weights. More...
 
void setError (double _error)
 Sets the global error for all neurons. More...
 
void setError (int i, double _error)
 Sets the error of one neuron individually. More...
 
void setErrors (double *_errors)
 Sets all errors from an input array. More...
 
double getError (int i)
 Retrieves the error. More...
 
void setBias (double _bias)
 Sets the global bias for all neurons. More...
 
void setInput (int inputIndex, double input)
 Set the input value of one input. More...
 
void setInputs (double *_inputs)
 Sets all inputs from an input array. More...
 
void setLearningRate (double _learningRate)
 Sets the learning rate of all neurons. More...
 
void setActivationFunction (Neuron::ActivationFunction _activationFunction)
 Set the activation function. More...
 
void setMomentum (double _momentum)
 Set the momentum of all neurons in this layer. More...
 
void setDecay (double _decay)
 Sets the weight decay scaled by the learning rate. More...
 
void initWeights (double _max=1, int initBiasWeight=1, Neuron::WeightInitMethod weightInitMethod=Neuron::MAX_OUTPUT_RANDOM)
 Inits the weights. More...
 
double getOutput (int index)
 Gets the output of one neuron. More...
 
Neuron * getNeuron (int index)
 Gets a pointer to one neuron. More...
 
int getNneurons ()
 Gets the number of neurons. More...
 
int getNinputs ()
 Number of inputs. More...
 
void setConvolution (int width, int height)
 Defines a 2D geometry for the input layer of width x height. More...
 
void setMaxDetLayer (int _m)
 Maximum detection layer. More...
 
void setNormaliseWeights (WeightNormalisation _normaliseWeights)
 Normalise the weights. More...
 
void setDebugInfo (int layerIndex)
 Sets the layer index within the whole network. More...
 
void setStep (long int step)
 Sets the simulation step in the layer for debug purposes. More...
 
double getWeightDistanceFromInitialWeights ()
 Get weight distance from the start of the simulation. More...
 
void doNormaliseWeights ()
 Performs the weight normalisation. More...
 
void setUseThreads (int _useThreads)
 Sets if threads should be used. More...
 
int saveWeightMatrix (char *filename)
 Save weight matrix for documentation and debugging. More...
 

Detailed Description

Layer which contains the neurons of one layer.

It performs all computations possible in a layer. In particular, it calls all neurons in separate threads and triggers the computations there. These functions are all called from the parent class.
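
A minimal usage sketch is shown below (assuming layer.h is on the include path; the layer sizes, learning rate and error value are purely illustrative and not taken from the library's documentation). It shows the typical per-step call order: set the inputs, calculate the outputs, set the error and then let the layer adjust its weights.

    #include <layer.h>

    int main() {
        Layer layer(4, 8);                 // 4 neurons, 8 inputs (illustrative sizes)
        layer.setLearningRate(0.001);      // illustrative learning rate
        layer.initWeights(1.0, 1, Neuron::MAX_OUTPUT_RANDOM);

        double inputs[8] = {0};
        for (long int step = 0; step < 1000; step++) {
            // ... fill inputs[] with the current input values ...
            layer.setInputs(inputs);       // feed all inputs
            layer.calcOutputs();           // forward pass through all neurons
            layer.setError(0.1);           // placeholder error signal
            layer.doLearning();            // weight update
            layer.setStep(step);           // for debugging output only
        }
        return 0;
    }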

Constructor & Destructor Documentation

◆ Layer()

Layer::Layer ( int  _nNeurons,
int  _nInputs 
)

Constructor.

GNU GENERAL PUBLIC LICENSE Version 3, 29 June 2007.

Parameters
_nNeurons  Number of neurons in the layer.
_nInputs  Number of inputs to the Layer.
_nFilters  Number of lowpass filters at each input.
_minT  Minimum time of the lowpass filter.
_maxT  Maximum time of the lowpass filter.

(C) 2017, Bernd Porr bernd@glasgowneuro.tech, (C) 2017, Paul Miller paul@glasgowneuro.tech

◆ ~Layer()

Layer::~Layer ( )

Destructor. Frees all memory.

Member Function Documentation

◆ calcOutputs()

void Layer::calcOutputs ( )

Calculates the output values in all neurons.

◆ doLearning()

void Layer::doLearning ( )

Adjusts the weights.

◆ doNormaliseWeights()

void Layer::doNormaliseWeights ( )

Performs the weight normalisation.

◆ getError()

double Layer::getError ( int  i)

Retrieves the error.

Parameters
i  Index of the neuron.

◆ getNeuron()

Neuron* Layer::getNeuron ( int  index)
inline

Gets a pointer to one neuron.

Parameters
index  The index number of the neuron.
Returns
A pointer to a Neuron class.

◆ getNinputs()

int Layer::getNinputs ( )
inline

Number of inputs.

Returns
The number of inputs

◆ getNneurons()

int Layer::getNneurons ( )
inline

Gets the number of neurons.

Returns
The number of neurons.

◆ getOutput()

double Layer::getOutput ( int  index)
inline

Gets the output of one neuron.

Parameters
index  The index number of the neuron.
Returns
Returns the double value of the output.

◆ getWeightDistanceFromInitialWeights()

double Layer::getWeightDistanceFromInitialWeights ( )

Get weight distance from the start of the simulation.

Returns
The distance from the initial (random) weight setup.
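
For example (a sketch continuing the usage example in the detailed description; the logging interval is arbitrary), this distance can be printed every few steps to monitor how far learning has moved the weights away from their random starting point:

    #include <cstdio>
    // ...
    if ((step % 100) == 0) {
        printf("step %ld: weight distance = %f\n",
               step, layer.getWeightDistanceFromInitialWeights());
    }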

◆ initWeights()

void Layer::initWeights ( double  _max = 1,
int  initBiasWeight = 1,
Neuron::WeightInitMethod  weightInitMethod = Neuron::MAX_OUTPUT_RANDOM 
)

Inits the weights.

Parameters
_max  Maximum value if using random init.
initBiasWeight  If set to one, the bias weight is also initialised.
weightInitMethod  The method employed to init the weights.
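
A sketch (the maximum value is arbitrary) initialising the weights of an existing Layer instance called layer before the first learning step:

    // Random initialisation with maximum value 0.5; the bias weight
    // is initialised as well because the second argument is 1.
    layer.initWeights(0.5, 1, Neuron::MAX_OUTPUT_RANDOM);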

◆ saveWeightMatrix()

int Layer::saveWeightMatrix ( char *  filename)

Save weight matrix for documentation and debugging.

Parameters
filename  The filename it should be saved to.
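
A sketch (the file name is arbitrary) saving the current weight matrix of a Layer instance called layer, for instance at the end of a training run:

    char fname[] = "layer0_weights.dat";   // arbitrary file name
    layer.saveWeightMatrix(fname);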

◆ setActivationFunction()

void Layer::setActivationFunction ( Neuron::ActivationFunction  _activationFunction)

Set the activation function.

Parameters
_activationFunction  The activation function. See: Neuron::ActivationFunction

◆ setBias()

void Layer::setBias ( double  _bias)

Sets the global bias for all neurons.

Parameters
_bias  The bias for all neurons.

◆ setConvolution()

void Layer::setConvolution ( int  width,
int  height 
)

Defines a 2D geometry for the input layer of width x height.

Parameters
width  The width of the convolution window.
height  The height of the convolution window.
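
A sketch assuming the layer was constructed with 256 inputs that represent a flattened 16 x 16 patch:

    // Tell the layer that its 256 inputs are arranged as 16 x 16.
    layer.setConvolution(16, 16);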

◆ setDebugInfo()

void Layer::setDebugInfo ( int  layerIndex)

Sets the layer index within the whole network.

Parameters
layerIndex  The layer index in the whole network.

◆ setDecay()

void Layer::setDecay ( double  _decay)

Sets the weight decay scaled by the learning rate.

Parameters
_decay  The decay rate of the weights.

◆ setError() [1/2]

void Layer::setError ( double  _error)

Sets the global error for all neurons.

Parameters
_error  The error to be set for the whole layer.

◆ setError() [2/2]

void Layer::setError ( int  i,
double  _error 
)

Sets the error of one neuron individually.

Parameters
i  Index of the neuron.
_error  The error to be set.

◆ setErrors()

void Layer::setErrors ( double *  _errors)

Sets all errors from an input array.

Parameters
_errors  An array of errors.
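
A sketch (the error values are placeholders) feeding errors from an array into a Layer instance called layer, here assumed to have 4 neurons with one error per neuron:

    double errors[4] = {0.01, -0.02, 0.0, 0.03};   // placeholder error values
    layer.setErrors(errors);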

◆ setInput()

void Layer::setInput ( int  inputIndex,
double  input 
)

Set the input value of one input.

Parameters
inputIndex  The index number of the input.
input  The value of the input.

◆ setInputs()

void Layer::setInputs ( double *  _inputs)

Sets all inputs from an input array.

Parameters
_inputs  Array of all inputs.

◆ setLearningRate()

void Layer::setLearningRate ( double  _learningRate)

Sets the learning rate of all neurons.

Parameters
_learningRate  The learning rate.

◆ setMaxDetLayer()

void Layer::setMaxDetLayer ( int  _m)
inline

Maximum detection layer.

Experimental. This hasn't been implemented.

◆ setMomentum()

void Layer::setMomentum ( double  _momentum)

Set the momentum of all neurons in this layer.

Parameters
_momentum  The momentum for all neurons in this layer.

◆ setNormaliseWeights()

void Layer::setNormaliseWeights ( WeightNormalisation  _normaliseWeights)

Normalise the weights.

Parameters
_normaliseWeights  Method of normalisation.
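
A sketch selecting one of the WeightNormalisation constants for a Layer instance called layer, here Euclidean normalisation applied to every neuron separately:

    layer.setNormaliseWeights(Layer::WEIGHT_NORM_NEURON_EUCLEDIAN);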

◆ setStep()

void Layer::setStep ( long int  step)

Sets the simulation step in the layer for debug purposes.

Parameters
step  Step number.

◆ setUseThreads()

void Layer::setUseThreads ( int  _useThreads)
inline

Sets if threads should be used.

Parameters
_useThreads  0 = no threads, 1 = use threads.
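
A sketch enabling threaded computation for a Layer instance called layer:

    layer.setUseThreads(1);   // 1 = run the neurons in separate threads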

The documentation for this class was generated from the following files: