Feedforward Closedloop Learning
1 #ifndef __FEEDFORWARD_CLOSEDLOOP_LEARNING_H_
2 #define __FEEDFORWARD_CLOSEDLOOP_LEARNING_H_
4 #include "fcl/globals.h"
6 #include "fcl/neuron.h"
7 #include "fcl/bandpass.h"
41 int* num_of_neurons_per_layer_array,
54 void doStep(double* input, double* error);
58 void doStep(double* input, int n1, double* error, int n2);
65 return layers[num_layers-1]->getOutput(index);
77 learningRateDiscountFactor = _learningRateDiscountFactor;
157 double learningRateDiscountFactor = 1;
void setDecay(double decay)
Sets a weight decay which is scaled with the learning rate.
double getOutput(int index)
Gets the output of one neuron.
Definition: layer.h:278
Layer ** getLayers()
Returns all Layers.
Definition: fcl.h:138
void setMomentum(double momentum)
Sets the global momentum for all layers.
int getNumInputs()
Gets the number of inputs.
Definition: fcl.h:133
void setBias(double _bias)
Sets the bias globally.
Layer which contains the neurons of one layer.
Definition: layer.h:169
WeightInitMethod
Constants defining how to initialise the weights in the neuron.
Definition: neuron.h:71
void doStep(double *input, double *error)
Performs one simulation step with the given input and error arrays.
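A minimal closed-loop sketch of how doStep might be driven (an illustration, not taken from the library's examples: the include path "fcl.h", the topology, the learning rate and the assumption that the error array holds one entry per first-layer neuron are placeholders to check against your setup):

    #include "fcl.h"   // assumed include path

    int main() {
        int neurons[] = {10, 1};                            // one hidden layer of 10 neurons, 1 output neuron
        FeedforwardClosedloopLearning fcl(2, neurons, 2);   // 2 inputs, 2 layers
        fcl.initWeights(0.001, 1, Neuron::MAX_OUTPUT_RANDOM);
        fcl.setLearningRate(0.0001);                        // placeholder value

        double input[2] = {0, 0};   // sensor readings, filled in by the closed loop
        double error[10] = {0};     // error signal (length assumed: one entry per first-layer neuron)

        for (int t = 0; t < 1000; t++) {
            // obtain input[] and error[] from the environment / reflex here
            fcl.doStep(input, error);        // one simulation and learning step
            double out = fcl.getOutput(0);   // output of the single output neuron
            (void)out;                       // would drive the actuator in a real loop
        }
        return 0;
    }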
Layer * getLayer(int i)
Gets a pointer to a layer.
Definition: fcl.h:123
double getOutput(int index)
Gets the output from one of the output neurons.
Definition: fcl.h:64
bool loadModel(const char *name)
Loads the whole network.
Main class of Feedforward Closed Loop Learning.
Definition: fcl.h:30
void seedRandom(int s)
Seeds the random number generator.
Definition: fcl.h:107
bool saveModel(const char *name)
Saves the whole network.
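A save/restore sketch using saveModel and loadModel (the filename and topology are illustrative; the same "fcl.h" include-path assumption as above applies):

    #include "fcl.h"   // assumed include path

    int main() {
        int neurons[] = {10, 1};
        FeedforwardClosedloopLearning fcl(2, neurons, 2);
        fcl.initWeights();                                    // defaults from the header
        // ... training with doStep() would happen here ...
        bool ok = fcl.saveModel("fcl_weights.dat");           // hypothetical filename

        FeedforwardClosedloopLearning restored(2, neurons, 2);   // same topology as the saved net
        ok = ok && restored.loadModel("fcl_weights.dat");        // restore the saved weights
        return ok ? 0 : 1;
    }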
FeedforwardClosedloopLearning(int num_of_inputs, int *num_of_neurons_per_layer_array, int _num_layers)
Constructor: FCL without any filters.
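Constructor usage sketch: the first argument is the number of inputs, the array holds the number of neurons in each layer (the last layer being the output layer, cf. getOutputLayer), and _num_layers is taken here to be the length of that array:

    int neuronsPerLayer[] = {12, 6, 2};   // two hidden layers and an output layer of 2 neurons
    FeedforwardClosedloopLearning net(
        4,                 // number of inputs
        neuronsPerLayer,   // neurons in each layer
        3);                // number of layers, i.e. the length of neuronsPerLayer

After construction, net.getNumLayers() returns 3 and net.getNumInputs() returns 4.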
void setLearningRate(double learningRate)
Sets the learning rate globally for all layers.
void initWeights(double max=0.001, int initBias=1, Neuron::WeightInitMethod weightInitMethod=Neuron::MAX_OUTPUT_RANDOM)
Initialises the weights in all layers.
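Initialisation sketch, continuing the 'net' instance from the constructor example above; the seed is arbitrary and the arguments are simply the documented defaults written out explicitly:

    net.seedRandom(42);                                     // optional: make the random init reproducible
    net.initWeights(0.001, 1, Neuron::MAX_OUTPUT_RANDOM);   // max weight 0.001, init the bias, default method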
ActivationFunction
Activation functions on offer. LINEAR: linear unit, TANH: hyperbolic tangent, RELU: rectified linear unit...
Definition: neuron.h:86
~FeedforwardClosedloopLearning()
Destructor. De-allocates any memory.
Layer * getOutputLayer()
Gets the output layer.
Definition: fcl.h:128
void setActivationFunction(Neuron::ActivationFunction _activationFunction)
Sets the activation function of the Neuron.
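For example, switching to the hyperbolic tangent listed under Neuron::ActivationFunction (usage sketch on the 'net' instance from above):

    net.setActivationFunction(Neuron::TANH);   // TANH from Neuron::ActivationFunction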
int getNumLayers()
Gets the total number of layers.
Definition: fcl.h:117
void setLearningRateDiscountFactor(double _learningRateDiscountFactor)
Sets how the learning rate increases or decreases from layer to layer.
Definition: fcl.h:76
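A sketch combining the global learning rate with the per-layer discount factor; that each deeper layer's rate is the previous layer's rate multiplied by this factor is an assumption from the name, the exact scaling is defined at fcl.h:76:

    net.setLearningRate(0.001);               // base learning rate for all layers
    net.setLearningRateDiscountFactor(0.5);   // assumed: every subsequent layer learns at half the previous rate
    // the default of 1 (fcl.h:157) keeps the learning rate identical across layers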