ROOT 6.13/01 Reference Guide
Definition at line 48 of file TMultiLayerPerceptron.h.
Public Types

  enum EDataSet { kTraining, kTest }
  enum ELearningMethod { kStochastic, kBatch, kSteepestDescent, kRibierePolak, kFletcherReeves, kBFGS }
Public Member Functions

  TMultiLayerPerceptron ()
      Default constructor.
  TMultiLayerPerceptron (const char *layout, TTree *data=0, const char *training="Entry$%2==0", const char *test="", TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
      The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.
  TMultiLayerPerceptron (const char *layout, const char *weight, TTree *data=0, const char *training="Entry$%2==0", const char *test="", TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
      The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.
  TMultiLayerPerceptron (const char *layout, TTree *data, TEventList *training, TEventList *test, TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
      The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.
  TMultiLayerPerceptron (const char *layout, const char *weight, TTree *data, TEventList *training, TEventList *test, TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
      The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.
  virtual ~TMultiLayerPerceptron ()
      Destructor.
  void ComputeDEDw () const
      Compute DEDw: the sum over all training events of dedw for each weight, normalized by the number of events.
  virtual void Draw (Option_t *option="")
      Draws the network structure.
  void DrawResult (Int_t index=0, Option_t *option="test") const
      Draws the neural net output. It produces a histogram with the output for the two datasets.
  Bool_t DumpWeights (Option_t *filename="-") const
      Dumps the weights to a text file.
  Double_t Evaluate (Int_t index, Double_t *params) const
      Returns the Neural Net output for a given set of input parameters; the number of parameters must equal the number of input neurons.
  void Export (Option_t *filename="NNfunction", Option_t *language="C++") const
      Exports the NN as a function for any non-ROOT-dependent code. Supported languages are, so far, only C++, FORTRAN and Python. This feature is also useful if you want to plot the NN as a function (TF1 or TF2).
  Double_t GetDelta () const
  Double_t GetEpsilon () const
  Double_t GetError (Int_t event) const
      Error on the output for a given event.
  Double_t GetError (TMultiLayerPerceptron::EDataSet set) const
      Error on the whole dataset.
  Double_t GetEta () const
  Double_t GetEtaDecay () const
  TMultiLayerPerceptron::ELearningMethod GetLearningMethod () const
  Int_t GetReset () const
  TString GetStructure () const
  Double_t GetTau () const
  TNeuron::ENeuronType GetType () const
  Bool_t LoadWeights (Option_t *filename="")
      Loads the weights from a text file conforming to the format defined by DumpWeights.
  void Randomize () const
      Randomize the weights.
  Double_t Result (Int_t event, Int_t index=0) const
      Computes the output for a given event.
  void SetData (TTree *)
      Set the data source.
  void SetDelta (Double_t delta)
      Sets Delta, used in stochastic minimisation (see the constructor for the complete description of learning methods and parameters).
  void SetEpsilon (Double_t eps)
      Sets Epsilon, used in stochastic minimisation (see the constructor for the complete description of learning methods and parameters).
  void SetEta (Double_t eta)
      Sets Eta, used in stochastic minimisation (see the constructor for the complete description of learning methods and parameters).
  void SetEtaDecay (Double_t ed)
      Sets EtaDecay: Eta *= EtaDecay at each epoch (see the constructor for the complete description of learning methods and parameters).
  void SetEventWeight (const char *)
      Set the event weight.
  void SetLearningMethod (TMultiLayerPerceptron::ELearningMethod method)
      Sets the learning method.
  void SetReset (Int_t reset)
      Sets the number of epochs between two resets of the search direction to the steepest descent.
  void SetTau (Double_t tau)
      Sets Tau, used in line search (see the constructor for the complete description of learning methods and parameters).
  void SetTestDataSet (TEventList *test)
      Sets the Test dataset.
  void SetTestDataSet (const char *test)
      Sets the Test dataset.
  void SetTrainingDataSet (TEventList *train)
      Sets the Training dataset.
  void SetTrainingDataSet (const char *train)
      Sets the Training dataset.
  void Train (Int_t nEpoch, Option_t *option="text", Double_t minE=0)
      Train the network.
Protected Member Functions

  void AttachData ()
      Connects the TTree to Neurons in input and output layers.
  void BFGSDir (TMatrixD &, Double_t *)
      Computes the direction for the BFGS algorithm as the product between the Hessian estimate (bfgsh) and the dir.
  void BuildNetwork ()
      Instantiates the network from the description.
  void ConjugateGradientsDir (Double_t *, Double_t)
      Sets the search direction to the conjugate gradient direction; beta should be ||g_{(t+1)}||^2 / ||g_{(t)}||^2 (Fletcher-Reeves) or g_{(t+1)} (g_{(t+1)}-g_{(t)}) / ||g_{(t)}||^2 (Ribiere-Polak).
  Double_t DerivDir (Double_t *)
      Scalar product between gradient and direction = derivative along direction.
  bool GetBFGSH (TMatrixD &, TMatrixD &, TMatrixD &)
      Computes the Hessian matrix using the BFGS update algorithm.
  Double_t GetCrossEntropy () const
      Cross entropy error for a softmax output neuron, for a given event.
  Double_t GetCrossEntropyBinary () const
      Cross entropy error for sigmoid output neurons, for a given event.
  void GetEntry (Int_t) const
      Load an entry into the network.
  Double_t GetSumSquareError () const
      Error on the output for a given event.
  Bool_t LineSearch (Double_t *, Double_t *)
      Search along the line defined by direction.
  void MLP_Batch (Double_t *)
      One step for the batch (stochastic) method.
  void MLP_Stochastic (Double_t *)
      One step for the stochastic method; buffer should contain the previous dw vector and will be updated.
  void SetGammaDelta (TMatrixD &, TMatrixD &, Double_t *)
      Sets the gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}) vectors.
  void SteepestDir (Double_t *)
      Sets the search direction to steepest descent.
Private Member Functions

  TMultiLayerPerceptron (const TMultiLayerPerceptron &)
  void BuildFirstLayer (TString &)
      Instantiates the input neurons; inputs are normalised and the type is set to kOff (simple forward of the formula value).
  void BuildHiddenLayers (TString &)
      Builds hidden layers.
  void BuildLastLayer (TString &, Int_t)
      Builds the output layer; neurons are linear combinations of input, by default.
  void BuildOneHiddenLayer (const TString &sNumNodes, Int_t &layer, Int_t &prevStart, Int_t &prevStop, Bool_t lastLayer)
      Builds a hidden layer, updates the number of layers.
  void ExpandStructure ()
      Expand the structure of the first layer.
  void MLP_Line (Double_t *, Double_t *, Double_t)
      Sets the weights to a point along a line; weights are set to [origin + (dist * dir)].
  TMultiLayerPerceptron & operator= (const TMultiLayerPerceptron &)
  void Shuffle (Int_t *, Int_t) const
      Shuffle the Int_t index[n] in input.
Private Attributes

  Int_t fCurrentTree
      index of the current tree in a chain
  Double_t fCurrentTreeWeight
      weight of the current tree in a chain
  TTree * fData
      pointer to the tree used as datasource
  Double_t fDelta
      Delta - used in stochastic minimisation - Default=0.
  Double_t fEpsilon
      Epsilon - used in stochastic minimisation - Default=0.
  Double_t fEta
      Eta - used in stochastic minimisation - Default=0.1.
  Double_t fEtaDecay
      EtaDecay - Eta *= EtaDecay at each epoch - Default=1.
  TTreeFormula * fEventWeight
      formula representing the event weight
  TString fextD
  TString fextF
  TObjArray fFirstLayer
  Double_t fLastAlpha
      internal parameter used in line search
  TObjArray fLastLayer
  ELearningMethod fLearningMethod
      The Learning Method.
  TTreeFormulaManager * fManager
      TTreeFormulaManager for the weight and neurons
  TObjArray fNetwork
  TNeuron::ENeuronType fOutType
  Int_t fReset
      number of epochs between two resets of the search direction to the steepest descent - Default=50
  TString fStructure
  TObjArray fSynapses
  Double_t fTau
      Tau - used in line search - Default=3.
  TEventList * fTest
      EventList defining the events in the test dataset.
  Bool_t fTestOwner
      internal flag whether one has to delete fTest or not
  TEventList * fTraining
      EventList defining the events in the training dataset.
  Bool_t fTrainingOwner
      internal flag whether one has to delete fTraining or not
  TNeuron::ENeuronType fType
  TString fWeight
Friends

  class TMLPAnalyzer
#include <TMultiLayerPerceptron.h>
EDataSet

  Enumerator: kTraining, kTest

Definition at line 54 of file TMultiLayerPerceptron.h.
ELearningMethod

  Enumerator: kStochastic, kBatch, kSteepestDescent, kRibierePolak, kFletcherReeves, kBFGS

Definition at line 52 of file TMultiLayerPerceptron.h.
TMultiLayerPerceptron::TMultiLayerPerceptron ()
Default constructor.
Definition at line 252 of file TMultiLayerPerceptron.cxx.
TMultiLayerPerceptron::TMultiLayerPerceptron (const char *layout, TTree *data = 0, const char *training = "Entry$%2==0", const char *test = "", TNeuron::ENeuronType type = TNeuron::kSigmoid, const char *extF = "", const char *extD = "")

The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.

Hidden layers are just described by the number of neurons. The layers are separated by colons. Ex: "x,y:10:5:f". The output can be prepended by '@' if the variable has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only. Ex: "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are two cuts (see TTreeFormula) defining events to be used during the neural net training and testing. Example: "Entry$%2", "(Entry$+1)%2". Both the TTree and the cuts can be defined in the constructor, or later with the suited setter method.
Definition at line 420 of file TMultiLayerPerceptron.cxx.
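As an illustration, a minimal macro using this constructor might look like the following sketch; the file, tree and branch names (data.root, tree, x, y, type) are hypothetical, not part of this class.

```cpp
// Sketch: build a network with two normalized inputs (x, y), two hidden
// layers of 10 and 5 neurons, and one output branch "type".
// All file/tree/branch names are illustrative assumptions.
void buildMLP()
{
   TFile *input = TFile::Open("data.root");
   TTree *tree  = (TTree *)input->Get("tree");
   // Even entries are used for training, odd entries for testing.
   TMultiLayerPerceptron *mlp =
      new TMultiLayerPerceptron("@x,@y:10:5:type", tree,
                                "Entry$%2==0", "Entry$%2==1");
   mlp->Train(100, "text");
}
```

This requires a ROOT installation; the layout string follows the rules described above.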
TMultiLayerPerceptron::TMultiLayerPerceptron (const char *layout, const char *weight, TTree *data = 0, const char *training = "Entry$%2==0", const char *test = "", TNeuron::ENeuronType type = TNeuron::kSigmoid, const char *extF = "", const char *extD = "")

The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.

Hidden layers are just described by the number of neurons. The layers are separated by colons. Ex: "x,y:10:5:f". The output can be prepended by '@' if the variable has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only. Ex: "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are two cuts (see TTreeFormula) defining events to be used during the neural net training and testing. Example: "Entry$%2", "(Entry$+1)%2". Both the TTree and the cuts can be defined in the constructor, or later with the suited setter method.
Definition at line 486 of file TMultiLayerPerceptron.cxx.
TMultiLayerPerceptron::TMultiLayerPerceptron (const char *layout, TTree *data, TEventList *training, TEventList *test, TNeuron::ENeuronType type = TNeuron::kSigmoid, const char *extF = "", const char *extD = "")

The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.

Hidden layers are just described by the number of neurons. The layers are separated by colons. Ex: "x,y:10:5:f". The output can be prepended by '@' if the variable has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only. Ex: "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are the two TEventLists defining events to be used during the neural net training and testing. Both the TTree and the TEventLists can be defined in the constructor, or later with the suited setter method.
Definition at line 302 of file TMultiLayerPerceptron.cxx.
TMultiLayerPerceptron::TMultiLayerPerceptron (const char *layout, const char *weight, TTree *data, TEventList *training, TEventList *test, TNeuron::ENeuronType type = TNeuron::kSigmoid, const char *extF = "", const char *extD = "")

The network is described by a simple string: the input/output layers are defined by giving the branch names separated by commas.

Hidden layers are just described by the number of neurons. The layers are separated by colons. Ex: "x,y:10:5:f". The output can be prepended by '@' if the variable has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only. Ex: "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are the two TEventLists defining events to be used during the neural net training and testing. Both the TTree and the TEventLists can be defined in the constructor, or later with the suited setter method.
Definition at line 360 of file TMultiLayerPerceptron.cxx.
TMultiLayerPerceptron::~TMultiLayerPerceptron () [virtual]
Destructor.
Definition at line 537 of file TMultiLayerPerceptron.cxx.
TMultiLayerPerceptron::TMultiLayerPerceptron (const TMultiLayerPerceptron &) [private]
void TMultiLayerPerceptron::AttachData () [protected]
Connects the TTree to Neurons in input and output layers.
The formulas associated to each neuron are created and reported to the network formula manager. By default, the branch is not normalised since this would degrade performance for classification jobs. Normalisation can be requested by putting '@' in front of the formula.
Definition at line 1216 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::BFGSDir (TMatrixD &, Double_t *) [protected]
Computes the direction for the BFGS algorithm as the product between the Hessian estimate (bfgsh) and the dir.
Definition at line 2439 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::BuildFirstLayer (TString &) [private]

Instantiates the input neurons; inputs are normalised and the type is set to kOff (simple forward of the formula value).
Definition at line 1351 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::BuildHiddenLayers (TString &) [private]
Builds hidden layers.
Definition at line 1369 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::BuildLastLayer (TString &, Int_t) [private]

Builds the output layer. Neurons are linear combinations of input, by default.

If the structure ends with "!", neurons are set up for classification, i.e. with a sigmoid (1 neuron) or softmax (more neurons) activation function.
Definition at line 1433 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::BuildNetwork () [protected]

Instantiates the network from the description.
Definition at line 1320 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::BuildOneHiddenLayer (const TString &sNumNodes, Int_t &layer, Int_t &prevStart, Int_t &prevStop, Bool_t lastLayer) [private]
Builds a hidden layer, updates the number of layers.
Definition at line 1388 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::ComputeDEDw () const

Compute DEDw: the sum over all training events of dedw for each weight, normalized by the number of events.
Definition at line 1113 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::ConjugateGradientsDir (Double_t *, Double_t) [protected]

Sets the search direction to the conjugate gradient direction. beta should be ||g_{(t+1)}||^2 / ||g_{(t)}||^2 (Fletcher-Reeves) or g_{(t+1)} (g_{(t+1)}-g_{(t)}) / ||g_{(t)}||^2 (Ribiere-Polak).
Definition at line 2324 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::DerivDir (Double_t *) [protected]

Scalar product between gradient and direction = derivative along direction.
Definition at line 2415 of file TMultiLayerPerceptron.cxx.
virtual void TMultiLayerPerceptron::Draw (Option_t *option = "") [virtual]

Draws the network structure.

Neurons are depicted by a blue disk, and synapses by lines connecting neurons. The line width is proportional to the weight.
Definition at line 2469 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::DrawResult (Int_t index = 0, Option_t *option = "test") const
Draws the neural net output. It produces a histogram with the output for the two datasets.

Index is the number of the desired output neuron. "option" selects the dataset to draw ("test" by default).
Definition at line 1483 of file TMultiLayerPerceptron.cxx.
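For illustration, a trained network's response can be histogrammed like this (sketch; "mlp" stands for an already trained TMultiLayerPerceptron, a hypothetical name):

```cpp
// Histogram the first output neuron (index 0) on the test dataset
// (sketch; "mlp" is an already trained TMultiLayerPerceptron).
mlp->DrawResult(0, "test");
```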
Bool_t TMultiLayerPerceptron::DumpWeights (Option_t *filename = "-") const
Dumps the weights to a text file.
Set filename to "-" (default) to dump to the standard output
Definition at line 1557 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::Evaluate (Int_t index, Double_t *params) const

Returns the Neural Net output for a given set of input parameters; the number of parameters must equal the number of input neurons.
Definition at line 1663 of file TMultiLayerPerceptron.cxx.
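A typical use of Evaluate is to query the trained network at an arbitrary point of the input space (sketch; the two-input network, the values, and the name "mlp" are illustrative assumptions):

```cpp
// The array length must match the number of input neurons
// (sketch; "mlp" is an already trained two-input network).
Double_t params[2] = {0.5, -1.2};           // illustrative input values
Double_t nnOutput  = mlp->Evaluate(0, params);  // output neuron 0
```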
void TMultiLayerPerceptron::ExpandStructure () [private]
Expand the structure of the first layer.
Definition at line 1274 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::Export (Option_t *filename = "NNfunction", Option_t *language = "C++") const

Exports the NN as a function for any non-ROOT-dependent code. Supported languages are, so far, only C++, FORTRAN and Python. This feature is also useful if you want to plot the NN as a function (TF1 or TF2).
Definition at line 1688 of file TMultiLayerPerceptron.cxx.
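For instance, a trained network can be exported as standalone code (sketch; the base filename "NNfunction" matches the default, and "mlp" is a hypothetical trained instance):

```cpp
// Generate standalone source implementing the trained network
// (sketch; "mlp" is an already trained TMultiLayerPerceptron).
mlp->Export("NNfunction", "C++");     // C++ source, no ROOT dependency
mlp->Export("NNfunction", "Python");  // alternatively a Python module
```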
bool TMultiLayerPerceptron::GetBFGSH (TMatrixD &, TMatrixD &, TMatrixD &) [protected]

Computes the Hessian matrix using the BFGS update algorithm from gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}).

It returns true if such a direction could not be found (if gamma and delta are orthogonal).
Definition at line 2350 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetCrossEntropy () const [protected]
Cross entropy error for a softmax output neuron, for a given event.
Definition at line 1092 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetCrossEntropyBinary () const [protected]
Cross entropy error for sigmoid output neurons, for a given event.
Definition at line 1061 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetDelta () const [inline]
Definition at line 100 of file TMultiLayerPerceptron.h.
void TMultiLayerPerceptron::GetEntry (Int_t) const [protected]
Load an entry into the network.
Definition at line 709 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetEpsilon () const [inline]
Definition at line 99 of file TMultiLayerPerceptron.h.
Double_t TMultiLayerPerceptron::GetError (Int_t event) const
Error on the output for a given event.
Definition at line 996 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetError (TMultiLayerPerceptron::EDataSet set) const
Error on the whole dataset.
Definition at line 1025 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetEta () const [inline]
Definition at line 98 of file TMultiLayerPerceptron.h.
Double_t TMultiLayerPerceptron::GetEtaDecay () const [inline]
Definition at line 101 of file TMultiLayerPerceptron.h.
TMultiLayerPerceptron::ELearningMethod TMultiLayerPerceptron::GetLearningMethod () const [inline]
Definition at line 102 of file TMultiLayerPerceptron.h.
Int_t TMultiLayerPerceptron::GetReset () const [inline]
Definition at line 104 of file TMultiLayerPerceptron.h.
TString TMultiLayerPerceptron::GetStructure () const [inline]
Definition at line 105 of file TMultiLayerPerceptron.h.
Double_t TMultiLayerPerceptron::GetSumSquareError () const [protected]
Error on the output for a given event.
Definition at line 1048 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::GetTau () const [inline]
Definition at line 103 of file TMultiLayerPerceptron.h.
TNeuron::ENeuronType TMultiLayerPerceptron::GetType () const [inline]
Definition at line 106 of file TMultiLayerPerceptron.h.
Bool_t TMultiLayerPerceptron::LineSearch (Double_t *, Double_t *) [protected]
Search along the line defined by direction.
buffer is not used but is updated with the new dw so that it can be used by a later stochastic step. It returns true if the line search fails.
Definition at line 2221 of file TMultiLayerPerceptron.cxx.
Bool_t TMultiLayerPerceptron::LoadWeights (Option_t *filename = "")
Loads the weights from a text file conforming to the format defined by DumpWeights.
Definition at line 1607 of file TMultiLayerPerceptron.cxx.
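DumpWeights and LoadWeights can be combined to persist a trained state (sketch; the filename "weights.txt" and the name "mlp" are arbitrary, and the loading network must have been built with the same layout):

```cpp
// Save the trained weights to a text file, then restore them later
// into a network built with the same layout string (sketch).
mlp->DumpWeights("weights.txt");
// ... later, possibly in another session:
mlp->LoadWeights("weights.txt");
```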
void TMultiLayerPerceptron::MLP_Batch (Double_t *) [protected]
One step for the batch (stochastic) method.
DEDw should have been updated before calling this.
Definition at line 2150 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::MLP_Line (Double_t *, Double_t *, Double_t) [private]

Sets the weights to a point along a line. Weights are set to [origin + (dist * dir)].
Definition at line 2178 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::MLP_Stochastic (Double_t *) [protected]

One step for the stochastic method; buffer should contain the previous dw vector and will be updated.
Definition at line 2105 of file TMultiLayerPerceptron.cxx.
TMultiLayerPerceptron & TMultiLayerPerceptron::operator= (const TMultiLayerPerceptron &) [private]
void TMultiLayerPerceptron::Randomize () const
Randomize the weights.
Definition at line 1189 of file TMultiLayerPerceptron.cxx.
Double_t TMultiLayerPerceptron::Result (Int_t event, Int_t index = 0) const
Computes the output for a given event.
Look at the output neuron designated by index.
Definition at line 983 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetData (TTree *data)
Set the data source.
Definition at line 546 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetDelta (Double_t delta)

Sets Delta, used in stochastic minimisation (see the constructor for the complete description of learning methods and parameters).

Definition at line 670 of file TMultiLayerPerceptron.cxx.

void TMultiLayerPerceptron::SetEpsilon (Double_t eps)

Sets Epsilon, used in stochastic minimisation (see the constructor for the complete description of learning methods and parameters).

Definition at line 660 of file TMultiLayerPerceptron.cxx.

void TMultiLayerPerceptron::SetEta (Double_t eta)

Sets Eta, used in stochastic minimisation (see the constructor for the complete description of learning methods and parameters).

Definition at line 650 of file TMultiLayerPerceptron.cxx.

void TMultiLayerPerceptron::SetEtaDecay (Double_t ed)

Sets EtaDecay: Eta *= EtaDecay at each epoch (see the constructor for the complete description of learning methods and parameters).

Definition at line 680 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetEventWeight (const char *branch)
Set the event weight.
Definition at line 562 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetGammaDelta (TMatrixD &, TMatrixD &, Double_t *) [protected]

Sets the gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}) vectors. Gamma is computed here, so ComputeDEDw cannot have been called before, and delta is a direct translation of buffer into a TMatrixD.
Definition at line 2376 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetLearningMethod (TMultiLayerPerceptron::ELearningMethod method)
Sets the learning method.
Available methods are: kStochastic, kBatch, kSteepestDescent, kRibierePolak, kFletcherReeves and kBFGS. (look at the constructor for the complete description of learning methods and parameters)
Definition at line 640 of file TMultiLayerPerceptron.cxx.
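For illustration, choosing a method and tuning its parameters might look like this (sketch; "mlp" is a hypothetical, already constructed TMultiLayerPerceptron):

```cpp
// Select the stochastic (online) minimiser and tune its parameters
// (sketch; "mlp" is an already constructed TMultiLayerPerceptron).
mlp->SetLearningMethod(TMultiLayerPerceptron::kStochastic);
mlp->SetEta(0.1);        // learning rate (Default=0.1)
mlp->SetEtaDecay(0.99);  // Eta *= EtaDecay at each epoch
```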
void TMultiLayerPerceptron::SetReset (Int_t reset)

Sets the number of epochs between two resets of the search direction to the steepest descent (see the constructor for the complete description of learning methods and parameters).
Definition at line 701 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetTau (Double_t tau)

Sets Tau, used in line search (see the constructor for the complete description of learning methods and parameters).
Definition at line 690 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetTestDataSet (TEventList *test)

Sets the Test dataset.

Those events will not be used for the minimization but for control.
Definition at line 589 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetTestDataSet (const char *test)
Sets the Test dataset.
Those events will not be used for the minimization but for control. Note that the tree must be already defined.
Definition at line 619 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetTrainingDataSet (TEventList *train)

Sets the Training dataset.

Those events will be used for the minimization.
Definition at line 578 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SetTrainingDataSet (const char *train)
Sets the Training dataset.
Those events will be used for the minimization. Note that the tree must be already defined.
Definition at line 601 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::Shuffle (Int_t *, Int_t) const [private]

Shuffle the Int_t index[n] in input.

Input: index, the array to shuffle, and n, the size of the array. Output: index, the shuffled indexes. This method is used for stochastic training.
Definition at line 2086 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::SteepestDir (Double_t *) [protected]
Sets the search direction to steepest descent.
Definition at line 2200 of file TMultiLayerPerceptron.cxx.
void TMultiLayerPerceptron::Train (Int_t nEpoch, Option_t *option = "text", Double_t minE = 0)
Train the network.
nEpoch is the number of iterations. "option" controls the output produced during training (default "text").
Definition at line 738 of file TMultiLayerPerceptron.cxx.
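A training call combining several options might look like this (sketch; the option string follows the ROOT MLP tutorials, and "mlp" is a hypothetical constructed network):

```cpp
// Train for 200 epochs with a text report and an error-evolution graph
// refreshed every 10 epochs (sketch; "mlp" is a constructed network
// with its data source and datasets already set).
mlp->Train(200, "text,graph,update=10");
```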
friend class TMLPAnalyzer

Definition at line 49 of file TMultiLayerPerceptron.h.
Int_t TMultiLayerPerceptron::fCurrentTree [private]

Index of the current tree in a chain.

Definition at line 146 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fCurrentTreeWeight [private]

Weight of the current tree in a chain.

Definition at line 147 of file TMultiLayerPerceptron.h.

TTree * TMultiLayerPerceptron::fData [private]

Pointer to the tree used as datasource.

Definition at line 145 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fDelta [private]

Delta - used in stochastic minimisation - Default=0.

Definition at line 165 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fEpsilon [private]

Epsilon - used in stochastic minimisation - Default=0.

Definition at line 164 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fEta [private]

Eta - used in stochastic minimisation - Default=0.1.

Definition at line 163 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fEtaDecay [private]

EtaDecay - Eta *= EtaDecay at each epoch - Default=1.

Definition at line 166 of file TMultiLayerPerceptron.h.

TTreeFormula * TMultiLayerPerceptron::fEventWeight [private]

Formula representing the event weight.

Definition at line 161 of file TMultiLayerPerceptron.h.

TString TMultiLayerPerceptron::fextD [private]

Definition at line 157 of file TMultiLayerPerceptron.h.

TString TMultiLayerPerceptron::fextF [private]

Definition at line 156 of file TMultiLayerPerceptron.h.

TObjArray TMultiLayerPerceptron::fFirstLayer [private]

Definition at line 149 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fLastAlpha [private]

Internal parameter used in line search.

Definition at line 168 of file TMultiLayerPerceptron.h.

TObjArray TMultiLayerPerceptron::fLastLayer [private]

Definition at line 150 of file TMultiLayerPerceptron.h.

TMultiLayerPerceptron::ELearningMethod TMultiLayerPerceptron::fLearningMethod [private]

The Learning Method.

Definition at line 160 of file TMultiLayerPerceptron.h.

TTreeFormulaManager * TMultiLayerPerceptron::fManager [private]

TTreeFormulaManager for the weight and neurons.

Definition at line 162 of file TMultiLayerPerceptron.h.

TObjArray TMultiLayerPerceptron::fNetwork [private]

Definition at line 148 of file TMultiLayerPerceptron.h.

TNeuron::ENeuronType TMultiLayerPerceptron::fOutType [private]

Definition at line 155 of file TMultiLayerPerceptron.h.

Int_t TMultiLayerPerceptron::fReset [private]

Number of epochs between two resets of the search direction to the steepest descent - Default=50.

Definition at line 169 of file TMultiLayerPerceptron.h.

TString TMultiLayerPerceptron::fStructure [private]

Definition at line 152 of file TMultiLayerPerceptron.h.

TObjArray TMultiLayerPerceptron::fSynapses [private]

Definition at line 151 of file TMultiLayerPerceptron.h.

Double_t TMultiLayerPerceptron::fTau [private]

Tau - used in line search - Default=3.

Definition at line 167 of file TMultiLayerPerceptron.h.

TEventList * TMultiLayerPerceptron::fTest [private]

EventList defining the events in the test dataset.

Definition at line 159 of file TMultiLayerPerceptron.h.

Bool_t TMultiLayerPerceptron::fTestOwner [private]

Internal flag whether one has to delete fTest or not.

Definition at line 171 of file TMultiLayerPerceptron.h.

TEventList * TMultiLayerPerceptron::fTraining [private]

EventList defining the events in the training dataset.

Definition at line 158 of file TMultiLayerPerceptron.h.

Bool_t TMultiLayerPerceptron::fTrainingOwner [private]

Internal flag whether one has to delete fTraining or not.

Definition at line 170 of file TMultiLayerPerceptron.h.

TNeuron::ENeuronType TMultiLayerPerceptron::fType [private]

Definition at line 154 of file TMultiLayerPerceptron.h.

TString TMultiLayerPerceptron::fWeight [private]

Definition at line 153 of file TMultiLayerPerceptron.h.