ROOT 6.13/01 Reference Guide
TMultiLayerPerceptron Class Reference

Definition at line 48 of file TMultiLayerPerceptron.h.

Public Types

enum  EDataSet { kTraining, kTest }
 
enum  ELearningMethod {
  kStochastic, kBatch, kSteepestDescent, kRibierePolak,
  kFletcherReeves, kBFGS
}
 

Public Member Functions

 TMultiLayerPerceptron ()
 Default constructor. More...
 
 TMultiLayerPerceptron (const char *layout, TTree *data=0, const char *training="Entry$%2==0", const char *test="", TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
 The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas. More...
 
 TMultiLayerPerceptron (const char *layout, const char *weight, TTree *data=0, const char *training="Entry$%2==0", const char *test="", TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
 The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas. More...
 
 TMultiLayerPerceptron (const char *layout, TTree *data, TEventList *training, TEventList *test, TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
 The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas. More...
 
 TMultiLayerPerceptron (const char *layout, const char *weight, TTree *data, TEventList *training, TEventList *test, TNeuron::ENeuronType type=TNeuron::kSigmoid, const char *extF="", const char *extD="")
 The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas. More...
 
virtual ~TMultiLayerPerceptron ()
 Destructor. More...
 
void ComputeDEDw () const
 Computes DEDw, the sum over all training events of dedw for each weight, normalized by the number of events. More...
 
virtual void Draw (Option_t *option="")
 Draws the network structure. More...
 
void DrawResult (Int_t index=0, Option_t *option="test") const
 Draws the neural net output. It produces a histogram with the output for the two datasets. More...
 
Bool_t DumpWeights (Option_t *filename="-") const
 Dumps the weights to a text file. More...
 
Double_t Evaluate (Int_t index, Double_t *params) const
 Returns the Neural Net output for a given set of input parameters; the number of parameters must equal the number of input neurons. More...
 
void Export (Option_t *filename="NNfunction", Option_t *language="C++") const
 Exports the NN as a function usable in any non-ROOT-dependent code. Supported languages are (currently) C++, FORTRAN and Python. This feature is also useful if you want to plot the NN as a function (TF1 or TF2). More...
 
Double_t GetDelta () const
 
Double_t GetEpsilon () const
 
Double_t GetError (Int_t event) const
 Error on the output for a given event. More...
 
Double_t GetError (TMultiLayerPerceptron::EDataSet set) const
 Error on the whole dataset. More...
 
Double_t GetEta () const
 
Double_t GetEtaDecay () const
 
TMultiLayerPerceptron::ELearningMethod GetLearningMethod () const
 
Int_t GetReset () const
 
TString GetStructure () const
 
Double_t GetTau () const
 
TNeuron::ENeuronType GetType () const
 
Bool_t LoadWeights (Option_t *filename="")
 Loads the weights from a text file conforming to the format defined by DumpWeights. More...
 
void Randomize () const
 Randomize the weights. More...
 
Double_t Result (Int_t event, Int_t index=0) const
 Computes the output for a given event. More...
 
void SetData (TTree *)
 Set the data source. More...
 
void SetDelta (Double_t delta)
 Sets Delta - used in stochastic minimisation (look at the constructor for the complete description of learning methods and parameters) More...
 
void SetEpsilon (Double_t eps)
 Sets Epsilon - used in stochastic minimisation (look at the constructor for the complete description of learning methods and parameters) More...
 
void SetEta (Double_t eta)
 Sets Eta - used in stochastic minimisation (look at the constructor for the complete description of learning methods and parameters) More...
 
void SetEtaDecay (Double_t ed)
 Sets EtaDecay - Eta *= EtaDecay at each epoch (look at the constructor for the complete description of learning methods and parameters) More...
 
void SetEventWeight (const char *)
 Set the event weight. More...
 
void SetLearningMethod (TMultiLayerPerceptron::ELearningMethod method)
 Sets the learning method. More...
 
void SetReset (Int_t reset)
 Sets number of epochs between two resets of the search direction to the steepest descent. More...
 
void SetTau (Double_t tau)
 Sets Tau - used in line search (look at the constructor for the complete description of learning methods and parameters) More...
 
void SetTestDataSet (TEventList *test)
 Sets the Test dataset. More...
 
void SetTestDataSet (const char *test)
 Sets the Test dataset. More...
 
void SetTrainingDataSet (TEventList *train)
 Sets the Training dataset. More...
 
void SetTrainingDataSet (const char *train)
 Sets the Training dataset. More...
 
void Train (Int_t nEpoch, Option_t *option="text", Double_t minE=0)
 Train the network. More...
 

Protected Member Functions

void AttachData ()
 Connects the TTree to Neurons in input and output layers. More...
 
void BFGSDir (TMatrixD &, Double_t *)
 Computes the direction for the BFGS algorithm as the product between the Hessian estimate (bfgsh) and the dir. More...
 
void BuildNetwork ()
 Instantiates the network from the description. More...
 
void ConjugateGradientsDir (Double_t *, Double_t)
 Sets the search direction to the conjugate gradient direction. beta should be ||g_{(t+1)}||^2 / ||g_{(t)}||^2 (Fletcher-Reeves) or g_{(t+1)}.(g_{(t+1)}-g_{(t)}) / ||g_{(t)}||^2 (Ribiere-Polak). More...
 
Double_t DerivDir (Double_t *)
 Scalar product between gradient and direction, i.e. the derivative along the direction. More...
 
bool GetBFGSH (TMatrixD &, TMatrixD &, TMatrixD &)
 Computes the hessian matrix using the BFGS update algorithm. More...
 
Double_t GetCrossEntropy () const
 Cross entropy error for a softmax output neuron, for a given event. More...
 
Double_t GetCrossEntropyBinary () const
 Cross entropy error for sigmoid output neurons, for a given event. More...
 
void GetEntry (Int_t) const
 Load an entry into the network. More...
 
Double_t GetSumSquareError () const
 Error on the output for a given event. More...
 
Bool_t LineSearch (Double_t *, Double_t *)
 Search along the line defined by direction. More...
 
void MLP_Batch (Double_t *)
 One step for the batch method. More...
 
void MLP_Stochastic (Double_t *)
 One step for the stochastic method. The buffer should contain the previous dw vector and will be updated. More...
 
void SetGammaDelta (TMatrixD &, TMatrixD &, Double_t *)
 Sets the gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}) vectors. Gamma is computed here, so ComputeDEDw cannot have been called before; delta is a direct translation of buffer into a TMatrixD. More...
 
void SteepestDir (Double_t *)
 Sets the search direction to steepest descent. More...
 

Private Member Functions

 TMultiLayerPerceptron (const TMultiLayerPerceptron &)
 
void BuildFirstLayer (TString &)
 Instantiates the input-layer neurons. Inputs are normalised and the type is set to kOff (simple forwarding of the formula value). More...
 
void BuildHiddenLayers (TString &)
 Builds hidden layers. More...
 
void BuildLastLayer (TString &, Int_t)
 Builds the output layer. Neurons are linear combinations of their inputs, by default. More...
 
void BuildOneHiddenLayer (const TString &sNumNodes, Int_t &layer, Int_t &prevStart, Int_t &prevStop, Bool_t lastLayer)
 Builds a hidden layer, updates the number of layers. More...
 
void ExpandStructure ()
 Expand the structure of the first layer. More...
 
void MLP_Line (Double_t *, Double_t *, Double_t)
 Sets the weights to a point along a line. Weights are set to origin + (dist * dir). More...
 
TMultiLayerPerceptron & operator= (const TMultiLayerPerceptron &)
 
void Shuffle (Int_t *, Int_t) const
 Shuffle the Int_t index[n] in input. More...
 

Private Attributes

Int_t fCurrentTree
 index of the current tree in a chain More...
 
Double_t fCurrentTreeWeight
 weight of the current tree in a chain More...
 
TTree * fData
 pointer to the tree used as datasource More...
 
Double_t fDelta
 Delta - used in stochastic minimisation - Default=0. More...
 
Double_t fEpsilon
 Epsilon - used in stochastic minimisation - Default=0. More...
 
Double_t fEta
 Eta - used in stochastic minimisation - Default=0.1. More...
 
Double_t fEtaDecay
 EtaDecay - Eta *= EtaDecay at each epoch - Default=1. More...
 
TTreeFormula * fEventWeight
 formula representing the event weight More...
 
TString fextD
 
TString fextF
 
TObjArray fFirstLayer
 
Double_t fLastAlpha
 internal parameter used in line search More...
 
TObjArray fLastLayer
 
ELearningMethod fLearningMethod
 The Learning Method. More...
 
TTreeFormulaManager * fManager
 TTreeFormulaManager for the weight and neurons. More...
 
TObjArray fNetwork
 
TNeuron::ENeuronType fOutType
 
Int_t fReset
 number of epochs between two resets of the search direction to the steepest descent - Default=50 More...
 
TString fStructure
 
TObjArray fSynapses
 
Double_t fTau
 Tau - used in line search - Default=3. More...
 
TEventList * fTest
 EventList defining the events in the test dataset. More...
 
Bool_t fTestOwner
 internal flag whether one has to delete fTest or not More...
 
TEventList * fTraining
 
Bool_t fTrainingOwner
 internal flag whether one has to delete fTraining or not More...
 
TNeuron::ENeuronType fType
 
TString fWeight
 

Friends

class TMLPAnalyzer
 

#include <TMultiLayerPerceptron.h>

Inheritance diagram for TMultiLayerPerceptron:

Member Enumeration Documentation

◆ EDataSet

Enumerator
kTraining 
kTest 

Definition at line 54 of file TMultiLayerPerceptron.h.

◆ ELearningMethod

Enumerator
kStochastic 
kBatch 
kSteepestDescent 
kRibierePolak 
kFletcherReeves 
kBFGS 

Definition at line 52 of file TMultiLayerPerceptron.h.

Constructor & Destructor Documentation

◆ TMultiLayerPerceptron() [1/6]

TMultiLayerPerceptron::TMultiLayerPerceptron ( )

Default constructor.

Definition at line 252 of file TMultiLayerPerceptron.cxx.

◆ TMultiLayerPerceptron() [2/6]

TMultiLayerPerceptron::TMultiLayerPerceptron ( const char *  layout,
TTree *  data = 0,
const char *  training = "Entry$%2==0",
const char *  test = "",
TNeuron::ENeuronType  type = TNeuron::kSigmoid,
const char *  extF = "",
const char *  extD = "" 
)

The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas.

Hidden layers are described by the number of neurons only. The layers are separated by colons, e.g. "x,y:10:5:f". A variable can be prepended by '@' if it has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only, e.g. "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are two cuts (see TTreeFormula) defining the events to be used during neural net training and testing, for example "Entry$%2" and "(Entry$+1)%2". Both the TTree and the cuts can be defined in the constructor, or later with the suited setter methods.

Definition at line 420 of file TMultiLayerPerceptron.cxx.
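
As an illustration of the layout convention described above, here is a small standalone helper (hypothetical, not part of ROOT; it ignores the '@' and '!' modifiers) that counts the neurons per layer implied by a layout string:

```cpp
#include <algorithm>
#include <sstream>
#include <string>
#include <vector>

// Counts the neurons per layer implied by a layout string such as
// "x,y:10:5:f". Layers are separated by ':'; the first and last layers
// list branch names separated by ','; hidden layers are plain counts.
std::vector<int> CountNeurons(const std::string &layout)
{
   std::vector<int> counts;
   std::stringstream layers(layout);
   std::string layer;
   while (std::getline(layers, layer, ':')) {
      // A hidden layer is just a number; input/output layers are name lists.
      bool isNumber = !layer.empty() &&
                      layer.find_first_not_of("0123456789") == std::string::npos;
      if (isNumber)
         counts.push_back(std::stoi(layer));
      else
         // Count comma-separated entries ("x,y" -> 2).
         counts.push_back(1 + (int)std::count(layer.begin(), layer.end(), ','));
   }
   return counts;
}
```

For "x,y:10:5:f" this yields layers of 2, 10, 5 and 1 neurons, matching the example in the text.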

◆ TMultiLayerPerceptron() [3/6]

TMultiLayerPerceptron::TMultiLayerPerceptron ( const char *  layout,
const char *  weight,
TTree *  data = 0,
const char *  training = "Entry$%2==0",
const char *  test = "",
TNeuron::ENeuronType  type = TNeuron::kSigmoid,
const char *  extF = "",
const char *  extD = "" 
)

The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas.

Hidden layers are described by the number of neurons only. The layers are separated by colons, e.g. "x,y:10:5:f". A variable can be prepended by '@' if it has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only, e.g. "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are two cuts (see TTreeFormula) defining the events to be used during neural net training and testing, for example "Entry$%2" and "(Entry$+1)%2". Both the TTree and the cuts can be defined in the constructor, or later with the suited setter methods.

Definition at line 486 of file TMultiLayerPerceptron.cxx.

◆ TMultiLayerPerceptron() [4/6]

TMultiLayerPerceptron::TMultiLayerPerceptron ( const char *  layout,
TTree *  data,
TEventList *  training,
TEventList *  test,
TNeuron::ENeuronType  type = TNeuron::kSigmoid,
const char *  extF = "",
const char *  extD = "" 
)

The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas.

Hidden layers are described by the number of neurons only. The layers are separated by colons, e.g. "x,y:10:5:f". A variable can be prepended by '@' if it has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only, e.g. "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are the two TEventLists defining the events to be used during neural net training and testing. Both the TTree and the TEventLists can be defined in the constructor, or later with the suited setter methods.

Definition at line 302 of file TMultiLayerPerceptron.cxx.

◆ TMultiLayerPerceptron() [5/6]

TMultiLayerPerceptron::TMultiLayerPerceptron ( const char *  layout,
const char *  weight,
TTree *  data,
TEventList *  training,
TEventList *  test,
TNeuron::ENeuronType  type = TNeuron::kSigmoid,
const char *  extF = "",
const char *  extD = "" 
)

The network is described by a simple string: the input/output layers are defined by giving the branch names, separated by commas.

Hidden layers are described by the number of neurons only. The layers are separated by colons, e.g. "x,y:10:5:f". A variable can be prepended by '@' if it has to be normalized. The output can be followed by '!' to use Softmax neurons for the output layer only, e.g. "x,y:10:5:c1,c2,c3!". Inputs and outputs are taken from the TTree given as second argument. training and test are the two TEventLists defining the events to be used during neural net training and testing. Both the TTree and the TEventLists can be defined in the constructor, or later with the suited setter methods.

Definition at line 360 of file TMultiLayerPerceptron.cxx.

◆ ~TMultiLayerPerceptron()

TMultiLayerPerceptron::~TMultiLayerPerceptron ( )
virtual

Destructor.

Definition at line 537 of file TMultiLayerPerceptron.cxx.

◆ TMultiLayerPerceptron() [6/6]

TMultiLayerPerceptron::TMultiLayerPerceptron ( const TMultiLayerPerceptron & )
private

Member Function Documentation

◆ AttachData()

void TMultiLayerPerceptron::AttachData ( )
protected

Connects the TTree to Neurons in input and output layers.

The formulas associated to each neuron are created and reported to the network formula manager. By default, the branch is not normalised since this would degrade performance for classification jobs. Normalisation can be requested by putting '@' in front of the formula.

Definition at line 1216 of file TMultiLayerPerceptron.cxx.

◆ BFGSDir()

void TMultiLayerPerceptron::BFGSDir ( TMatrixD &  bfgsh,
Double_t *  dir 
)
protected

Computes the direction for the BFGS algorithm as the product between the Hessian estimate (bfgsh) and the direction (dir).

Definition at line 2439 of file TMultiLayerPerceptron.cxx.

◆ BuildFirstLayer()

void TMultiLayerPerceptron::BuildFirstLayer ( TString &  input)
private

Instantiates the input-layer neurons. Inputs are normalised and the type is set to kOff (simple forwarding of the formula value).

Definition at line 1351 of file TMultiLayerPerceptron.cxx.

◆ BuildHiddenLayers()

void TMultiLayerPerceptron::BuildHiddenLayers ( TString &  hidden)
private

Builds hidden layers.

Definition at line 1369 of file TMultiLayerPerceptron.cxx.

◆ BuildLastLayer()

void TMultiLayerPerceptron::BuildLastLayer ( TString &  output,
Int_t  prev 
)
private

Builds the output layer. Neurons are linear combinations of their inputs, by default.

If the structure ends with "!", neurons are set up for classification, i.e. with a sigmoid (1 neuron) or softmax (more neurons) activation function.

Definition at line 1433 of file TMultiLayerPerceptron.cxx.

◆ BuildNetwork()

void TMultiLayerPerceptron::BuildNetwork ( )
protected

Instantiates the network from the description.

Definition at line 1320 of file TMultiLayerPerceptron.cxx.

◆ BuildOneHiddenLayer()

void TMultiLayerPerceptron::BuildOneHiddenLayer ( const TString &  sNumNodes,
Int_t &  layer,
Int_t &  prevStart,
Int_t &  prevStop,
Bool_t  lastLayer 
)
private

Builds a hidden layer, updates the number of layers.

Definition at line 1388 of file TMultiLayerPerceptron.cxx.

◆ ComputeDEDw()

void TMultiLayerPerceptron::ComputeDEDw ( ) const

Computes DEDw, the sum over all training events of dedw for each weight, normalized by the number of events.

Definition at line 1113 of file TMultiLayerPerceptron.cxx.

◆ ConjugateGradientsDir()

void TMultiLayerPerceptron::ConjugateGradientsDir ( Double_t *  dir,
Double_t  beta 
)
protected

Sets the search direction to the conjugate gradient direction. beta should be ||g_{(t+1)}||^2 / ||g_{(t)}||^2 (Fletcher-Reeves) or g_{(t+1)}.(g_{(t+1)}-g_{(t)}) / ||g_{(t)}||^2 (Ribiere-Polak).

Definition at line 2324 of file TMultiLayerPerceptron.cxx.
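
The two beta choices above can be sketched in plain C++. This is an illustrative sketch of the formulas only, not the ROOT internals; Dot, BetaFletcherReeves and BetaRibierePolak are names invented here.

```cpp
#include <cstddef>
#include <cmath>

// Plain scalar product of two length-n vectors.
double Dot(const double *a, const double *b, std::size_t n)
{
   double s = 0;
   for (std::size_t i = 0; i < n; ++i) s += a[i] * b[i];
   return s;
}

// Fletcher-Reeves: beta = ||g1||^2 / ||g0||^2
double BetaFletcherReeves(const double *g0, const double *g1, std::size_t n)
{
   return Dot(g1, g1, n) / Dot(g0, g0, n);
}

// Ribiere-Polak: beta = g1.(g1 - g0) / ||g0||^2
double BetaRibierePolak(const double *g0, const double *g1, std::size_t n)
{
   double num = 0;
   for (std::size_t i = 0; i < n; ++i) num += g1[i] * (g1[i] - g0[i]);
   return num / Dot(g0, g0, n);
}
```

Both reduce to the same value when successive gradients are orthogonal; they differ once the gradients overlap, which is why the Ribiere-Polak variant can behave better on non-quadratic error surfaces.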

◆ DerivDir()

Double_t TMultiLayerPerceptron::DerivDir ( Double_t *  dir)
protected

Scalar product between gradient and direction, i.e. the derivative along the direction.

Definition at line 2415 of file TMultiLayerPerceptron.cxx.

◆ Draw()

void TMultiLayerPerceptron::Draw ( Option_t *  option = "")
virtual

Draws the network structure.

Neurons are depicted by a blue disk, and synapses by lines connecting neurons. The line width is proportional to the weight.

Definition at line 2469 of file TMultiLayerPerceptron.cxx.

◆ DrawResult()

void TMultiLayerPerceptron::DrawResult ( Int_t  index = 0,
Option_t *  option = "test" 
) const

Draws the neural net output. It produces a histogram with the output for the two datasets.

Index is the number of the desired output neuron. "option" can contain:

  • test or train to select a dataset
  • comp to produce a X-Y comparison plot
  • nocanv to not create a new TCanvas for the plot

Definition at line 1483 of file TMultiLayerPerceptron.cxx.

◆ DumpWeights()

Bool_t TMultiLayerPerceptron::DumpWeights ( Option_t *  filename = "-") const

Dumps the weights to a text file.

Set filename to "-" (default) to dump to the standard output

Definition at line 1557 of file TMultiLayerPerceptron.cxx.

◆ Evaluate()

Double_t TMultiLayerPerceptron::Evaluate ( Int_t  index,
Double_t *  params 
) const

Returns the Neural Net output for a given set of input parameters; the number of parameters must equal the number of input neurons.

Definition at line 1663 of file TMultiLayerPerceptron.cxx.

◆ ExpandStructure()

void TMultiLayerPerceptron::ExpandStructure ( )
private

Expand the structure of the first layer.

Definition at line 1274 of file TMultiLayerPerceptron.cxx.

◆ Export()

void TMultiLayerPerceptron::Export ( Option_t *  filename = "NNfunction",
Option_t *  language = "C++" 
) const

Exports the NN as a function usable in any non-ROOT-dependent code. Supported languages are (currently) C++, FORTRAN and Python. This feature is also useful if you want to plot the NN as a function (TF1 or TF2).

Definition at line 1688 of file TMultiLayerPerceptron.cxx.

◆ GetBFGSH()

bool TMultiLayerPerceptron::GetBFGSH ( TMatrixD &  bfgsh,
TMatrixD &  gamma,
TMatrixD &  delta 
)
protected

Computes the Hessian matrix using the BFGS update algorithm, from gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}). It returns true if such a direction could not be found (i.e. if gamma and delta are orthogonal).

Definition at line 2350 of file TMultiLayerPerceptron.cxx.

◆ GetCrossEntropy()

Double_t TMultiLayerPerceptron::GetCrossEntropy ( ) const
protected

Cross entropy error for a softmax output neuron, for a given event.

Definition at line 1092 of file TMultiLayerPerceptron.cxx.

◆ GetCrossEntropyBinary()

Double_t TMultiLayerPerceptron::GetCrossEntropyBinary ( ) const
protected

Cross entropy error for sigmoid output neurons, for a given event.

Definition at line 1061 of file TMultiLayerPerceptron.cxx.
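
A minimal sketch of the formula involved, assuming the standard definition of binary cross-entropy summed over the output neurons for one event; the function name and signature are invented here, not taken from the ROOT sources.

```cpp
#include <cmath>
#include <cstddef>

// E = - sum_i [ t_i*ln(o_i) + (1 - t_i)*ln(1 - o_i) ]
// output: sigmoid outputs in (0,1); target: desired values in [0,1].
double CrossEntropyBinary(const double *output, const double *target, std::size_t n)
{
   double e = 0;
   for (std::size_t i = 0; i < n; ++i)
      e -= target[i] * std::log(output[i]) +
           (1 - target[i]) * std::log(1 - output[i]);
   return e;
}
```

The softmax variant used by GetCrossEntropy() drops the (1 - t) term and sums -t_i ln(o_i) only, since the softmax outputs already form a probability distribution.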

◆ GetDelta()

Double_t TMultiLayerPerceptron::GetDelta ( ) const
inline

Definition at line 100 of file TMultiLayerPerceptron.h.

◆ GetEntry()

void TMultiLayerPerceptron::GetEntry ( Int_t  entry) const
protected

Load an entry into the network.

Definition at line 709 of file TMultiLayerPerceptron.cxx.

◆ GetEpsilon()

Double_t TMultiLayerPerceptron::GetEpsilon ( ) const
inline

Definition at line 99 of file TMultiLayerPerceptron.h.

◆ GetError() [1/2]

Double_t TMultiLayerPerceptron::GetError ( Int_t  event) const

Error on the output for a given event.

Definition at line 996 of file TMultiLayerPerceptron.cxx.

◆ GetError() [2/2]

Double_t TMultiLayerPerceptron::GetError ( TMultiLayerPerceptron::EDataSet  set) const

Error on the whole dataset.

Definition at line 1025 of file TMultiLayerPerceptron.cxx.

◆ GetEta()

Double_t TMultiLayerPerceptron::GetEta ( ) const
inline

Definition at line 98 of file TMultiLayerPerceptron.h.

◆ GetEtaDecay()

Double_t TMultiLayerPerceptron::GetEtaDecay ( ) const
inline

Definition at line 101 of file TMultiLayerPerceptron.h.

◆ GetLearningMethod()

TMultiLayerPerceptron::ELearningMethod TMultiLayerPerceptron::GetLearningMethod ( ) const
inline

Definition at line 102 of file TMultiLayerPerceptron.h.

◆ GetReset()

Int_t TMultiLayerPerceptron::GetReset ( ) const
inline

Definition at line 104 of file TMultiLayerPerceptron.h.

◆ GetStructure()

TString TMultiLayerPerceptron::GetStructure ( ) const
inline

Definition at line 105 of file TMultiLayerPerceptron.h.

◆ GetSumSquareError()

Double_t TMultiLayerPerceptron::GetSumSquareError ( ) const
protected

Error on the output for a given event.

Definition at line 1048 of file TMultiLayerPerceptron.cxx.

◆ GetTau()

Double_t TMultiLayerPerceptron::GetTau ( ) const
inline

Definition at line 103 of file TMultiLayerPerceptron.h.

◆ GetType()

TNeuron::ENeuronType TMultiLayerPerceptron::GetType ( ) const
inline

Definition at line 106 of file TMultiLayerPerceptron.h.

◆ LineSearch()

bool TMultiLayerPerceptron::LineSearch ( Double_t *  direction,
Double_t *  buffer 
)
protected

Search along the line defined by direction.

buffer is not used but is updated with the new dw so that it can be used by a later stochastic step. It returns true if the line search fails.

Definition at line 2221 of file TMultiLayerPerceptron.cxx.

◆ LoadWeights()

Bool_t TMultiLayerPerceptron::LoadWeights ( Option_t *  filename = "")

Loads the weights from a text file conforming to the format defined by DumpWeights.

Definition at line 1607 of file TMultiLayerPerceptron.cxx.

◆ MLP_Batch()

void TMultiLayerPerceptron::MLP_Batch ( Double_t *  buffer)
protected

One step for the batch method.

DEDw should have been updated before calling this.

Definition at line 2150 of file TMultiLayerPerceptron.cxx.

◆ MLP_Line()

void TMultiLayerPerceptron::MLP_Line ( Double_t *  origin,
Double_t *  dir,
Double_t  dist 
)
private

Sets the weights to a point along a line. Weights are set to origin + (dist * dir).

Definition at line 2178 of file TMultiLayerPerceptron.cxx.
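
The update described above amounts to one line per weight. This is a sketch with assumed names, not the ROOT signature.

```cpp
#include <cstddef>

// Move the weight vector to origin + dist * dir, as used during line search.
void SetAlongLine(double *w, const double *origin, const double *dir,
                  double dist, std::size_t n)
{
   for (std::size_t i = 0; i < n; ++i)
      w[i] = origin[i] + dist * dir[i];
}
```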

◆ MLP_Stochastic()

void TMultiLayerPerceptron::MLP_Stochastic ( Double_t *  buffer)
protected

One step for the stochastic method. The buffer should contain the previous dw vector and will be updated.

Definition at line 2105 of file TMultiLayerPerceptron.cxx.
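
A one-weight sketch of such a step, assuming the common momentum form dw(t) = -eta * (dEdw + delta) + epsilon * dw(t-1). The exact ROOT update is not reproduced here and all names are illustrative.

```cpp
#include <cmath>

// One stochastic step for a single weight. buffer carries the previous dw,
// as the description above suggests, and is updated for the next step.
double StochasticStep(double &w, double &buffer, double dedw,
                      double eta, double epsilon, double delta)
{
   double dw = -eta * (dedw + delta) + epsilon * buffer;
   w += dw;
   buffer = dw;   // next call reuses this as the momentum term
   return dw;
}
```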

◆ operator=()

TMultiLayerPerceptron& TMultiLayerPerceptron::operator= ( const TMultiLayerPerceptron & )
private

◆ Randomize()

void TMultiLayerPerceptron::Randomize ( ) const

Randomize the weights.

Definition at line 1189 of file TMultiLayerPerceptron.cxx.

◆ Result()

Double_t TMultiLayerPerceptron::Result ( Int_t  event,
Int_t  index = 0 
) const

Computes the output for a given event.

Look at the output neuron designed by index.

Definition at line 983 of file TMultiLayerPerceptron.cxx.

◆ SetData()

void TMultiLayerPerceptron::SetData ( TTree *  data)

Set the data source.

Definition at line 546 of file TMultiLayerPerceptron.cxx.

◆ SetDelta()

void TMultiLayerPerceptron::SetDelta ( Double_t  delta)

Sets Delta - used in stochastic minimisation (look at the constructor for the complete description of learning methods and parameters)

Definition at line 670 of file TMultiLayerPerceptron.cxx.

◆ SetEpsilon()

void TMultiLayerPerceptron::SetEpsilon ( Double_t  eps)

Sets Epsilon - used in stochastic minimisation (look at the constructor for the complete description of learning methods and parameters)

Definition at line 660 of file TMultiLayerPerceptron.cxx.

◆ SetEta()

void TMultiLayerPerceptron::SetEta ( Double_t  eta)

Sets Eta - used in stochastic minimisation (look at the constructor for the complete description of learning methods and parameters)

Definition at line 650 of file TMultiLayerPerceptron.cxx.

◆ SetEtaDecay()

void TMultiLayerPerceptron::SetEtaDecay ( Double_t  ed)

Sets EtaDecay - Eta *= EtaDecay at each epoch (look at the constructor for the complete description of learning methods and parameters)

Definition at line 680 of file TMultiLayerPerceptron.cxx.

◆ SetEventWeight()

void TMultiLayerPerceptron::SetEventWeight ( const char *  branch)

Set the event weight.

Definition at line 562 of file TMultiLayerPerceptron.cxx.

◆ SetGammaDelta()

void TMultiLayerPerceptron::SetGammaDelta ( TMatrixD &  gamma,
TMatrixD &  delta,
Double_t *  buffer 
)
protected

Sets the gamma (g_{(t+1)}-g_{(t)}) and delta (w_{(t+1)}-w_{(t)}) vectors. Gamma is computed here, so ComputeDEDw cannot have been called before; delta is a direct translation of buffer into a TMatrixD.

Definition at line 2376 of file TMultiLayerPerceptron.cxx.

◆ SetLearningMethod()

void TMultiLayerPerceptron::SetLearningMethod ( TMultiLayerPerceptron::ELearningMethod  method)

Sets the learning method.

Available methods are: kStochastic, kBatch, kSteepestDescent, kRibierePolak, kFletcherReeves and kBFGS. (look at the constructor for the complete description of learning methods and parameters)

Definition at line 640 of file TMultiLayerPerceptron.cxx.

◆ SetReset()

void TMultiLayerPerceptron::SetReset ( Int_t  reset)

Sets number of epochs between two resets of the search direction to the steepest descent.

(look at the constructor for the complete description of learning methods and parameters)

Definition at line 701 of file TMultiLayerPerceptron.cxx.

◆ SetTau()

void TMultiLayerPerceptron::SetTau ( Double_t  tau)

Sets Tau - used in line search (look at the constructor for the complete description of learning methods and parameters)

Definition at line 690 of file TMultiLayerPerceptron.cxx.

◆ SetTestDataSet() [1/2]

void TMultiLayerPerceptron::SetTestDataSet ( TEventList *  test)

Sets the Test dataset.

Those events will not be used for the minimization but for control.

Definition at line 589 of file TMultiLayerPerceptron.cxx.

◆ SetTestDataSet() [2/2]

void TMultiLayerPerceptron::SetTestDataSet ( const char *  test)

Sets the Test dataset.

Those events will not be used for the minimization but for control. Note that the tree must be already defined.

Definition at line 619 of file TMultiLayerPerceptron.cxx.

◆ SetTrainingDataSet() [1/2]

void TMultiLayerPerceptron::SetTrainingDataSet ( TEventList *  train)

Sets the Training dataset.

Those events will be used for the minimization.

Definition at line 578 of file TMultiLayerPerceptron.cxx.

◆ SetTrainingDataSet() [2/2]

void TMultiLayerPerceptron::SetTrainingDataSet ( const char *  train)

Sets the Training dataset.

Those events will be used for the minimization. Note that the tree must be already defined.

Definition at line 601 of file TMultiLayerPerceptron.cxx.

◆ Shuffle()

void TMultiLayerPerceptron::Shuffle ( Int_t *  index,
Int_t  n 
) const
private

Shuffle the Int_t index[n] in input.

Input: index, the array to shuffle, and n, the size of the array. Output: index, the shuffled indexes. This method is used for stochastic training.

Definition at line 2086 of file TMultiLayerPerceptron.cxx.
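
A plain Fisher-Yates shuffle with the same contract (permute index[0..n-1] in place) is a reasonable sketch of such a method; whether ROOT uses exactly this scheme and random source is not shown here.

```cpp
#include <cstdlib>

// Shuffle index[0..n-1] in place. std::rand is an illustrative choice of
// random source, not necessarily what ROOT uses internally.
void ShuffleIndex(int *index, int n)
{
   for (int i = n - 1; i > 0; --i) {
      int j = std::rand() % (i + 1);   // pick a partner in [0, i]
      int tmp = index[i];
      index[i] = index[j];
      index[j] = tmp;
   }
}
```

After the shuffle the array is still a permutation of its original contents; only the visit order of the training events changes.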

◆ SteepestDir()

void TMultiLayerPerceptron::SteepestDir ( Double_t *  dir)
protected

Sets the search direction to steepest descent.

Definition at line 2200 of file TMultiLayerPerceptron.cxx.

◆ Train()

void TMultiLayerPerceptron::Train ( Int_t  nEpoch,
Option_t *  option = "text",
Double_t  minE = 0 
)

Train the network.

nEpoch is the number of iterations. option can contain:

  • "text" (simple text output)
  • "graph" (evolving graphical training curves)
  • "update=X" (step for the text/graph output update)
  • "+" (skip the randomisation and start from the previous values)
  • "current" (draw in the current canvas)
  • "minErrorTrain" (stop when the NN error on the training sample gets below minE)
  • "minErrorTest" (stop when the NN error on the test sample gets below minE)

All combinations are available.

Definition at line 738 of file TMultiLayerPerceptron.cxx.

Friends And Related Function Documentation

◆ TMLPAnalyzer

friend class TMLPAnalyzer
friend

Definition at line 49 of file TMultiLayerPerceptron.h.

Member Data Documentation

◆ fCurrentTree

Int_t TMultiLayerPerceptron::fCurrentTree
private

index of the current tree in a chain

Definition at line 146 of file TMultiLayerPerceptron.h.

◆ fCurrentTreeWeight

Double_t TMultiLayerPerceptron::fCurrentTreeWeight
private

weight of the current tree in a chain

Definition at line 147 of file TMultiLayerPerceptron.h.

◆ fData

TTree* TMultiLayerPerceptron::fData
private

pointer to the tree used as datasource

Definition at line 145 of file TMultiLayerPerceptron.h.

◆ fDelta

Double_t TMultiLayerPerceptron::fDelta
private

Delta - used in stochastic minimisation - Default=0.

Definition at line 165 of file TMultiLayerPerceptron.h.

◆ fEpsilon

Double_t TMultiLayerPerceptron::fEpsilon
private

Epsilon - used in stochastic minimisation - Default=0.

Definition at line 164 of file TMultiLayerPerceptron.h.

◆ fEta

Double_t TMultiLayerPerceptron::fEta
private

Eta - used in stochastic minimisation - Default=0.1.

Definition at line 163 of file TMultiLayerPerceptron.h.

◆ fEtaDecay

Double_t TMultiLayerPerceptron::fEtaDecay
private

EtaDecay - Eta *= EtaDecay at each epoch - Default=1.

Definition at line 166 of file TMultiLayerPerceptron.h.

◆ fEventWeight

TTreeFormula* TMultiLayerPerceptron::fEventWeight
private

formula representing the event weight

Definition at line 161 of file TMultiLayerPerceptron.h.

◆ fextD

TString TMultiLayerPerceptron::fextD
private

Definition at line 157 of file TMultiLayerPerceptron.h.

◆ fextF

TString TMultiLayerPerceptron::fextF
private

Definition at line 156 of file TMultiLayerPerceptron.h.

◆ fFirstLayer

TObjArray TMultiLayerPerceptron::fFirstLayer
private

Definition at line 149 of file TMultiLayerPerceptron.h.

◆ fLastAlpha

Double_t TMultiLayerPerceptron::fLastAlpha
private

internal parameter used in line search

Definition at line 168 of file TMultiLayerPerceptron.h.

◆ fLastLayer

TObjArray TMultiLayerPerceptron::fLastLayer
private

Definition at line 150 of file TMultiLayerPerceptron.h.

◆ fLearningMethod

ELearningMethod TMultiLayerPerceptron::fLearningMethod
private

The Learning Method.

Definition at line 160 of file TMultiLayerPerceptron.h.

◆ fManager

TTreeFormulaManager* TMultiLayerPerceptron::fManager
private

TTreeFormulaManager for the weight and neurons.

Definition at line 162 of file TMultiLayerPerceptron.h.

◆ fNetwork

TObjArray TMultiLayerPerceptron::fNetwork
private

Definition at line 148 of file TMultiLayerPerceptron.h.

◆ fOutType

TNeuron::ENeuronType TMultiLayerPerceptron::fOutType
private

Definition at line 155 of file TMultiLayerPerceptron.h.

◆ fReset

Int_t TMultiLayerPerceptron::fReset
private

number of epochs between two resets of the search direction to the steepest descent - Default=50

Definition at line 169 of file TMultiLayerPerceptron.h.

◆ fStructure

TString TMultiLayerPerceptron::fStructure
private

Definition at line 152 of file TMultiLayerPerceptron.h.

◆ fSynapses

TObjArray TMultiLayerPerceptron::fSynapses
private

Definition at line 151 of file TMultiLayerPerceptron.h.

◆ fTau

Double_t TMultiLayerPerceptron::fTau
private

Tau - used in line search - Default=3.

Definition at line 167 of file TMultiLayerPerceptron.h.

◆ fTest

TEventList* TMultiLayerPerceptron::fTest
private

EventList defining the events in the test dataset.

Definition at line 159 of file TMultiLayerPerceptron.h.

◆ fTestOwner

Bool_t TMultiLayerPerceptron::fTestOwner
private

internal flag whether one has to delete fTest or not

Definition at line 171 of file TMultiLayerPerceptron.h.

◆ fTraining

TEventList* TMultiLayerPerceptron::fTraining
private

EventList defining the events in the training dataset.

Definition at line 158 of file TMultiLayerPerceptron.h.

◆ fTrainingOwner

Bool_t TMultiLayerPerceptron::fTrainingOwner
private

internal flag whether one has to delete fTraining or not

Definition at line 170 of file TMultiLayerPerceptron.h.

◆ fType

TNeuron::ENeuronType TMultiLayerPerceptron::fType
private

Definition at line 154 of file TMultiLayerPerceptron.h.

◆ fWeight

TString TMultiLayerPerceptron::fWeight
private

Definition at line 153 of file TMultiLayerPerceptron.h.


The documentation for this class was generated from the following files:

  • TMultiLayerPerceptron.h
  • TMultiLayerPerceptron.cxx