The AINN Base Design

Introduction

The design of this library follows the biological reality of neural systems (at least as far as we understand it so far!).

My intention is to create a base library that uses the same terminology as biology and that also respects the biological structure.

This is why the library uses names such as Axon and Soma. I intentionally did not push the correspondence too far, as this level seems right for a C++ library.

There is also a section about the data that are fed to the neural network: how they are organized and how to use them.

General Object structure design

Remarks:

  • The dendrites and the axon are represented by the same object. If neuron N1 is connected to neuron N2, the connection is a single object (Link) that N1 treats as an axon and N2 treats as a dendrite.
  • The Input and Output Neuron objects exist to generalize the processing. I use the ‘for_each()’ function to run, evaluate, learn, etc., so I need to prevent input neurons from learning or evaluating; this is why these functions are virtual and simply return false for an Input Neuron.
  • A neural net is made of layers, and each layer holds a list of neurons. There is no intention to let the end user modify these objects or access them externally.
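The shared-object idea in the first remark can be sketched with simplified stand-alone types (the real Link and Neuron classes carry more state than this):

```cpp
#include <vector>

struct Neuron;

// One Link is shared between two neurons: the source neuron sees it as an
// axon, the destination neuron sees the very same object as a dendrite.
struct Link {
    Neuron *from;   // neuron for which this link is an axon
    Neuron *to;     // neuron for which this link is a dendrite
    double weight;
};

struct Neuron {
    std::vector<Link *> axons;      // outgoing links
    std::vector<Link *> dendrites;  // incoming links
};

// Connect n1 -> n2 through a single shared Link object.
Link *connect(Neuron &n1, Neuron &n2, double w) {
    Link *l = new Link{&n1, &n2, w};
    n1.axons.push_back(l);
    n2.dendrites.push_back(l);
    return l;
}
```

Both neurons end up holding a pointer to the same Link, so updating its weight during learning is visible from both sides.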

Detailed Neuron structure

Detailed Neural Network

I will explain here the functions that one needs to understand in order to use the library.

The neural network is composed of two or more layers (in the case of a back-propagation network, we can construct networks with four layers).

Layer

It is not the purpose here to fully detail this class, as it is nearly empty and exists more out of design convenience than for any other reason. The layer is a utility object that manages one layer: it maintains a reference to the parent network that created it, calls the creator of the neurons its layer is made of, and can eventually shake its layer.

  • ‘lr’ is the learning-rate value that will be assigned to the newly created neurons.
  • ‘isLast’ is a boolean specifying whether we are dealing with the last layer.

NB: The Layer calls one of three factory functions while creating the network: GetInputNeuron() when the parent layer list is still empty (first layer), GetMidNeuron() when the parent is not empty and the isLast flag is not set, and GetOutputNeuron() when isLast is set.
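The selection rule can be sketched as a small helper (the factory names come from the text above; the dispatch rule itself is my reading of it):

```cpp
#include <string>

// Which neuron factory a Layer invokes, depending on its position in the
// network. Returns the factory name for illustration purposes only.
std::string pickFactory(bool parentEmpty, bool isLast) {
    if (parentEmpty) return "GetInputNeuron";   // first layer of the net
    if (isLast)      return "GetOutputNeuron";  // last layer of the net
    return "GetMidNeuron";                      // any hidden layer
}
```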

Net

This is the most important object in my neural network library, as we will derive from it in order to design specific neural networks such as BAM, SON, BP and Adaline.

I will detail each function specifying its field of impact.

1.      Variables:

  • ‘netLinks’ is a vector of all the links the net contains.
  • ‘nrAttempt’ is an interesting value, as it records the number of failed attempts to obtain a converging system when the function ShakeNetwork() is called.

2.      Pure virtual functions

  • bool Eval(vector<double> &v );
  • bool Learn( void );
  • double Train( IOListData &data );
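A concrete network implements these three functions. A minimal sketch, assuming simplified signatures and a hypothetical BPNet subclass (the real class hierarchy is richer):

```cpp
#include <vector>

struct IOListData;  // input/expected-output container, detailed later

// Stripped-down view of the abstract Net interface.
struct Net {
    virtual ~Net() {}
    virtual bool   Eval(std::vector<double> &v) = 0;
    virtual bool   Learn() = 0;
    virtual double Train(IOListData &data) = 0;
};

// A back-propagation network would override all three pure virtuals.
struct BPNet : Net {
    bool   Eval(std::vector<double> &v) override { return !v.empty(); } // forward-pass placeholder
    bool   Learn() override { return true; }                            // weight-update placeholder
    double Train(IOListData &) override { return 0.0; }                 // training-epoch placeholder
};
```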

3.      Public Virtual functions

  • void SetupNetwork( … )

Description: It creates the network: it builds the layers with their neurons. As the arguments depend on the network type, refer to the overriding function in the derived class to discover the meaning of the arguments. In order to create a new network, you may have to override this function.

    • Call Reset() to clean up any previous information.
    • Create each layer.
    • Push each layer into the vector.
    • Call LinkNetwork() to bind each neuron with the links.

Arguments: 1 to n arguments whose meaning depends on the network type.

Calls: Reset() – LinkNetwork()

Return value: None
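The four setup steps can be sketched as follows (all type and member names here are hypothetical placeholders; the real signatures differ per network type):

```cpp
#include <vector>

// Hypothetical sketch of the SetupNetwork() sequence.
struct Layer { int nrNeurons; };

struct NetSketch {
    std::vector<Layer> layers;
    void Reset() { layers.clear(); }         // drop any previous setup
    void LinkNetwork() { /* bind the neurons together with links */ }

    void SetupNetwork(const std::vector<int> &sizes) {
        Reset();                             // 1. clean previous information
        for (int n : sizes)
            layers.push_back(Layer{n});      // 2.+3. create and store each layer
        LinkNetwork();                       // 4. bind each neuron with the links
    }
};
```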

  • void SaveNetwork( ostream &out );

Description: Store the required values of the whole network in the stream.

Arguments: Stream, can be a file or string stream

Calls: Neuron::Save()

Return value: None

  • void LoadNetwork( istream &in );

Description: Create, load and initialize a neural network with the values found in the stream

Arguments: Stream, can be a file or string stream

Calls: Neuron::Load()

Return value: None

4.      Public Functions

  • bool SetupInput( const vector<double> &di );

Description: Modify the values of the input neurons (first layer)

Arguments: Sequence of values to assign to each neuron. The argument must contain at least as many values as there are input neurons.

Calls: Soma::SetValue( const double )

Return value: true if the size of the argument is at least the number of input neurons.
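The size check and assignment can be sketched with a plain vector standing in for the input neurons (Soma::SetValue() in the real library):

```cpp
#include <cstddef>
#include <vector>

// Assign one value per input neuron; report whether enough values were
// supplied. Extra values in 'di' are simply ignored.
bool setupInput(std::vector<double> &inputNeurons,
                const std::vector<double> &di) {
    if (di.size() < inputNeurons.size())
        return false;                    // not enough values supplied
    for (std::size_t i = 0; i < inputNeurons.size(); ++i)
        inputNeurons[i] = di[i];         // Soma::SetValue() in the library
    return true;
}
```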

  • bool SetupOutput( const vector<double> &v );

Description: Set the expected values on the output neurons

Arguments: vector of the expected values of the network

Calls: OutputNeuron::SetExpectedValue( const double )

Return value: true if the size of the argument is at least the number of output neurons (size of the last layer)

  • vector<double> GetOutputValues( void );

Description: Store each value found in the last layer in a vector.

Arguments: None

Calls: Soma::GetValue()

Return value: vector of the output values.
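A sketch of the collection step, with a simplified stand-in for the Soma type:

```cpp
#include <vector>

// Simplified Soma: the real class holds much more than a value.
struct Soma {
    double value;
    double GetValue() const { return value; }
};

// Copy each output neuron's soma value from the last layer into a vector.
std::vector<double> getOutputValues(const std::vector<Soma> &lastLayer) {
    std::vector<double> out;
    out.reserve(lastLayer.size());
    for (const Soma &s : lastLayer)
        out.push_back(s.GetValue());
    return out;
}
```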

  • double AdaptNetwork( IOListData &data , const int nrShakes = 3, const int nrTraining = 50 , const double delta = 0.1 );

Description: Assign to the network the best converging bias and link weights.

Arguments:

    • data      Input and expected output values (if output values are required)
    • nrShakes          Number of convergence attempts
    • nrTraining         Number of training iterations we must perform in order to confirm convergence.
    • delta    Error value below which we decide we are converging.

Calls: ShakeNetwork(), GetLinksWeight(), GetBiasWeight(), SetLinksWeight(), SetBiasWeight().

Return value: the minimal relative error.
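The keep-the-best strategy can be sketched as follows (the shakeOnce callback is a hypothetical stand-in for a full ShakeNetwork() cycle; the real function also restores the best weights via SetLinksWeight()/SetBiasWeight()):

```cpp
#include <cfloat>
#include <functional>
#include <vector>

// Run several shake attempts and remember the weight set with the
// smallest relative error; return that error.
double adaptSketch(int nrShakes,
                   const std::function<double(std::vector<double> &)> &shakeOnce) {
    double best = FLT_MAX;
    std::vector<double> bestWeights, weights;
    for (int i = 0; i < nrShakes; ++i) {
        double err = shakeOnce(weights);   // fills 'weights', returns its error
        if (err < best) {
            best = err;
            bestWeights = weights;         // remember the best configuration
        }
    }
    // in the real library, bestWeights would now be written back to the net
    return best;
}
```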

  • bool Converge( IOListDat &data , const double delta );

Description: Evaluate the input data and test whether the network output converges

Arguments:

    • data      Input and expected output values (if output values are required)
    • delta    Error value below which we decide we are converging.

Calls: SetupOutput(), Eval(), GetOutputValues()

Return value: true if the relative error is less than delta.

5.      Protected functions

  • Layer &GetOutputLayer( void );

Description: inline function to access the last layer

Arguments: None

Calls:

Return value: a reference to the last layer.

  • vector<double> GetLinksWeight( void );

Description: Creates a vector with the weights of each link – the order is the order of their creation.

Arguments: None

Calls: None

Return value: a vector with the links’ weights.

  • void SetLinksWeight( const vector<double> v );

Description: Gives new values to the links.

Arguments: vector of new link weights

Calls: None

Return value: None

  • vector<double> GetBiasWeight( void );

Description: Extract all the bias values – the order is the one of the links

Arguments: None

Calls: None

Return value: a vector with the bias.

  • void SetBiasWeight( const vector<double> v );

Description: Assign new values to all biases.

Arguments: vector of new values.

Calls: None

Return value: None

  • void LinkNetwork( void );

Description: Clear the netLinks array and free its memory; also, for each neuron, reset the axon and dendrite vectors. Then connect each neuron of a layer with all the neurons of the next layer.

 Arguments: None

Calls: GetNewLink(), Connect()

Return value: None
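The full inter-layer connection can be sketched with simplified types (integer ids stand in for the Link pointers that GetNewLink() would return):

```cpp
#include <vector>

// Simplified neuron: stores link ids instead of Link pointers.
struct Neuron { std::vector<int> axons, dendrites; };

// Connect every neuron of layer 'a' with every neuron of layer 'b';
// returns the number of links created.
int linkLayers(std::vector<Neuron> &a, std::vector<Neuron> &b, int firstLinkId) {
    int id = firstLinkId;
    for (Neuron &n1 : a)
        for (Neuron &n2 : b) {
            n1.axons.push_back(id);      // same link seen as axon by n1...
            n2.dendrites.push_back(id);  // ...and as dendrite by n2
            ++id;
        }
    return id - firstLinkId;
}
```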

6.      Virtual protected functions

  • void Shake( void );

Description: Shake each layer, assigning the network new link weights and new bias values.

Arguments: None

Calls: Layer::Shake()

Return value: None

  • double ShakeNetwork( IOListData &data , const int nrTraining  , const double delta , const int maxLoop = 500 );

Description: For at most maxLoop attempts, try to find a converging set of weight and bias values.

Arguments:

    • data      Input and expected output values (if output values are required)
    • nrTraining         Number of training iterations we must perform in order to confirm convergence.
    • delta    Error value below which we decide we are converging.
    • maxLoop          Maximum number of attempts before stopping

Calls: Shake(), Converge(), Train()

Return value: the relative error if a converging network is found; otherwise FLT_MAX.
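The retry loop can be sketched with callbacks standing in for Shake() and Train() (a sketch of the control flow only, not the library's implementation):

```cpp
#include <cfloat>
#include <functional>

// Re-randomize ("shake") the weights, train nrTraining times, and stop as
// soon as the relative error drops below delta; give up after maxLoop tries.
double shakeNetworkSketch(int nrTraining, double delta, int maxLoop,
                          const std::function<void()> &shake,
                          const std::function<double()> &trainOnce) {
    for (int attempt = 0; attempt < maxLoop; ++attempt) {
        shake();                       // new random weights and bias values
        double err = FLT_MAX;
        for (int i = 0; i < nrTraining; ++i)
            err = trainOnce();         // one training pass, returns the error
        if (err < delta)
            return err;                // converged
    }
    return FLT_MAX;                    // no converging configuration found
}
```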

  • Link *GetNewLink( Neuron *n1 , Neuron *n2 );

Description: Override this if the link of your network is different.

Arguments: Two valid pointers to neurons

Calls: None

Return value: a pointer to a newly created Link

  • Neuron *GetInputNeuron( void );

Description: Override this in order to create a different input neuron type

Arguments: None

Calls: None

Return value: a pointer to a newly created neuron

  • Neuron *GetMidNeuron( void );

Description: Override this in order to create a different middle (hidden) neuron type

Arguments: None

Calls: None

Return value: a pointer to a newly created neuron

  • Neuron *GetOutputNeuron( void );

Description: Override this in order to create a different output neuron type

Arguments: None

Calls: None

Return value: a pointer to a newly created neuron
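A sketch of how a derived network swaps in its own link type via GetNewLink() (HebbLink and HebbNet are hypothetical names, and the factory is made public here for simplicity; the same pattern applies to the neuron factories):

```cpp
// Simplified base types; the real classes carry weights, values, etc.
struct Neuron { virtual ~Neuron() {} };

struct Link {
    Neuron *from, *to;
    Link(Neuron *a, Neuron *b) : from(a), to(b) {}
    virtual ~Link() {}
};

struct HebbLink : Link { using Link::Link; };  // a custom link variant

struct Net {
    virtual ~Net() {}
    // Default factory: plain Link objects.
    virtual Link *GetNewLink(Neuron *n1, Neuron *n2) { return new Link(n1, n2); }
};

struct HebbNet : Net {
    // Override so that LinkNetwork() builds the custom link type instead.
    Link *GetNewLink(Neuron *n1, Neuron *n2) override { return new HebbLink(n1, n2); }
};
```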

Detailed I/O Values

These two objects are mainly containers of values and make use of the STL. There is not much in them, and an extensive description would be pointless, as there is little to say.

IOData Object

This is a container of input values (double) associated with output values. The members 'input' and 'output' are public.

The constructor can also be invoked as follows:

IOData( <nrInput values> , <nrOutput Values> , [<input values>] , [<output values>] )

IOData( <nrInput values> , <nrOutput Values> , <input stream containing the values> );

IOListData Object

It is a container inheriting from a vector of IOData.

The constructor can be empty, take a filename (in that case a stream will be opened and the data will be loaded; no error checking is made), or take a vector of IOData that will be copied entirely (nr is -1) or partially (nr specifies the number of elements to copy).
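The whole-versus-partial copy rule can be sketched generically (copyIOList is a hypothetical stand-in for the copying constructor):

```cpp
#include <vector>

// Copy the whole source vector when nr == -1 (or nr exceeds its size),
// otherwise only the first nr elements.
template <typename T>
std::vector<T> copyIOList(const std::vector<T> &src, int nr = -1) {
    if (nr < 0 || nr > static_cast<int>(src.size()))
        nr = static_cast<int>(src.size());
    return std::vector<T>(src.begin(), src.begin() + nr);
}
```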