VCLib Documentation  6.12.2

SNN


Complete Two Layer SNN.

Data Fields

I32 IL
 set by snn_init().
 
I32 H1
 set by snn_init().
 
I32 H2
 set by snn_init().
 
I32 OL
 set by snn_init().
 
F32 * iLTmp
 can also keep bias neuron, allocated by snn_init(), freed by snn_deinit().
 
F32 * oLTmp
 can also keep bias neuron, allocated by snn_init(), freed by snn_deinit().
 
F32 * h1
 allocated by snn_init(), freed by snn_deinit().
 
F32 * h2
 allocated by snn_init(), freed by snn_deinit().
 
F32 * w1
 allocated by snn_init(), freed by snn_deinit().
 
F32 * w2
 allocated by snn_init(), freed by snn_deinit().
 
F32 * w3
 allocated by snn_init(), freed by snn_deinit().
 
F32 * delta1
 allocated by snn_init(), freed by snn_deinit().
 
F32 * delta2
 allocated by snn_init(), freed by snn_deinit().
 
F32 * delta3
 allocated by snn_init(), freed by snn_deinit().
 
F32 * e3
 allocated by snn_init(), freed by snn_deinit().
 
F32 * e2
 allocated by snn_init(), freed by snn_deinit().
 
F32 * e1
 allocated by snn_init(), freed by snn_deinit().
 
I32 random_seed
 set by snn_train_init().
 
F32 fmax_weight
 set by snn_train_init().
 
F32 learning_rate
 set by snn_train_init().
 
F32 momentum
 set by snn_train_init().
 
F32(* sigmoid_func )(F32)
 set by snn_init().
 
F32(* differentiation_func )(F32)
 set by snn_init().
 
F32(* confidence_func )(I32, F32 *)
 first argument is the count of Output Neurons, the F32 * argument is the output from the SNN; returns the confidence. Set by snn_init().
 
F32(* deviation_func )(I32, F32 *, F32 *)
 first argument is the count of Output Neurons, the first F32 * argument is the output from the SNN, the second is the expected output from the dataset; returns the deviation. Set by snn_train_init().
 
I32(* decision_func )(I32, F32 *)
 decider function: returns the index of the chosen Output based on the given Output Layer values. Set by snn_init().
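
The fields listed above suggest a struct layout roughly like the following C sketch. The field names, types, and descriptions come from this page; the declaration order, the `typedef`s, and the struct tag are assumptions, not the library's actual header:

```c
#include <stdint.h>

typedef float   F32;
typedef int32_t I32;

/* Hypothetical reconstruction of the SNN struct from the documented fields. */
typedef struct {
    I32 IL, H1, H2, OL;            /* layer sizes, set by snn_init() */
    F32 *iLTmp, *oLTmp;            /* input/output buffers, can also keep bias */
    F32 *h1, *h2;                  /* hidden layer output buffers */
    F32 *w1, *w2, *w3;             /* weight arrays between the layers */
    F32 *delta1, *delta2, *delta3; /* weight updates (training) */
    F32 *e3, *e2, *e1;             /* error/deviation buffers (training) */
    I32 random_seed;               /* set by snn_train_init() */
    F32 fmax_weight, learning_rate, momentum;
    F32 (*sigmoid_func)(F32);
    F32 (*differentiation_func)(F32);
    F32 (*confidence_func)(I32, F32 *);
    F32 (*deviation_func)(I32, F32 *, F32 *);
    I32 (*decision_func)(I32, F32 *);
} SNN;
```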
 

Detailed Description

The last Neuron in every Layer is special:

  • Input Layer: Last Neuron (Index: IL) contains BIAS.
  • Hidden Layers: Last Neuron (Index: H1 or H2) always equals 1 at start.
    See also
    [http://cs.uni-muenster.de/Professoren/Lippe/lehre/skripte/wwwnnscript/backprop.html#bias]
    Parameters
    IL Count of Inputs.
    H1 Count of Neurons in the Hidden Layer 1.
    H2 Count of Neurons in the Hidden Layer 2.
    OL Count of Outputs.
    w1 Weights between the Inputs and the Neurons in the Hidden Layer 1: weight[From Input i to Layer 1 Neuron j] = *(w1 + j * (IL+1) + i).
    w2 Weights between the Neurons in the Hidden Layer 1 and the Neurons in the Hidden Layer 2: weight[From Layer 1 Neuron i to Layer 2 Neuron j] = *(w2 + j * (H1+1) + i).
    w3 Weights between the Neurons in the Hidden Layer 2 and the Outputs: weight[From Layer 2 Neuron i to Output j] = *(w3 + j * (H2+1) + i).
    h1 Buffers the Hidden Layer 1 Neurons' Output Values at the Processing Step between calculating from Input to Layer 1 and calculating from Layer 1 to Layer 2: [neuron k] = 1/(1+exp(-sum_{i} w1_{k,i} * inputval_i)).
    h2 Buffers the Hidden Layer 2 Neurons' Output Values at the Processing Step between calculating from Layer 1 to Layer 2 and calculating from Layer 2 to Output: [neuron k] = 1/(1+exp(-sum_{i} w2_{k,i} * h1_i)).
    e3 At learning: the deviation of the correct output signal from the Output calculated from the Hidden Layer 2: difference[correct Output, calculated Output] * non-standard weighting with correctO * (1 - correctO).
    e2 At learning: the deviation of the backpropagated signal from the forward-propagated signal.