Complete Two Layer SNN.
Data Fields

I32 | IL | set by snn_init().
I32 | H1 | set by snn_init().
I32 | H2 | set by snn_init().
I32 | OL | set by snn_init().
F32 * | iLTmp | can also hold the bias neuron; allocated by snn_init(), freed by snn_deinit().
F32 * | oLTmp | can also hold the bias neuron; allocated by snn_init(), freed by snn_deinit().
F32 * | h1 | allocated by snn_init(), freed by snn_deinit().
F32 * | h2 | allocated by snn_init(), freed by snn_deinit().
F32 * | w1 | allocated by snn_init(), freed by snn_deinit().
F32 * | w2 | allocated by snn_init(), freed by snn_deinit().
F32 * | w3 | allocated by snn_init(), freed by snn_deinit().
F32 * | delta1 | allocated by snn_init(), freed by snn_deinit().
F32 * | delta2 | allocated by snn_init(), freed by snn_deinit().
F32 * | delta3 | allocated by snn_init(), freed by snn_deinit().
F32 * | e3 | allocated by snn_init(), freed by snn_deinit().
F32 * | e2 | allocated by snn_init(), freed by snn_deinit().
F32 * | e1 | allocated by snn_init(), freed by snn_deinit().
I32 | random_seed | set by snn_train_init().
F32 | fmax_weight | set by snn_train_init().
F32 | learning_rate | set by snn_train_init().
F32 | momentum | set by snn_train_init().
F32(* | sigmoid_func )(F32) | set by snn_init().
F32(* | differentiation_func )(F32) | set by snn_init().
F32(* | confidence_func )(I32, F32 *) | the first argument is the count of Output Neurons, the F32 * argument is the output from the SNN; returns the confidence. Set by snn_init().
F32(* | deviation_func )(I32, F32 *, F32 *) | the first argument is the count of Output Neurons, the first F32 * argument is the output from the SNN, the second F32 * argument is the should-be output from the dataset; returns the deviation. Set by snn_train_init().
I32(* | decision_func )(I32, F32 *) | decider function; returns an index based on the given Output Layer values. Set by snn_init().
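The field list above can be sketched as a C struct. This is only an illustrative reconstruction: the typedef names, the assumption that I32/F32 are 32-bit integer/float aliases, and the example argmax_decision function are not taken from the source.

```c
#include <stdint.h>

typedef int32_t I32;   /* assumed 32-bit integer alias */
typedef float   F32;   /* assumed 32-bit float alias */

/* Hypothetical layout matching the Data Fields table above. */
typedef struct snn {
    I32 IL, H1, H2, OL;            /* layer sizes, set by snn_init() */
    F32 *iLTmp, *oLTmp;            /* input/output buffers (incl. bias slot) */
    F32 *h1, *h2;                  /* hidden-layer activation buffers */
    F32 *w1, *w2, *w3;             /* weight matrices between layers */
    F32 *delta1, *delta2, *delta3; /* weight-update buffers for training */
    F32 *e1, *e2, *e3;             /* per-layer error buffers */
    I32 random_seed;               /* set by snn_train_init() */
    F32 fmax_weight, learning_rate, momentum;
    F32 (*sigmoid_func)(F32);
    F32 (*differentiation_func)(F32);
    F32 (*confidence_func)(I32, F32 *);
    F32 (*deviation_func)(I32, F32 *, F32 *);
    I32 (*decision_func)(I32, F32 *);
} snn;

/* Example decision_func (hypothetical): index of the largest output value. */
static I32 argmax_decision(I32 OL, F32 *out)
{
    I32 best = 0;
    for (I32 j = 1; j < OL; j++)
        if (out[j] > out[best])
            best = j;
    return best;
}
```

A decider of this shape could then be installed into the struct's decision_func slot before use.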
The last Neuron in every Layer is special; it can serve as the bias neuron:
IL | Count of Inputs. |
H1 | Count of Neurons in the Hidden Layer 1. |
H2 | Count of Neurons in the Hidden Layer 2. |
OL | Count of Outputs. |
w1 | Weights between the Inputs and the Neurons in the Hidden Layer 1: weight[From Input i to Layer 1 Neuron j] = *(w1 + j * (IL+1) + i). |
w2 | Weights between the Neurons in the Hidden Layer 1 and the Hidden Layer 2: weight[From Layer 1 Neuron i to Layer 2 Neuron j] = *(w2 + j * (H1+1) + i). |
w3 | Weights between the Neurons in the Hidden Layer 2 and the Output: weight[From Layer 2 Neuron i to Output j] = *(w3 + j * (H2+1) + i). |
h1 | Buffers the Hidden Layer 1 Neurons' Output Values at the Processing Step between calculating from Input to Layer 1 and calculating from Layer 1 to Layer 2: [neuron k] = 1/(1+exp(-sum_{i} w1_{k,i} * inputval_i)). |
h2 | Buffers the Hidden Layer 2 Neurons' Output Values at the Processing Step between calculating from Layer 1 to Layer 2 and calculating from Layer 2 to Output: [neuron k] = 1/(1+exp(-sum_{i} w2_{k,i} * h1_i)). |
e3 | At learning: the deviation of the calculated Output from the correct output signal: difference[correct Output, calculated Output], with a non-standard weighting of correctO * (1 - correctO). |
e2 | At learning: the deviation of the backpropagated signal from the forward-propagated signal. |