Accumulated Error Backpropagation
The accumulated error backpropagation algorithm is a variant of the standard error backpropagation algorithm: its update rule minimizes the accumulated error over the whole training set, rather than the error of a single training example.
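The contrast with per-example updates can be sketched on a single logistic unit; the toy OR data, learning rate, and epoch count below are illustrative assumptions:

```python
import numpy as np

# Hypothetical toy data: the OR function (4 examples, 2 features).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 1.])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = np.zeros(2), 0.0, 0.5

for epoch in range(1000):
    # Accumulated rule: read the WHOLE training set, sum the
    # per-example gradients, then apply one update per epoch.
    out = sigmoid(X @ w + b)        # forward pass on all examples
    grad = out - y                  # cross-entropy error signal
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

preds = sigmoid(X @ w + b) > 0.5    # classify all four examples
```

Standard BP would instead place the weight update inside a loop over individual examples: it updates far more often, but the individual gradients can partly cancel each other; accumulated BP trades update frequency for a smoother overall gradient.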
Error Backpropagation Algorithm (BP)
The error backpropagation algorithm first propagates the input forward through the network to compute the training error, then propagates that error backward to the hidden-layer neurons, adjusting the connection weights and the threshold of each neuron; by repeating these updates, the training error is minimized.
Currently, most neural network training is based on the BP algorithm. It can be used not only for multi-layer feedforward neural networks but also for training recurrent neural networks and other architectures; however, "BP network" generally refers to a multi-layer feedforward neural network trained with the BP algorithm.
BP algorithm workflow
- Provide the input example to the input-layer neurons and propagate the signal forward layer by layer until the output layer produces a result.
- Compute the output-layer error, then propagate the error backward to the hidden-layer neurons.
- Adjust the connection weights and thresholds according to the errors of the hidden-layer neurons.
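The three steps above can be sketched for a network with one hidden layer; the XOR data, layer sizes, learning rate, and squared-error cost are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical XOR data (not linearly separable, so a hidden layer is needed).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

def mse():
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

loss_before = mse()
for epoch in range(5000):
    for x, t in zip(X, y):          # standard BP: one update per example
        # 1. forward pass: propagate the signal layer by layer
        h = sigmoid(x @ W1 + b1)
        o = sigmoid(h @ W2 + b2)
        # 2. output-layer error, propagated back to the hidden layer
        delta_o = (o - t) * o * (1 - o)
        delta_h = (delta_o @ W2.T) * h * (1 - h)
        # 3. adjust connection weights and thresholds (biases)
        W2 -= lr * np.outer(h, delta_o); b2 -= lr * delta_o
        W1 -= lr * np.outer(x, delta_h); b1 -= lr * delta_h
loss_after = mse()
```

The `delta` terms are the neuron error signals: each hidden neuron's error is the weighted sum of the output errors it contributed to, scaled by the sigmoid derivative `h * (1 - h)`.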
Feedforward Neural Network (FP)
A feedforward neural network can be seen as a composition of multiple logistic regressions, except that the result is produced through hidden-layer neurons rather than directly from the inputs. Its cost function is similar to the logistic cost, except that it also sums over the different categories.
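As a minimal sketch of that cost, assuming a softmax output over the categories with a one-hot target (the scores and target below are made up):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())        # shift by max for numerical stability
    return e / e.sum()

def cross_entropy(p, t):
    # Same shape as the logistic cost, but summed over all categories.
    return -float(np.sum(t * np.log(p)))

# Hypothetical output-layer scores for 3 categories and a one-hot target.
scores = np.array([2.0, 1.0, 0.1])
target = np.array([0.0, 1.0, 0.0])

p = softmax(scores)                # category probabilities, sum to 1
loss = cross_entropy(p, target)
```

With only two categories this reduces to the familiar logistic (binary cross-entropy) cost, which is the sense in which the network's cost "sums over different categories."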