The backpropagation algorithm is one of the standard methods for training multilayer neural networks. Training with the error back-propagation algorithm involves two passes of information through all layers of the network: a forward pass and a backward pass.
During the forward pass, an input vector is fed to the input layer of the neural network and then propagates through the network layer by layer. All synaptic weights of the network are fixed during this pass. As a result, a set of output signals is produced, which is the actual response of the network to the given input image.
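As a minimal sketch of the forward pass, anticipating the weight and threshold notation introduced in the next paragraph: the NumPy layout and the sigmoid activation are illustrative assumptions, since the article leaves the activation function $F$ unspecified.

```python
import numpy as np

def sigmoid(s):
    # Assumed activation function F; the article does not fix a particular F.
    return 1.0 / (1.0 + np.exp(-s))

def forward_layer(y_prev, W, T):
    """Forward pass through one fully connected layer with fixed weights.

    W[i, j] is the i-th weight coefficient of the j-th neuron of the layer,
    T[j] is the threshold of the j-th neuron.
    """
    s = y_prev @ W - T   # weighted sum S_j = sum_i w_ij * y_i - T_j
    return sigmoid(s)    # layer output y_j = F(S_j)

def forward_network(x, layers):
    """Propagate the input vector through the network layer by layer."""
    y = x
    for W, T in layers:  # all weights stay fixed during the forward pass
        y = forward_layer(y, W, T)
    return y             # the actual response of the network to the input
```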
During the backward pass, all synaptic weights are adjusted in accordance with the error-correction rule: the actual output of the network is subtracted from the desired one, which produces an error signal. This signal then propagates through the network in the direction opposite to that of the synaptic connections. The synaptic weights are adjusted so as to bring the output vector of the network as close as possible to the target vector.

Let us introduce the following notation: $x = (x_1, \dots, x_n)$ is the input vector, $y = (y_1, \dots, y_m)$ is the output vector, $w_{ij}^{(k)}$ is the $i$-th weight coefficient of the $j$-th neuron of the $k$-th layer, $T_j^{(k)}$ is the threshold of the $j$-th neuron of the $k$-th layer, and $d_j$ is the target output of the $j$-th neuron.
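In this notation, the error signal and the total error are straightforward to compute; the quadratic error function assumed here is spelled out in the formulas that follow, and the example values are hypothetical:

```python
import numpy as np

def error_signal(y, d):
    """Error of the output layer, gamma_j = y_j - d_j
    (the difference between the actual and the desired output)."""
    return y - d

def network_error(y, d):
    """Quadratic error function E = 1/2 * sum_j (y_j - d_j)**2."""
    return 0.5 * np.sum((y - d) ** 2)

y = np.array([0.8, 0.3])    # actual network output (hypothetical)
d = np.array([1.0, 0.0])    # target output (hypothetical)
print(error_signal(y, d))   # [-0.2  0.3]
print(network_error(y, d))  # 0.065
```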
The output of the $j$-th neuron of the $k$-th layer is calculated by the formula

$$y_j^{(k)} = F\!\left(S_j^{(k)}\right), \qquad S_j^{(k)} = \sum_i w_{ij}^{(k)}\, y_i^{(k-1)} - T_j^{(k)},$$

where $F$ is the activation function; for the output layer the same formula gives the output value $y_j$ of the $j$-th neuron. The error function of the network is

$$E = \frac{1}{2} \sum_j \left(y_j - d_j\right)^2,$$

where $\gamma_j = y_j - d_j$ is the error of the $j$-th neuron of the output layer. The error of the $j$-th element of the $k$-th hidden layer is

$$\gamma_j^{(k)} = \sum_i \gamma_i^{(k+1)}\, F'\!\left(S_i^{(k+1)}\right) w_{ji}^{(k+1)}.$$

The partial derivatives of the error function with respect to the weight coefficients of the hidden layers are equal to

$$\frac{\partial E}{\partial w_{ij}^{(k)}} = \gamma_j^{(k)}\, F'\!\left(S_j^{(k)}\right) y_i^{(k-1)},$$

with respect to the weight coefficients of the output layer

$$\frac{\partial E}{\partial w_{ij}} = \gamma_j\, F'\!\left(S_j\right) y_i,$$

and with respect to the thresholds

$$\frac{\partial E}{\partial T_j^{(k)}} = -\gamma_j^{(k)}\, F'\!\left(S_j^{(k)}\right).$$
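Putting the formulas above together, here is a compact sketch of one training step of error back-propagation. The sigmoid activation, the learning rate `lr`, and the NumPy data layout are illustrative assumptions, not prescribed by the text:

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def sigmoid_prime(s):
    y = sigmoid(s)
    return y * (1.0 - y)  # F'(S) for the sigmoid

def backprop(x, d, layers, lr=0.1):
    """One training step of error back-propagation.

    layers is a list of (W, T) pairs, where W[i, j] is the i-th weight
    coefficient of the j-th neuron of the layer and T is its threshold vector.
    """
    # Forward pass: remember the weighted sums S and outputs y of every layer.
    ys, ss = [x], []
    for W, T in layers:
        s = ys[-1] @ W - T            # S_j = sum_i w_ij * y_i - T_j
        ss.append(s)
        ys.append(sigmoid(s))         # y_j = F(S_j)

    e = 0.5 * np.sum((ys[-1] - d) ** 2)  # error E before the update
    gamma = ys[-1] - d                   # output-layer error gamma_j = y_j - d_j

    # Backward pass: propagate the error against the synaptic connections.
    for k in reversed(range(len(layers))):
        W, T = layers[k]
        delta = gamma * sigmoid_prime(ss[k])  # gamma_j^(k) * F'(S_j^(k))
        dW = np.outer(ys[k], delta)           # dE/dw_ij = delta_j * y_i^(k-1)
        dT = -delta                           # dE/dT_j  = -delta_j
        gamma = W @ delta                     # error of the previous layer
        layers[k] = (W - lr * dW, T - lr * dT)  # gradient-descent update
    return e
```

Calling `backprop` repeatedly on training pairs $(x, d)$ performs gradient descent on $E$, adjusting the weights and thresholds with exactly the partial derivatives listed above.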