Batch Normalization (BN)
BN is a normalization technique, with a mild regularizing effect, that can speed up the training of large convolutional networks and improve classification accuracy after convergence.
When BN is applied to a layer of a neural network, it standardizes that layer's activations within each mini-batch, normalizing them to approximately zero mean and unit variance (roughly N(0, 1)) before a learnable scale and shift are applied, which reduces the drift in the distribution of internal activations. In traditional deep networks, the input distribution of each layer changes as the parameters of preceding layers are updated, which makes training difficult; adding BN mitigates this problem.
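As a minimal sketch of the computation described above, the following Python (NumPy) function standardizes a mini-batch per feature and then applies a learnable scale and shift; the function name, shapes, and the usage snippet are illustrative assumptions, not part of the original text.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization over a mini-batch.

    x:     (N, D) mini-batch of activations (N samples, D features)
    gamma: (D,) learnable scale
    beta:  (D,) learnable shift
    """
    mu = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # standardize to zero mean, unit variance
    return gamma * x_hat + beta             # scale and shift restore representational capacity

# Hypothetical usage: a mini-batch whose features are far from N(0, 1)
x = np.random.randn(32, 64) * 3.0 + 5.0
gamma, beta = np.ones(64), np.zeros(64)
y = batch_norm_forward(x, gamma, beta)
print(y.mean(axis=0)[:3], y.std(axis=0)[:3])  # roughly 0 and 1 per feature
```

At inference time, implementations typically replace the per-batch mean and variance with running averages collected during training, so that a single example can be normalized without a mini-batch.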