Rectified Linear Unit
The rectified linear unit (ReLU), also called the linear rectification function, is a commonly used activation function in artificial neural networks. It usually refers to the nonlinear function given by the ramp function and its variants.
Features of the Rectified Linear Unit
Commonly used rectified linear units include the ramp function f(x) = max(0, x) and the leaky rectified linear unit (Leaky ReLU), where x denotes the input to the neuron.
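A minimal NumPy sketch of the two functions just mentioned; the function names and the leak coefficient alpha = 0.01 are illustrative choices, not values taken from the source.

```python
import numpy as np

def relu(x):
    # Ramp function: passes positive inputs through, clamps negatives to 0.
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: keeps a small slope alpha for negative inputs
    # instead of zeroing them out entirely.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))        # negatives become 0, positives pass through
print(leaky_relu(x))  # negatives are scaled by alpha instead of zeroed
```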
Rectified linear activation is believed to have some biological motivation. Because it usually performs better in practice than other commonly used activation functions, it is widely used in deep neural networks, including in computer vision tasks such as image recognition.
As a commonly used activation function in neural networks, ReLU retains the biological inspiration of the step function: when the input is positive, its derivative is nonzero, which allows gradient-based learning. When the input is negative, however, learning may slow down or the neuron may even "die" outright, because the gradient is zero for inputs below zero, so the corresponding weights are never updated and the neuron stays silent for the rest of training.
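The zero-gradient behavior described above can be made concrete with a small sketch. Assuming a hypothetical neuron whose pre-activation is negative for every example in a batch, the ReLU derivative is zero everywhere, so no gradient flows back to its weights.

```python
import numpy as np

def relu_grad(x):
    # Derivative of ReLU: 1 where the input is positive, 0 elsewhere.
    return (x > 0).astype(float)

# Hypothetical pre-activation values for one neuron across a mini-batch.
z = np.array([-3.2, -1.0, -0.4, -2.7])

# All inputs are negative, so every gradient is zero: the weights feeding
# this neuron receive no update from this batch (the "dying ReLU" effect).
print(relu_grad(z))  # [0. 0. 0. 0.]
```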