
Stochastic Gradient Descent

Stochastic gradient descent (SGD) is a variant of the gradient descent algorithm devised to address its main drawback: the cost of computing the gradient over the entire training set in every iteration. In stochastic gradient descent, each iteration uses only a single training example to update the parameters.
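
As a sketch of the idea (not from the original text), here is a minimal SGD loop in Python for a linear model y ≈ w·x + b with squared-error loss; the function name and hyperparameters are illustrative choices.

```python
import random

def sgd_linear_regression(xs, ys, lr=0.05, epochs=500):
    """Fit y ≈ w * x + b by stochastic gradient descent:
    every parameter update uses exactly one training example."""
    w, b = 0.0, 0.0
    indices = list(range(len(xs)))
    for _ in range(epochs):
        random.shuffle(indices)             # visit samples in random order
        for i in indices:
            err = (w * xs[i] + b) - ys[i]   # gradient of 0.5 * err**2
            w -= lr * err * xs[i]           # update from this ONE sample
            b -= lr * err
    return w, b

# Data lying exactly on y = 2x + 1; SGD should recover w ≈ 2, b ≈ 1.
w, b = sgd_linear_regression([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
print(f"w={w:.2f}, b={b:.2f}")
```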

Stochastic Gradient Descent Features

  • Advantages: fast training, since each update touches only one sample
  • Disadvantages: lower accuracy, no guarantee of reaching the global optimum, and hard to parallelize

Batch gradient descent minimizes the loss function over all training samples, so the solution it reaches is the global optimum; that is, the resulting parameters minimize the risk function. Stochastic gradient descent instead minimizes the loss of one sample at a time: an individual iteration does not necessarily move toward the global optimum, but the overall trajectory does, and the final result usually lands near the global optimum.
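
To make the contrast concrete, the two update rules can be written side by side (a standard formulation, stated here as an illustration rather than taken from the original text):

```latex
% Batch gradient descent: each update averages the gradient
% over all n training samples.
\theta \leftarrow \theta - \eta \,\nabla_\theta \frac{1}{n} \sum_{i=1}^{n} L_i(\theta)

% Stochastic gradient descent: each update uses the gradient of a
% single sample's loss, a noisy estimate of the batch direction
% (unbiased when i is drawn uniformly at random).
\theta \leftarrow \theta - \eta \,\nabla_\theta L_i(\theta)
```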

Parent term: Gradient Descent
Related terms: Batch Gradient Descent, Mini-Batch Gradient Descent