Stability and Convergence of Stochastic Gradient Clipping: Beyond Lipschitz Continuity and Smoothness

VV Mai, M Johansson - International Conference on Machine Learning, 2021 - proceedings.mlr.press
Abstract
Stochastic gradient algorithms are often unstable when applied to functions that do not have Lipschitz-continuous and/or bounded gradients. Gradient clipping is a simple and effective technique to stabilize the training process for problems that are prone to the exploding gradient problem. Despite its widespread popularity, the convergence properties of the gradient clipping heuristic are poorly understood, especially for stochastic problems. This paper establishes both qualitative and quantitative convergence results of the clipped stochastic (sub)gradient method (SGD) for non-smooth convex functions with rapidly growing subgradients. Our analyses show that clipping enhances the stability of SGD and that the clipped SGD algorithm enjoys finite convergence rates in many cases. We also study the convergence of a clipped method with momentum, which includes clipped SGD as a special case, for weakly convex problems under standard assumptions. With a novel Lyapunov analysis, we show that the proposed method achieves the best-known rate for the considered class of problems, demonstrating the effectiveness of clipped methods in this regime as well. Numerical results confirm our theoretical developments.
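
For readers unfamiliar with the clipped update the abstract refers to, the following is a minimal sketch of clip-by-norm stochastic (sub)gradient descent. It is an illustrative implementation only, not the authors' exact algorithm: the diminishing step-size schedule, the clipping threshold, and the toy objective with rapidly growing subgradients are assumptions chosen for the example.

```python
import numpy as np


def clipped_sgd(subgrad, x0, step_size=0.1, clip_threshold=1.0, n_iters=1000):
    """Run clipped stochastic (sub)gradient descent from x0.

    subgrad(x) should return a stochastic subgradient at x. Each step rescales
    the subgradient so its Euclidean norm never exceeds clip_threshold, which
    prevents single noisy or rapidly growing subgradients from blowing up the
    iterates (the "exploding gradient" problem).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        g = subgrad(x)
        norm = np.linalg.norm(g)
        # Clip by norm: scale g down whenever its norm exceeds the threshold.
        if norm > clip_threshold:
            g = g * (clip_threshold / norm)
        # Diminishing step size (assumed schedule for this sketch).
        x = x - (step_size / np.sqrt(k + 1)) * g
    return x


# Example: minimize f(x) = |x|^3, whose subgradients grow rapidly away from 0,
# using noisy subgradient evaluations.
rng = np.random.default_rng(0)
noisy_subgrad = lambda x: 3.0 * np.abs(x) * x + 0.1 * rng.standard_normal(x.shape)
print(clipped_sgd(noisy_subgrad, x0=np.array([5.0])))
```

On this kind of objective, plain SGD with a fixed step size can overshoot badly because the subgradient norm grows with the distance from the minimizer; clipping caps the effective step length, which is the stabilizing effect the paper analyzes.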