Incremental methods for weakly convex optimization

X Li, Z Zhu, AMC So, JD Lee - arXiv preprint arXiv:1907.11687, 2019 - arxiv.org
Incremental methods are widely utilized for solving finite-sum optimization problems in machine learning and signal processing. In this paper, we study a family of incremental methods, including the incremental subgradient, incremental proximal point, and incremental prox-linear methods, for solving weakly convex optimization problems. This problem class covers many nonsmooth nonconvex instances that arise in engineering fields. We show that all three incremental methods have an iteration complexity of $O(\varepsilon^{-4})$ for driving a natural stationarity measure to fall below $\varepsilon$. Moreover, we show that if the weakly convex function satisfies a sharpness condition, then all three incremental methods, when properly initialized and equipped with geometrically diminishing stepsizes, achieve a local linear rate of convergence. Our work is the first to extend the convergence rate analysis of incremental methods from the nonsmooth convex regime to the weakly convex regime. Lastly, we conduct numerical experiments on the robust matrix sensing problem to illustrate the convergence performance of the three incremental methods.
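To make the algorithmic idea concrete, below is a minimal sketch (not the authors' code) of one of the three methods, the incremental subgradient method, with geometrically diminishing stepsizes on a toy robust matrix sensing instance of the form min_U (1/m) * sum_i |<A_i, U U^T> - y_i|. The problem sizes, initial stepsize, and decay factor are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the incremental subgradient method with geometrically
# diminishing stepsizes on a toy robust matrix sensing problem.
# Hyperparameters (n, r, m, lam, rho, epochs) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 20, 2, 200                  # matrix size, rank, number of measurements
U_star = rng.standard_normal((n, r))  # ground-truth low-rank factor
A = rng.standard_normal((m, n, n))    # Gaussian sensing matrices
y = np.einsum("mij,ij->m", A, U_star @ U_star.T)  # clean measurements

def component_subgrad(U, i):
    """Subgradient of f_i(U) = |<A_i, U U^T> - y_i| with respect to U."""
    residual = np.sum(A[i] * (U @ U.T)) - y[i]
    return np.sign(residual) * (A[i] + A[i].T) @ U

# Initialize near the ground truth; the paper's linear-rate result is local.
U = U_star + 0.1 * rng.standard_normal((n, r))

lam, rho, epochs = 1e-3, 0.98, 200    # initial stepsize and geometric decay factor
for epoch in range(epochs):
    for i in range(m):                # one incremental (cyclic) pass over components
        U -= lam * component_subgrad(U, i)
    lam *= rho                        # geometrically diminishing stepsize
    loss = np.mean(np.abs(np.einsum("mij,ij->m", A, U @ U.T) - y))
    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  stepsize {lam:.2e}  loss {loss:.3e}")
```

Swapping the inner update for a proximal-point or prox-linear step on each component would give sketches of the other two methods analyzed in the paper.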