The geometric effects of distributing constrained nonconvex optimization problems

Q Li, X Yang, Z Zhu, G Tang… - 2019 IEEE 8th International Workshop on Computational Advances in …, 2019 - ieeexplore.ieee.org
A variety of nonconvex machine learning problems have recently been shown to have benign geometric landscapes, in which there are no spurious local minima and all saddle points are strict saddles, at which the Hessian has at least one negative eigenvalue. For such problems, a variety of algorithms can converge to global minimizers. We present a general result relating the geometry of a centralized problem to that of its distributed extension; our result is new in considering the scenario where the centralized problem obeys a manifold constraint, such as when the variables are normalized to the sphere. We show that the first/second-order stationary points of the centralized and distributed problems are in one-to-one correspondence, implying that the distributed problem, despite its additional variables and constraints, can inherit the benign geometry of its centralized counterpart. We apply this result to show that the distributed matrix eigenvalue problem, the multichannel blind deconvolution problem, and the dictionary learning problem all enjoy benign geometric landscapes.
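To make the strict-saddle notion concrete, here is a minimal numerical sketch (an illustrative example, not code from the paper) for the centralized matrix eigenvalue problem the abstract mentions: minimizing f(x) = -x^T A x over the unit sphere. Its critical points are the eigenvectors of A, and at any non-extremal eigenvector the Riemannian Hessian has a negative eigenvalue, i.e., the point is a strict saddle. All symbols below (A, v, lam, P, H) are local to this sketch.

```python
import numpy as np

# Sketch: for  min_{||x||=1} f(x) = -x^T A x  with A symmetric, the critical
# points are eigenvectors of A. We verify at a middle eigenvector that the
# Riemannian gradient vanishes and the Riemannian Hessian has a negative
# eigenvalue (the strict-saddle property described in the abstract).

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                          # random symmetric matrix

eigvals, eigvecs = np.linalg.eigh(A)       # eigenvalues in ascending order
k = n // 2                                 # a middle (non-extremal) eigenvector
v, lam = eigvecs[:, k], eigvals[k]

P = np.eye(n) - np.outer(v, v)             # projector onto the tangent space at v

# Riemannian gradient on the sphere: grad f(v) = P(-2 A v), which is 0 at an
# eigenvector, so v is a first-order stationary point.
grad = P @ (-2 * A @ v)
print("|grad| =", np.linalg.norm(grad))

# Riemannian Hessian at v, restricted to the tangent space:
#   Hess f(v)[u] = P(-2 A u) + 2 (v^T A v) u   for tangent directions u (u ⊥ v).
H = P @ (-2 * A) @ P + 2 * lam * P

# On the tangent space, H has eigenvalues 2(lam - lam_j) for j != k, so any
# direction toward a larger eigenvalue of A gives strictly negative curvature.
hess_eigs = np.linalg.eigvalsh(H)          # includes one 0 along v itself
print("Hessian spectrum:", np.round(hess_eigs, 4))
assert hess_eigs.min() < -1e-8, "expected negative curvature (strict saddle)"
```

The paper's result concerns the distributed extension of such problems, where each node holds a copy of the variable under consensus constraints; the sketch above only illustrates the centralized strict-saddle property that the distributed problem is shown to inherit.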