no code implementations • 23 Mar 2020 • Brian Swenson, Soummya Kar, H. Vincent Poor, José M. F. Moura, Aaron Jaech
We discuss convergence guarantees to local minima and explore the simple but critical role of the stable-manifold theorem in analyzing saddle-point avoidance.
Optimization and Control
no code implementations • 5 Mar 2020 • Brian Swenson, Ryan Murray, Soummya Kar, H. Vincent Poor
In centralized settings, it is well known that stochastic gradient descent (SGD) avoids saddle points and converges to local minima in nonconvex problems.
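The saddle-avoidance behavior this entry refers to can be illustrated with a minimal sketch (not the paper's algorithm or setting): plain gradient descent initialized exactly at a saddle point never moves, while SGD's noise pushes the iterates off the saddle's stable manifold and down toward a local minimum. The test function, step size, and noise level below are all illustrative choices.

```python
import random

def grad(x, y):
    # f(x, y) = (x**2 - 1)**2 + y**2: local minima at (+-1, 0), saddle at (0, 0)
    return 4 * x * (x * x - 1), 2 * y

random.seed(0)
x, y = 0.0, 0.0            # start exactly at the saddle, where the gradient is zero
lr, sigma = 0.05, 0.01     # illustrative step size and noise scale
for _ in range(2000):
    gx, gy = grad(x, y)
    # noisy gradient step: the perturbation breaks the symmetry that
    # would otherwise trap deterministic gradient descent at the saddle
    x -= lr * (gx + sigma * random.gauss(0, 1))
    y -= lr * (gy + sigma * random.gauss(0, 1))

print(x, y)  # |x| close to 1, y close to 0: a local minimum, not the saddle
```

Deterministic gradient descent from the same start would print `0.0 0.0` forever; the noise is what makes escape happen, which is the intuition behind the nonconvergence-to-saddles results.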
Optimization and Control
no code implementations • 18 Mar 2019 • Brian Swenson, Soummya Kar, H. Vincent Poor, José M. F. Moura
The paper considers distributed algorithms for optimizing a sum of component objective functions.
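A standard template for this problem class (a hypothetical sketch, not the paper's method) is decentralized gradient descent: each agent holds one component function, mixes its estimate with its neighbors', and takes a local gradient step. Here each agent $i$ holds the illustrative quadratic $f_i(x) = (x - a_i)^2$, so the sum is minimized at the mean of the $a_i$; the ring topology, weights, and step size are assumptions.

```python
# Each agent i holds f_i(x) = (x - a_i)^2; the sum is minimized at mean(a) = 4.0.
a = [1.0, 3.0, 5.0, 7.0]   # local data, one value per agent
n = len(a)
x = [0.0] * n              # each agent's local estimate of the minimizer
lr = 0.05                  # illustrative constant step size
for _ in range(500):
    # mixing step: average with ring neighbors (doubly stochastic 1/3 weights)
    mixed = [(x[i - 1] + x[i] + x[(i + 1) % n]) / 3 for i in range(n)]
    # local gradient step on f_i only
    x = [mixed[i] - lr * 2 * (mixed[i] - a[i]) for i in range(n)]

print(x)  # every agent's estimate is near the global minimizer 4.0
```

With a constant step size the agents agree only up to an O(lr) bias, which is why analyses of such schemes typically use diminishing step sizes to get exact convergence.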
no code implementations • 1 Jan 2019 • Soummya Kar, Brian Swenson
By appropriate choice of $\rho$, the set of generalized minima may be brought arbitrarily close to the set of Lloyd's minima.
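For context, the "Lloyd's minima" named here are the fixed points of Lloyd's classical quantizer-design iteration (the k-means update). A minimal sketch of that baseline iteration on 1-D data, with illustrative data and initial codepoints:

```python
# Lloyd's algorithm on 1-D data: alternate nearest-center assignment
# with centroid updates until the centers stop moving.
data = [1.0, 1.2, 0.8, 5.0, 5.2, 4.8]
centers = [0.0, 1.0]       # two codepoints, arbitrary initialization

for _ in range(20):
    # assignment step: each point joins the cell of its nearest center
    cells = {i: [] for i in range(len(centers))}
    for p in data:
        nearest = min(range(len(centers)), key=lambda i: (p - centers[i]) ** 2)
        cells[nearest].append(p)
    # update step: each center moves to the centroid of its cell
    # (an empty cell keeps its old center)
    centers = [sum(c) / len(c) if c else centers[i] for i, c in cells.items()]

print(sorted(round(c, 2) for c in centers))  # -> [1.0, 5.0]
```

The iteration only guarantees such locally optimal fixed points, which is the gap the entry's $\rho$-parameterized generalized minima are designed to approach.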