The connections between Lyapunov functions for some optimization algorithms and differential equations

1 Sep 2020 · J. M. Sanz-Serna, Konstantinos C. Zygalakis

In this manuscript, we study the properties of a family of second-order differential equations with damping, its discretizations, and their connections with accelerated optimization algorithms for $m$-strongly convex and $L$-smooth functions. In particular, using the Linear Matrix Inequality (LMI) framework developed by Fazlyab et al. (2018), we derive analytically a (discrete) Lyapunov function for a two-parameter family of Nesterov optimization methods, which allows for the complete characterization of their convergence rate. In the appropriate limit, this family of methods may be seen as a discretization of a family of second-order ordinary differential equations, for which we construct (continuous) Lyapunov functions by means of the LMI framework. The continuous Lyapunov functions may alternatively be obtained by studying the limiting behaviour of their discrete counterparts. Finally, we show that the majority of typical discretizations of the family of ODEs, such as the heavy ball method, do not possess Lyapunov functions with properties similar to those of the Lyapunov function constructed here for the Nesterov method.
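For orientation, the sketch below records one representative member of this class of damped second-order ODEs together with a standard continuous Lyapunov function from the accelerated-optimization literature, followed by the textbook Nesterov iteration for $m$-strongly convex, $L$-smooth functions that can be viewed as one of its discretizations. These formulas are illustrative only: they do not reproduce the paper's two-parameter family or its LMI-derived Lyapunov functions.

% Illustrative example (not the paper's exact family): a damped second-order ODE,
% a standard continuous Lyapunov function, and its exponential decay rate.
\[
  \ddot{x}(t) + 2\sqrt{m}\,\dot{x}(t) + \nabla f\bigl(x(t)\bigr) = 0,
  \qquad
  V(t) = f\bigl(x(t)\bigr) - f(x^\star)
       + \tfrac{1}{2}\bigl\|\dot{x}(t) + \sqrt{m}\,\bigl(x(t) - x^\star\bigr)\bigr\|^2,
\]
\[
  \dot{V}(t) \le -\sqrt{m}\,V(t)
  \quad\Longrightarrow\quad
  V(t) \le e^{-\sqrt{m}\,t}\,V(0).
\]
% Illustrative discretization: the classical Nesterov scheme for strongly convex f,
% whose standard analysis yields a contraction factor of 1 - sqrt(m/L) per iteration.
\[
  x_{k+1} = y_k - \tfrac{1}{L}\nabla f(y_k),
  \qquad
  y_{k+1} = x_{k+1} + \frac{\sqrt{L} - \sqrt{m}}{\sqrt{L} + \sqrt{m}}\,(x_{k+1} - x_k).
\]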
