Title: Revisiting High-Resolution ODEs for Faster Convergence Rates
Abstract: We consider the unconstrained minimization of strongly convex smooth functions by first-order accelerated methods. High-resolution ordinary differential equations (ODEs) are studied to understand the behaviour of Nesterov's accelerated gradient algorithm. We propose a new general framework and introduce a new general Lyapunov function for the convergence proof. Using integral quadratic constraints from robust control theory, we justify our choice of Lyapunov function. Through an alternative ODE representation, Nesterov's algorithm is explained exactly. Our continuous-time analysis yields improved convergence results for the ODE of the triple momentum method. After discretization, several known methods arise as special cases of our framework, and a better convergence rate is obtained for the quasi-hyperbolic momentum algorithm. To the best of our knowledge, this is the first work to treat the continuous-time ODE representation as an effective factor in first-order accelerated algorithms.
This is a joint work with Armin Eftekhari (Umeå University) and Konstantinos Zygalakis (Edinburgh University).
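For readers unfamiliar with the baseline algorithm the abstract refers to, here is a minimal sketch of Nesterov's accelerated gradient method for an m-strongly convex, L-smooth function. This is a standard textbook formulation for illustration only, not the framework, Lyapunov analysis, or discretizations of the talk; the function names and the example quadratic are our own.

```python
import math

def nesterov(grad, x0, m, L, iters):
    """Nesterov's accelerated gradient for an m-strongly convex,
    L-smooth objective, with the constant momentum coefficient
    beta = (sqrt(L) - sqrt(m)) / (sqrt(L) + sqrt(m))."""
    beta = (math.sqrt(L) - math.sqrt(m)) / (math.sqrt(L) + math.sqrt(m))
    x = list(x0)  # main iterate
    y = list(x0)  # extrapolated point where the gradient is evaluated
    for _ in range(iters):
        g = grad(y)
        # Gradient step from the extrapolated point.
        x_next = [yi - gi / L for yi, gi in zip(y, g)]
        # Momentum (extrapolation) step.
        y = [xn + beta * (xn - xi) for xn, xi in zip(x_next, x)]
        x = x_next
    return x

# Example: ill-conditioned quadratic f(x) = 0.5*(x1^2 + 100*x2^2),
# so m = 1 and L = 100; the minimizer is the origin.
a = [1.0, 100.0]
grad = lambda x: [ai * xi for ai, xi in zip(a, x)]
x_star = nesterov(grad, [1.0, 1.0], m=1.0, L=100.0, iters=500)
```

High-resolution ODEs, in this context, are continuous-time models of iterations like the one above that retain step-size-dependent correction terms, which is what allows the continuous-time analysis to distinguish between different accelerated methods.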