Nesterov’s Accelerated Gradient Descent: The Controlled Contraction Approach
- Resource Type
- Periodical
- Authors
- Gunjal, R.; Nayyer, S.S.; Wagh, S.; Singh, N.M.
- Source
- IEEE Control Systems Letters, vol. 8, pp. 163–168, 2024
- Subject
- Robotics and Control Systems
- Computing and Processing
- Components, Circuits, Devices and Systems
- Manifolds
- Trajectory
- Optimization
- Gradient methods
- Dynamical systems
- Convergence
- Aerospace electronics
- Contraction theory
- Nesterov’s Accelerated Gradient Descent
- optimization
- P&I approach
- Language
- ISSN
- 2475-1456
Nesterov’s Accelerated Gradient (NAG) algorithm is a popular method that provides faster convergence to the optimal solution of an optimization problem. Despite its popularity, the origin of the algorithm remains a conceptual mystery, which motivates the proposed control-theoretic perspective. This letter derives the second-order ODE for Nesterov’s Accelerated Gradient algorithm for strongly convex functions (NAG-SC) through the notions of manifold stabilization, achieved with the recently introduced P&I approach, and the persistence of an invariant manifold. Furthermore, the contraction of Nesterov’s flows under control actions (i.e., Controlled Contraction (CC)) is proved. The contraction of Nesterov’s flows not only ensures stable numerical integration but also reveals the multiple potentials of the NAG-SC method, motivating its usefulness in machine learning and deep learning applications.
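For context, the discrete NAG-SC iteration that the letter's second-order ODE models can be sketched as follows. This is the standard textbook form of Nesterov's method for a μ-strongly convex, L-smooth function, not the paper's P&I derivation; the function names and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def nag_sc(grad, x0, mu, L, iters=200):
    """Standard NAG-SC iteration for a mu-strongly convex, L-smooth f
    (illustrative sketch; not the paper's ODE-based construction):
        y_{k+1} = x_k - s * grad(x_k),            s = 1/L
        x_{k+1} = y_{k+1} + beta * (y_{k+1} - y_k),
        beta = (1 - sqrt(mu*s)) / (1 + sqrt(mu*s))
    """
    s = 1.0 / L
    beta = (1 - np.sqrt(mu * s)) / (1 + np.sqrt(mu * s))
    x, y = x0.copy(), x0.copy()
    for _ in range(iters):
        y_next = x - s * grad(x)          # gradient step
        x = y_next + beta * (y_next - y)  # momentum extrapolation
        y = y_next
    return y

# Example: f(x) = 0.5 * x^T A x with eigenvalues in [mu, L] = [1, 10]
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_star = nag_sc(grad, np.array([5.0, -3.0]), mu=1.0, L=10.0)
```

On this quadratic the iterates contract toward the minimizer at the origin at the accelerated linear rate, consistent with the convergence behavior the abstract attributes to NAG-SC.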