On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions

Etienne de Klerk, Francois Glineur, Adrien Taylor

Research output: Working paper › Other research output

40 Citations (Scopus)

Abstract

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance.
The proof is computer-assisted, and relies on the solution of semidefinite programming performance estimation problems as introduced in [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].
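As a brief illustration (a minimal sketch, not code from the paper): for a convex quadratic f(x) = (1/2) x^T A x with the eigenvalues of A lying in [mu, L], the exact line-search step along the negative gradient g has the closed form alpha = (g^T g) / (g^T A g), and the exact worst-case rate established in the paper reads f(x_{k+1}) - f(x*) <= ((L - mu)/(L + mu))^2 (f(x_k) - f(x*)). The Python snippet below runs the method on such a quadratic from a starting point on which this bound is tight; the specific matrix and starting point are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the authors' code): steepest descent with exact
# line search on f(x) = 0.5 * x^T A x. With eig(A) in [mu, L], f is
# mu-strongly convex and its gradient is L-Lipschitz.
mu, L = 1.0, 10.0
A = np.diag([mu, L])               # simplest quadratic with curvatures mu and L
x = np.array([1.0 / mu, 1.0 / L])  # starting point on which the bound is tight

f = lambda z: 0.5 * z @ A @ z      # minimizer is x* = 0 with f(x*) = 0
rate = ((L - mu) / (L + mu)) ** 2  # exact worst-case contraction factor
f0 = f(x)

for k in range(5):
    g = A @ x                      # gradient of the quadratic
    alpha = (g @ g) / (g @ A @ g)  # exact line search along -g (closed form)
    x = x - alpha * g
    # on this instance, f decreases by exactly the factor `rate` per iteration
    print(f"iter {k + 1}: f ratio = {f(x) / f0:.6e}, bound = {rate ** (k + 1):.6e}")
```

Running the sketch shows the function-value ratio matching the worst-case bound at every iteration, reflecting the classical zigzagging behavior of exact line search on an ill-conditioned quadratic.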
Original language: English
Place of Publication: Ithaca
Publisher: Cornell University Library
Number of pages: 10
Publication status: Published - 30 Jun 2016

Publication series

Name: arXiv
Volume: arXiv:1606.09365

