On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions

Etienne de Klerk, Francois Glineur, Adrien Taylor

Research output: Working paper › Other research output

Abstract

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance.
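To make the statement concrete, here is a minimal numpy sketch (not from the paper) that runs exact-line-search gradient descent on a two-dimensional quadratic of the kind that attains the worst case. The constants mu and L, the matrix A = diag(mu, L), and the starting point are illustrative choices; for a quadratic, the exact line-search step has the closed form alpha = (g @ g) / (g @ A @ g), and the per-step contraction of the function gap should match the factor ((L - mu)/(L + mu))**2.

```python
import numpy as np

mu, L = 0.1, 1.0                     # strong convexity / smoothness constants
A = np.diag([mu, L])                 # worst-case quadratic: f(x) = 0.5 x^T A x
rho = (L - mu) / (L + mu)            # contraction factor in the worst-case rate

def f(x):
    return 0.5 * x @ A @ x

x = np.array([1.0 / mu, 1.0 / L])    # starting point that triggers the zigzag
gaps = [f(x)]
for _ in range(10):
    g = A @ x                        # gradient of the quadratic
    alpha = (g @ g) / (g @ A @ g)    # exact line-search step (closed form here)
    x = x - alpha * g
    gaps.append(f(x))

# Each ratio of successive gaps equals rho**2: the bound is attained at every step.
for k in range(10):
    print(gaps[k + 1] / gaps[k], rho ** 2)
```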
The proof is computer-assisted, and relies on the resolution of semidefinite programming performance estimation problems, as introduced in [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].
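For readers curious what such a performance estimation problem looks like, below is a hand-rolled CVXPY sketch of a one-step problem: maximize the objective gap after one exact-line-search step over all L-smooth, mu-strongly convex functions, encoded through a Gram matrix, the smooth strongly convex interpolation conditions of Taylor, Hendrickx, and Glineur, and a relaxed model of exact line search (two orthogonality conditions implied by exact minimization along the line). This is an illustrative reconstruction under those assumptions, not the authors' exact formulation; its optimal value should come out close to ((L - mu)/(L + mu))**2.

```python
import numpy as np
import cvxpy as cp

mu, L = 0.1, 1.0

# Gram matrix of the vectors (x0 - x*, g0, x1 - x*, g1); we place x* at the
# origin with g* = 0 and f* = 0, so all needed inner products are linear in G.
G = cp.Variable((4, 4), PSD=True)
f0, f1 = cp.Variable(), cp.Variable()

e = np.eye(4)
x0, g0, x1, g1 = e[0], e[1], e[2], e[3]
xs = np.zeros(4)                     # x* (and g*) represented by the zero vector
fs = 0.0

def inner(u, v):
    return u @ G @ v

def interp(xi, gi, fi, xj, gj, fj):
    # Interpolation inequality characterizing L-smooth, mu-strongly convex
    # functions (Taylor-Hendrickx-Glineur).
    return (fi - fj - inner(gj, xi - xj)
            >= (inner(gi - gj, gi - gj) / L
                + mu * inner(xi - xj, xi - xj)
                - (2 * mu / L) * inner(gj - gi, xj - xi)) / (2 * (1 - mu / L)))

points = [(xs, xs, fs), (x0, g0, f0), (x1, g1, f1)]
constraints = [interp(*p, *q) for p in points for q in points if p is not q]

# Relaxed exact line search: the new gradient is orthogonal to the search
# direction g0 and to the step x1 - x0.
constraints += [inner(g1, g0) == 0, inner(g1, x1 - x0) == 0]
constraints += [f0 - fs == 1]        # normalize the initial objective gap

problem = cp.Problem(cp.Maximize(f1 - fs), constraints)
problem.solve()
print(problem.value, ((L - mu) / (L + mu)) ** 2)   # both should be ~0.669
```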
Original language: English
Place of publication: Ithaca
Publisher: Cornell University Library
Number of pages: 10
Publication status: Published - 30 Jun 2016

Publication series

Name: arXiv
Volume: arXiv:1606.09365

Fingerprint

Line Search, Gradient Method, Convex Function, Gradient Descent Method, Gradient, Convex Minimization, Steepest Descent Method, Semidefinite Programming, Quadratic Function, Mathematical Programming, Lipschitz, Tolerance, Rate of Convergence, First-order

Cite this

de Klerk, E., Glineur, F., & Taylor, A. (2016). On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions. (arXiv; Vol. arXiv:1606.09365). Ithaca: Cornell University Library.
@techreport{e290e913c33a4c838dc82cecf27f1fae,
title = "On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions",
abstract = "We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of gradient descent method, where exact line-search is performed in a search direction that differs from negative gradient by at most a prescribed relative tolerance. The proof is computer-assisted, and relies on the resolution of semidefinite programming performance estimation problems as introduced in the paper [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].",
author = "{de Klerk}, Etienne and Francois Glineur and Adrien Taylor",
year = "2016",
month = "6",
day = "30",
language = "English",
series = "arXiv",
publisher = "Cornell University Library",
type = "WorkingPaper",
institution = "Cornell University Library",

}
