On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

Etienne de Klerk, Francois Glineur, Adrien Taylor

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case convergence rate of this scheme and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted and rely on solving semidefinite programming performance estimation problems as introduced by Drori and Teboulle (Math Program 145(1–2):451–482, 2014).
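
A minimal numerical illustration (not taken from the paper): for a μ-strongly convex function with L-Lipschitz gradient, the worst-case rate in question is the classical per-iteration ratio ((L − μ)/(L + μ))² for f(x_k) − f*, and it is attained on a convex quadratic. The Python sketch below assumes the two-dimensional quadratic f(x) = ½(μx₁² + Lx₂²) with Hessian diag(μ, L) and uses the closed-form exact line-search step for quadratics; the observed per-step ratio matches the bound exactly.

import numpy as np

# Illustration only: steepest descent with exact line search on
# f(x) = 0.5*(mu*x1^2 + L*x2^2), a mu-strongly convex function with
# L-Lipschitz gradient; the classical worst-case per-step ratio for
# f - f* is ((L - mu)/(L + mu))^2.
mu, L = 1.0, 10.0
H = np.diag([mu, L])                 # Hessian of the quadratic
f = lambda x: 0.5 * x @ H @ x        # minimizer x* = 0, so f* = 0

x = np.array([1.0 / mu, 1.0 / L])    # starting point that makes the bound tight
bound = ((L - mu) / (L + mu)) ** 2

for k in range(5):
    g = H @ x                        # gradient at the current iterate
    t = (g @ g) / (g @ H @ g)        # exact line-search step for a quadratic
    x_next = x - t * g
    print(f"step {k}: ratio {f(x_next) / f(x):.6f}  bound {bound:.6f}")
    x = x_next

The noisy variant from the abstract can be sketched the same way: replace g by any direction d with ‖d − g‖ ≤ ε‖g‖ and take the step t = (gᵀd)/(dᵀHd) along −d, which is again the exact line-search minimizer for a quadratic; the tight rate as a function of ε is derived in the paper and is not restated here.
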
Language: English
Pages: 1185–1199
Number of pages: 15
Journal: Optimization Letters
Volume: 11
Issue number: 7
DOIs: 10.1007/s11590-016-1087-4
State: Published - Oct 2017

Keywords

  • gradient method
  • steepest descent
  • semidefinite programming
  • performance estimation problem

Cite this

@article{8cc0e8ddb6cd4f4f9dcb9531d5df7a75,
title = "On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions",
abstract = "We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of gradient descent method, where exact line-search is performed in a search direction that differs from negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the resolutions of semidefinite programming performance estimation problems as introduced in the paper (Drori and Teboulle, Math Progr 145(1–2):451–482, 2014).",
keywords = "gradient method, steepest descent, semidefinite programming, performance estimation problem",
author = "{de Klerk}, Etienne and Francois Glineur and Adrien Taylor",
year = "2017",
month = "10",
doi = "10.1007/s11590-016-1087-4",
language = "English",
volume = "11",
pages = "1185–1199",
journal = "Optimization Letters",
issn = "1862-4472",
publisher = "Springer Verlag",
number = "7",

}

On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions. / de Klerk, Etienne; Glineur, Francois; Taylor, Adrien.

In: Optimization Letters, Vol. 11, No. 7, 10.2017, p. 1185–1199.

Research output: Contribution to journal › Article › Scientific › peer-review

TY - JOUR
T1 - On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
AU - de Klerk, Etienne
AU - Glineur, Francois
AU - Taylor, Adrien
PY - 2017/10
Y1 - 2017/10
N2 - We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of gradient descent method, where exact line-search is performed in a search direction that differs from negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the resolutions of semidefinite programming performance estimation problems as introduced in the paper (Drori and Teboulle, Math Progr 145(1–2):451–482, 2014).
AB - We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of gradient descent method, where exact line-search is performed in a search direction that differs from negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the resolutions of semidefinite programming performance estimation problems as introduced in the paper (Drori and Teboulle, Math Progr 145(1–2):451–482, 2014).
KW - gradient method
KW - steepest descent
KW - semidefinite programming
KW - performance estimation problem
UR - http://rdcu.be/ldxh
U2 - 10.1007/s11590-016-1087-4
DO - 10.1007/s11590-016-1087-4
M3 - Article
VL - 11
SP - 1185
EP - 1199
JO - Optimization Letters
T2 - Optimization Letters
JF - Optimization Letters
SN - 1862-4472
IS - 7
ER -