Abstract
We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted and rely on solving semidefinite programming performance estimation problems, as introduced by Drori and Teboulle (Math. Program. 145(1–2):451–482, 2014).
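For intuition, here is a minimal Python sketch, not taken from the paper, of gradient descent with exact line search on a strongly convex quadratic f(x) = ½ xᵀAx. For a quadratic, the exact line-search step along the negative gradient g has the closed form γ = gᵀg / (gᵀAg). The parameter values μ = 1 and L = 10 and the starting point are illustrative choices; with them, each iteration contracts f(x_k) by exactly ((L−μ)/(L+μ))², the classical worst-case factor for this scheme.

```python
import numpy as np

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x, where the
# strong-convexity parameter mu and the gradient Lipschitz constant L are
# the extreme eigenvalues of A. The values below are arbitrary choices.
mu, L = 1.0, 10.0
A = np.diag([mu, L])

def f(x):
    return 0.5 * x @ A @ x

rate = ((L - mu) / (L + mu)) ** 2   # classical per-iteration contraction factor
x = np.array([1.0 / mu, 1.0 / L])   # starting point that triggers the zig-zag behavior

for k in range(10):
    g = A @ x                        # gradient of the quadratic
    gamma = (g @ g) / (g @ A @ g)    # exact line search along -g (closed form)
    x_new = x - gamma * g
    # The observed per-step ratio matches the factor ((L - mu)/(L + mu))^2.
    print(k, f(x_new) / f(x), rate)
    x = x_new
```

Running this prints a constant ratio equal to `rate` at every iteration, illustrating how a convex quadratic can attain the worst-case contraction factor step after step.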
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 1185–1199 |
| Number of pages | 15 |
| Journal | Optimization Letters |
| Volume | 11 |
| Issue number | 7 |
| DOIs | |
| Publication status | Published - Oct 2017 |
Keywords
- gradient method
- steepest descent
- semidefinite programming
- performance estimation problem