### Abstract

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the solution of semidefinite programming performance estimation problems, as introduced by Drori and Teboulle (Math. Program. 145(1–2):451–482, 2014).
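The paper's headline result is that, for an L-smooth and μ-strongly convex function, gradient descent with exact line search contracts the objective gap by the factor ((L−μ)/(L+μ))² per iteration, and that this rate is attained by a convex quadratic. The sketch below (pure Python; the choices of mu, L, and the starting point are illustrative, not taken from the paper) runs exact-line-search steepest descent on such a quadratic, where the exact step length has the closed form t = gᵀg / gᵀAg:

```python
# Steepest descent with exact line search on f(x) = 0.5*(mu*x1^2 + L*x2^2),
# a convex quadratic of the kind that attains the worst-case rate.
mu, L = 1.0, 10.0                   # strong convexity / gradient Lipschitz constants
rho = ((L - mu) / (L + mu)) ** 2    # worst-case contraction factor per iteration

def f(x):
    return 0.5 * (mu * x[0] ** 2 + L * x[1] ** 2)

def grad(x):
    return [mu * x[0], L * x[1]]

def exact_line_search_step(x):
    g = grad(x)
    # For a quadratic with Hessian A = diag(mu, L), the exact minimizing
    # step length along -g is t = (g.g) / (g.A g).
    t = (g[0] ** 2 + g[1] ** 2) / (mu * g[0] ** 2 + L * g[1] ** 2)
    return [x[0] - t * g[0], x[1] - t * g[1]]

x = [1.0 / mu, 1.0 / L]             # starting point with balanced gradient components
vals = [f(x)]
for _ in range(5):
    x = exact_line_search_step(x)
    vals.append(f(x))

ratios = [vals[k + 1] / vals[k] for k in range(5)]
print(ratios)                       # each ratio equals rho = ((L-mu)/(L+mu))**2 ≈ 0.6694
```

From this starting point the gradient components stay balanced in magnitude, so the iterates zigzag and every step contracts f by exactly rho, matching the tight bound.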

| Language | English |
|---|---|
| Pages | 1185–1199 |
| Number of pages | 15 |
| Journal | Optimization Letters |
| Volume | 11 |
| Issue number | 7 |
| DOIs | 10.1007/s11590-016-1087-4 |
| State | Published - Oct 2017 |

### Keywords

- gradient method
- steepest descent
- semidefinite programming
- performance estimation problem

### Cite this

de Klerk, E., Glineur, F., & Taylor, A. (2017). On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions. *Optimization Letters*, *11*(7), 1185–1199. DOI: 10.1007/s11590-016-1087-4

Research output: Contribution to journal › Article › Scientific › peer-review

TY - JOUR

T1 - On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

AU - de Klerk, Etienne

AU - Glineur, Francois

AU - Taylor, Adrien

PY - 2017/10

Y1 - 2017/10

N2 - We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of gradient descent method, where exact line-search is performed in a search direction that differs from negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the resolutions of semidefinite programming performance estimation problems as introduced in the paper (Drori and Teboulle, Math Progr 145(1–2):451–482, 2014).

AB - We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of gradient descent method, where exact line-search is performed in a search direction that differs from negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the resolutions of semidefinite programming performance estimation problems as introduced in the paper (Drori and Teboulle, Math Progr 145(1–2):451–482, 2014).

KW - gradient method

KW - steepest descent

KW - semidefinite programming

KW - performance estimation problem

UR - http://rdcu.be/ldxh

U2 - 10.1007/s11590-016-1087-4

DO - 10.1007/s11590-016-1087-4

M3 - Article

VL - 11

SP - 1185

EP - 1199

JO - Optimization Letters

T2 - Optimization Letters

JF - Optimization Letters

SN - 1862-4472

IS - 7

ER -