### Abstract

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proof is computer-assisted, and relies on the resolution of semidefinite programming performance estimation problems as introduced in the paper [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].
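
For context: the exact worst-case rate referred to in the abstract is the classical per-iteration contraction attained by quadratics, f(x_{k+1}) − f* ≤ ((L − μ)/(L + μ))² · (f(x_k) − f*), where μ is the strong convexity constant and L the Lipschitz constant of the gradient. The following minimal Python sketch (illustrative only, not code from the paper; the constants μ = 1, L = 10 and the starting point are assumed worst-case choices for a diagonal quadratic) runs steepest descent with exact line search and compares the observed ratio of function values to this bound.

```python
import numpy as np

# Illustrative sketch (not from the paper): steepest descent with exact
# line search on the convex quadratic f(x) = 0.5 * x^T A x with
# A = diag(mu, L). On quadratics the exact line-search step size has the
# closed form t = (g^T g) / (g^T A g).
mu, L = 1.0, 10.0                  # assumed strong convexity / smoothness constants
A = np.diag([mu, L])

def f(x):
    return 0.5 * x @ A @ x

rate = ((L - mu) / (L + mu)) ** 2  # worst-case per-iteration ratio
x = np.array([1.0 / mu, 1.0 / L])  # starting point that triggers the worst case
for k in range(5):
    g = A @ x                      # gradient of the quadratic
    t = (g @ g) / (g @ A @ g)      # exact line-search step size
    x_next = x - t * g
    print(f"iter {k}: f-ratio = {f(x_next) / f(x):.6f}, bound = {rate:.6f}")
    x = x_next
```

On this quadratic every iteration attains the bound exactly (f-ratio ≈ 0.669421 = (9/11)²), which illustrates the abstract's claim that the worst-case behavior is exhibited by a convex quadratic function.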

| Original language | English |
| --- | --- |
| Place of Publication | Ithaca |
| Publisher | Cornell University Library |
| Number of pages | 10 |
| Publication status | Published - 30 Jun 2016 |

### Publication series

| Name | arXiv |
| --- | --- |
| Volume | arXiv:1606.09365 |

### Cite this

de Klerk, E., Glineur, F., & Taylor, A. (2016). *On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions*. (arXiv; Vol. arXiv:1606.09365). Ithaca: Cornell University Library.


**On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions.** / de Klerk, Etienne; Glineur, Francois; Taylor, Adrien.

Research output: Working paper › Other research output

TY - UNPB

T1 - On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions

AU - de Klerk, Etienne

AU - Glineur, Francois

AU - Taylor, Adrien

PY - 2016/6/30

Y1 - 2016/6/30

N2 - We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proof is computer-assisted, and relies on the resolution of semidefinite programming performance estimation problems as introduced in the paper [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].

AB - We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proof is computer-assisted, and relies on the resolution of semidefinite programming performance estimation problems as introduced in the paper [Y. Drori and M. Teboulle. Performance of first-order methods for smooth convex minimization: a novel approach. Mathematical Programming, 145(1-2):451-482, 2014].

M3 - Working paper

T3 - arXiv

BT - On the Worst-Case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions

PB - Cornell University Library

CY - Ithaca

ER -