Abstract
In this paper, we derive a new linear convergence rate for the gradient method with fixed step lengths applied to non-convex smooth optimization problems satisfying the Polyak-Łojasiewicz (PŁ) inequality. We establish that the PŁ inequality is a necessary and sufficient condition for linear convergence to the optimal value for this class of problems. We also list some related classes of functions for which the gradient method may enjoy a linear convergence rate, and we investigate their relationship with the PŁ inequality.
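For context, a standard form of the PŁ inequality for an L-smooth function f with optimal value f\* is the following (this is the classical statement; the paper's new rate and its exact constants are not reproduced here):

```latex
% PL inequality for an L-smooth f with optimal value f^*:
\frac{1}{2}\,\lVert \nabla f(x) \rVert^2 \;\ge\; \mu \,\bigl( f(x) - f^\star \bigr)
\quad \text{for all } x .
% Under this condition, gradient descent with fixed step length 1/L satisfies
f(x_{k+1}) - f^\star \;\le\; \Bigl( 1 - \frac{\mu}{L} \Bigr) \bigl( f(x_k) - f^\star \bigr),
% i.e., linear convergence of the objective gap.
```

As a minimal numerical sketch (an illustrative instance, not the paper's experiments), gradient descent with fixed step length 1/L on a rank-deficient least-squares objective, which satisfies the PŁ inequality without being strongly convex, exhibits this linear decay of the objective gap:

```python
import numpy as np

# Illustrative sketch (assumed problem instance, not from the paper):
# f(x) = 0.5 * ||A x - b||^2 with a rank-deficient A is not strongly
# convex but satisfies the PL inequality, so the gap f(x_k) - f*
# contracts linearly under gradient descent with a fixed step 1/L.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
A[:, -1] = A[:, 0]                  # duplicate a column -> rank deficiency
b = rng.standard_normal(20)

L = np.linalg.norm(A.T @ A, 2)      # smoothness constant (largest eigenvalue)
f_star = 0.5 * np.linalg.norm(A @ np.linalg.pinv(A) @ b - b) ** 2

x = np.zeros(10)
for k in range(201):
    if k % 50 == 0:
        gap = 0.5 * np.linalg.norm(A @ x - b) ** 2 - f_star
        print(f"iter {k:3d}: f(x) - f* = {gap:.3e}")
    x -= A.T @ (A @ x - b) / L      # gradient step with fixed length 1/L
```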
| Original language | English |
| --- | --- |
| Pages (from-to) | 1105–1125 |
| Journal | Optimization Letters |
| Volume | 17 |
| DOIs | |
| Publication status | Published - 2023 |
Keywords
- weakly convex optimization
- gradient method
- performance estimation problem
- Polyak-Łojasiewicz inequality
- semidefinite programming