Strong Convergence of FISTA Iterates under Hölderian and Quadratic Growth Conditions
Abstract
Introduced by Beck and Teboulle, FISTA (Fast Iterative Shrinkage-Thresholding Algorithm) is a first-order method widely used in convex optimization. Adapted from Nesterov's accelerated gradient method for convex functions, the sequence it generates guarantees a decay of the function values of order $\mathcal{O}\left(n^{-2}\right)$ in the convex setting. We show that for coercive functions satisfying a local growth condition (namely a H\"olderian or quadratic growth condition), this sequence converges strongly to a minimizer. This property, which had never been proved without assuming the uniqueness of the minimizer, comes with improved convergence rates for the function values. The proposed analysis relies on a preliminary study of the Asymptotic Vanishing Damping system introduced by Su et al. to model Nesterov's accelerated gradient method in a continuous-time setting. New, improved convergence results are also established for the solutions of this dynamical system, including the finite length of the trajectory under the aforementioned geometry conditions.
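For context, here is a minimal sketch of the standard objects the abstract refers to, stated in their usual textbook form rather than quoted from the paper: the FISTA recursion for a composite objective $F = f + g$ (with $f$ smooth, $g$ convex, and a generic step size $s$), the Asymptotic Vanishing Damping system of Su et al. with damping parameter $\alpha$, and a generic H\"olderian growth condition of exponent $r$ (quadratic growth corresponding to $r = 2$); the constants $s$, $\alpha$, $K$ and $r$ are placeholder symbols, not values taken from the paper.
\[
\begin{aligned}
&\text{FISTA:} && x_{n} = \operatorname{prox}_{s g}\!\big(y_{n} - s \nabla f(y_{n})\big), \quad
t_{n+1} = \tfrac{1+\sqrt{1+4 t_{n}^{2}}}{2}, \quad
y_{n+1} = x_{n} + \tfrac{t_{n}-1}{t_{n+1}}\,(x_{n} - x_{n-1}),\\[4pt]
&\text{AVD:} && \ddot{x}(t) + \tfrac{\alpha}{t}\,\dot{x}(t) + \nabla F(x(t)) = 0,\\[4pt]
&\text{Growth:} && F(x) - \min F \;\ge\; K \operatorname{dist}\!\big(x, \operatorname*{arg\,min} F\big)^{r}
\quad \text{for all } x \text{ in a neighbourhood of } \operatorname*{arg\,min} F .
\end{aligned}
\]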