Laboratoire d'Informatique de Grenoble
Preprint / Working Paper, Year: 2022

Explicit second-order min-max optimization methods with optimal convergence guarantees

Abstract

We propose and analyze exact and inexact regularized Newton-type methods for finding a global saddle point of a convex-concave unconstrained min-max optimization problem. Compared to first-order methods, second-order methods for min-max optimization have received relatively little attention, because obtaining global convergence rates with second-order information is much more involved. In this paper, we highlight how second-order information can be used to speed up the dynamics of dual extrapolation methods despite inexactness. Specifically, we show that the proposed algorithms generate iterates that remain within a bounded set and that the averaged iterates converge to an ϵ-saddle point within O(ϵ^{-2/3}) iterations in terms of a gap function. Our algorithms match the theoretically established lower bound in this context, and our analysis provides a simple and intuitive convergence argument for second-order methods without requiring any compactness assumptions. Finally, we present a series of numerical experiments on synthetic and real data that demonstrate the efficiency of the proposed algorithms.
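For a convex-concave f(x, y), an ϵ-saddle point ẑ = (x̂, ŷ) is commonly certified through the gap function max_y f(x̂, y) − min_x f(x, ŷ) ≤ ϵ. The paper's exact Newton-MinMax updates live inside a dual-extrapolation scheme (see the PDF); as a rough illustration of the underlying idea only, the sketch below applies a generic regularized Newton step to the saddle-point operator F(z) = (∇_x f, −∇_y f) of a toy convex-concave quadratic. The toy instance, the regularization schedule lam, and the uniform averaging are assumptions made for illustration, not the authors' algorithm.

```python
import numpy as np

# Toy convex-concave problem: f(x, y) = 0.5*||x||^2 + x^T A y - 0.5*||y||^2.
# Its saddle operator F(z) = (grad_x f, -grad_y f) is monotone, and the unique
# saddle point is z* = 0, which makes convergence easy to check.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))

def F(z):
    x, y = z[:d], z[d:]
    return np.concatenate([x + A @ y, y - A.T @ x])

def JF(z):
    # Jacobian of F; constant here because f is quadratic.
    I = np.eye(d)
    return np.block([[I, A], [-A.T, I]])

def regularized_newton_step(z, lam):
    # Generic regularized Newton step: solve (JF(z) + lam*I) w = -F(z).
    # lam > 0 stands in for the second-order regularization; the paper's
    # actual update rule and its inexactness handling differ.
    w = np.linalg.solve(JF(z) + lam * np.eye(2 * d), -F(z))
    return z + w

z = rng.standard_normal(2 * d)
z_sum = np.zeros(2 * d)
n_iters = 50
for k in range(n_iters):
    lam = (k + 1) ** (-2 / 3)  # hypothetical schedule, for illustration only
    z = regularized_newton_step(z, lam)
    z_sum += z

z_avg = z_sum / n_iters  # averaged iterate, as in the abstract
print("residual at averaged iterate:", np.linalg.norm(F(z_avg)))
```

On this quadratic instance the operator residual ||F(z)|| vanishes exactly at the saddle point, so it serves as a cheap stand-in for the gap function when experimenting with the step.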
Main file
NewtonMinMax.pdf (640.84 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03873989, version 1 (27-11-2022)

Identifiers

  • HAL Id: hal-03873989, version 1

Cite

Tianyi Lin, Panayotis Mertikopoulos, Michael I. Jordan. Explicit second-order min-max optimization methods with optimal convergence guarantees. 2022. ⟨hal-03873989⟩
38 Views
34 Downloads
