Preprint, Working Paper. Year: 2022

Convergence of Langevin-Simulated Annealing algorithms with multiplicative noise II: Total Variation

Pierre Bras, Gilles Pagès

Abstract

We study the convergence of Langevin-Simulated Annealing type algorithms with multiplicative noise, i.e. for a potential function $V : \mathbb{R}^d \to \mathbb{R}$ to minimize, we consider the stochastic differential equation $dY_t = -(\sigma\sigma^\top)(Y_t)\nabla V(Y_t)\,dt + a(t)\sigma(Y_t)\,dW_t + a(t)^2\Upsilon(Y_t)\,dt$, where $(W_t)$ is a Brownian motion, $\sigma : \mathbb{R}^d \to \mathcal{M}_d(\mathbb{R})$ is an adaptive (multiplicative) noise, $a : \mathbb{R}^+ \to \mathbb{R}^+$ is a function decreasing to $0$, and $\Upsilon$ is a correction term. Allowing $\sigma$ to depend on the position yields faster convergence than the classical Langevin equation $dY_t = -\nabla V(Y_t)\,dt + \sigma\,dW_t$ with constant $\sigma$. In a previous paper we established the convergence in $L^1$-Wasserstein distance of $Y_t$ and of its associated Euler scheme $\bar{Y}_t$ to $\operatorname{argmin}(V)$ with the classical schedule $a(t) = A\log^{-1/2}(t)$. In the present paper we prove the convergence in total variation distance. The total variation case proves more demanding and requires regularization lemmas.
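To make the scheme concrete, here is a minimal Euler-Maruyama sketch of the annealed SDE above with the schedule $a(t) = A\log^{-1/2}(t)$. This is an illustration, not the paper's implementation: the callables `grad_V`, `sigma` and `upsilon` (the correction term $\Upsilon$) are assumed supplied by the user, and the constants `A`, `h` and `t0` are arbitrary illustrative choices.

```python
import numpy as np

def langevin_simulated_annealing(grad_V, sigma, upsilon, y0, A=1.0,
                                 h=1e-2, n_steps=10_000, t0=2.0, seed=0):
    """Euler-Maruyama sketch (hypothetical parameters) of
        dY_t = -(sigma sigma^T)(Y_t) grad V(Y_t) dt
               + a(t) sigma(Y_t) dW_t + a(t)^2 Upsilon(Y_t) dt
    with the annealing schedule a(t) = A * log(t)^(-1/2).
    grad_V, sigma, upsilon are user-supplied callables; upsilon is the
    correction term from the abstract, assumed available in closed form.
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y0, dtype=float)
    d = y.size
    t = t0  # start past t = 1 so that log(t) > 0
    for _ in range(n_steps):
        a = A / np.sqrt(np.log(t))        # schedule a(t) = A log^{-1/2}(t)
        S = sigma(y)                      # d x d multiplicative-noise matrix
        dW = rng.normal(scale=np.sqrt(h), size=d)
        y = (y
             - h * (S @ S.T) @ grad_V(y)  # gradient drift
             + a * (S @ dW)               # annealed multiplicative noise
             + h * a**2 * upsilon(y))     # correction term
        t += h
    return y

# Toy usage: V(y) = |y|^2 / 2 with constant sigma = I_2, so Upsilon = 0
# and the iterates should concentrate near argmin(V) = {0}.
y_final = langevin_simulated_annealing(
    grad_V=lambda y: y,
    sigma=lambda y: np.eye(2),
    upsilon=lambda y: np.zeros(2),
    y0=np.array([5.0, -3.0]),
)
print(y_final)
```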

Dates and versions

hal-03890504, version 1 (08-12-2022)


Cite

Pierre Bras, Gilles Pagès. Convergence of Langevin-Simulated Annealing algorithms with multiplicative noise II: Total Variation. 2022. ⟨hal-03890504⟩