Riemannian statistics meets random matrix theory: towards learning from high-dimensional covariance matrices
Journal article, IEEE Transactions on Information Theory, 2022

Salem Said
Abstract

Riemannian Gaussian distributions were initially introduced as basic building blocks for learning models that aim to capture the intrinsic structure of statistical populations of positive-definite matrices (here called covariance matrices). While the potential applications of such models have attracted significant attention, a major obstacle still stands in the way of these applications: there seems to exist no practical method of computing the normalising factors associated with Riemannian Gaussian distributions on spaces of high-dimensional covariance matrices. The present paper shows that the missing method arises from an unexpected new connection with random matrix theory.
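For readers unfamiliar with the model, the following display is a minimal reminder of the distribution in question; it is standard in the literature on Riemannian Gaussian distributions rather than quoted from this abstract, and it assumes the affine-invariant (Rao–Fisher) metric on the space of m × m covariance matrices. With Fréchet mean Ȳ and dispersion parameter σ, the density reads

\[
% Assumed setting: affine-invariant (Rao--Fisher) metric on the cone of m x m
% covariance matrices; the notation Z_m, \bar{Y}, \sigma is ours, not the paper's.
p(Y \mid \bar{Y}, \sigma) \;=\; \frac{1}{Z_m(\sigma)}\,
\exp\!\left(-\,\frac{d^{2}(Y,\bar{Y})}{2\sigma^{2}}\right),
\qquad
d(Y,\bar{Y}) \;=\; \bigl\lVert \log\!\bigl(\bar{Y}^{-1/2}\, Y\, \bar{Y}^{-1/2}\bigr)\bigr\rVert_{F},
\]

where the normalising factor Z_m(σ) depends only on σ and the dimension m, not on Ȳ. It is precisely this factor that existing methods cannot compute in practice when m is large, and whose evaluation the paper connects to random matrix theory.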

Dates and versions

hal-03865913, version 1 (22-11-2022)

Identifiers

HAL Id: hal-03865913
DOI: 10.1109/TIT.2022.3199479

Cite

Salem Said. Riemannian statistics meets random matrix theory: towards learning from high-dimensional covariance matrices. IEEE Transactions on Information Theory, 2022, ⟨10.1109/TIT.2022.3199479⟩. ⟨hal-03865913⟩