Journal article

Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions

Nizar Bouhlel 1, Ali Dziri 2
1 Lab-STICC_ENSTAB_MOM_PIM
Lab-STICC - Laboratoire des sciences et techniques de l'information, de la communication et de la connaissance
Abstract: The Kullback–Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool in many signal and image processing applications. Until now, the KLD between MGGDs has had no known explicit form, and in practice it is either estimated by expensive Monte-Carlo stochastic integration or approximated. The main contribution of this letter is a closed-form expression of the KLD between two zero-mean MGGDs. Based on the Lauricella series, a simple way of computing the KLD numerically is presented. Finally, we show that the Monte-Carlo approximation of the KLD converges to its theoretical value as the number of samples goes to infinity.
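For context, the Monte-Carlo baseline that the letter's closed-form (Lauricella-series) expression is compared against can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a common zero-mean MGGD parameterization with scatter matrix Sigma, scale m and shape beta, whose density is proportional to exp(-(x^T Sigma^{-1} x)^beta / (2 m^beta)), and the standard stochastic representation for sampling; all function names are illustrative, and the closed-form expression itself is not reproduced here.

# Monte-Carlo estimation of KLD(p1 || p2) between two zero-mean MGGDs.
# Caveat: minimal sketch; the MGGD parameterization (Sigma, m, beta) below
# is an assumption and may differ from the letter's notation.
import numpy as np
from scipy.special import gammaln

def mggd_logpdf(X, Sigma, m, beta):
    """Log-density of a zero-mean MGGD at the rows of X (assumed form)."""
    p = Sigma.shape[0]
    _, logdet = np.linalg.slogdet(Sigma)
    # Squared Mahalanobis distance x^T Sigma^{-1} x for each sample.
    maha = np.einsum('ij,jk,ik->i', X, np.linalg.inv(Sigma), X)
    logc = (gammaln(p / 2) + np.log(beta)
            - (p / 2) * np.log(np.pi) - gammaln(p / (2 * beta))
            - (p / (2 * beta)) * np.log(2)
            - (p / 2) * np.log(m) - 0.5 * logdet)
    return logc - maha**beta / (2 * m**beta)

def mggd_sample(n, Sigma, m, beta, rng):
    """Draw n samples via the stochastic representation x = sqrt(r) * A u."""
    p = Sigma.shape[0]
    A = np.linalg.cholesky(Sigma)
    u = rng.standard_normal((n, p))
    u /= np.linalg.norm(u, axis=1, keepdims=True)    # uniform on the unit sphere
    s = rng.gamma(shape=p / (2 * beta), scale=1.0, size=n)
    r = (2 * m**beta * s) ** (1 / beta)              # squared Mahalanobis radius
    return np.sqrt(r)[:, None] * (u @ A.T)

def kld_monte_carlo(n, Sigma1, m1, beta1, Sigma2, m2, beta2, seed=0):
    """KLD(p1 || p2) estimated as the mean of log p1(x) - log p2(x), x ~ p1."""
    rng = np.random.default_rng(seed)
    X = mggd_sample(n, Sigma1, m1, beta1, rng)
    return np.mean(mggd_logpdf(X, Sigma1, m1, beta1)
                   - mggd_logpdf(X, Sigma2, m2, beta2))

if __name__ == "__main__":
    Sigma1 = np.array([[1.0, 0.3], [0.3, 1.0]])
    Sigma2 = np.array([[2.0, -0.2], [-0.2, 1.5]])
    for n in (10**3, 10**4, 10**5, 10**6):
        print(n, kld_monte_carlo(n, Sigma1, 1.0, 0.8, Sigma2, 1.0, 1.2))

As the letter observes for its own experiments, the estimate printed above should stabilize as the number of samples n grows, converging toward the theoretical value given by the closed-form Lauricella-series expression.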

https://hal-ensta-bretagne.archives-ouvertes.fr/hal-02304988
Contributor: Claude Morvan
Submitted on: Thursday, October 3, 2019 - 16:13:56
Last modified on: Wednesday, June 24, 2020 - 16:19:54

Identifiers

HAL Id: hal-02304988
DOI: 10.1109/LSP.2019.2915000

Citation

Nizar Bouhlel, Ali Dziri. Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions. IEEE Signal Processing Letters, Institute of Electrical and Electronics Engineers, 2019, 26 (7), pp.1021-1025. ⟨10.1109/LSP.2019.2915000⟩. ⟨hal-02304988⟩


Metrics

Record views: 101