Journal article, IEEE Signal Processing Letters, 2019

Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions

Abstract

The Kullback–Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool in many signal and image processing applications. Until now, the KLD between MGGDs has had no known explicit form, and in practice it is either estimated by expensive Monte-Carlo stochastic integration or approximated. The main contribution of this letter is a closed-form expression of the KLD between two zero-mean MGGDs. Based on the Lauricella series, a simple way of computing the KLD numerically is presented. Finally, we show that the Monte-Carlo approximation of the KLD converges to its theoretical value as the number of samples goes to infinity.
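As context for the Monte-Carlo baseline mentioned in the abstract, a minimal sketch of such an estimator is given below; it is not the closed-form Lauricella-series expression derived in the letter. It assumes the usual zero-mean MGGD parameterization with scatter matrix Sigma and shape parameter beta, with density proportional to exp(-(x^T Sigma^-1 x)^beta / 2), and draws samples through the standard stochastic representation (a uniform direction on the unit sphere scaled by a Gamma-distributed radius). All function names are illustrative.

```python
import numpy as np
from scipy.special import gammaln


def mggd_logpdf(x, Sigma, beta):
    """Log-density of a zero-mean MGGD with scatter matrix Sigma and shape beta
    (assumed parameterization: p(x) proportional to exp(-0.5 * (x^T Sigma^-1 x)^beta))."""
    m = Sigma.shape[0]
    delta = np.einsum("ij,jk,ik->i", x, np.linalg.inv(Sigma), x)  # quadratic form per sample
    log_norm = (np.log(beta) + gammaln(m / 2) - (m / 2) * np.log(np.pi)
                - gammaln(m / (2 * beta)) - (m / (2 * beta)) * np.log(2)
                - 0.5 * np.linalg.slogdet(Sigma)[1])
    return log_norm - 0.5 * delta ** beta


def mggd_sample(n, Sigma, beta, rng):
    """Draw n samples via x = r * A u, with A A^T = Sigma, u uniform on the unit
    sphere, and r^(2*beta) ~ Gamma(shape=m/(2*beta), scale=2)."""
    m = Sigma.shape[0]
    A = np.linalg.cholesky(Sigma)
    u = rng.standard_normal((n, m))
    u /= np.linalg.norm(u, axis=1, keepdims=True)            # uniform directions
    w = rng.gamma(shape=m / (2 * beta), scale=2.0, size=n)    # w = r^(2*beta)
    r = w ** (1.0 / (2 * beta))
    return r[:, None] * (u @ A.T)


def kld_monte_carlo(Sigma_p, beta_p, Sigma_q, beta_q, n=100_000, seed=0):
    """Monte-Carlo estimate of KL(p || q) = E_p[log p(x) - log q(x)]."""
    rng = np.random.default_rng(seed)
    x = mggd_sample(n, Sigma_p, beta_p, rng)
    return np.mean(mggd_logpdf(x, Sigma_p, beta_p) - mggd_logpdf(x, Sigma_q, beta_q))


if __name__ == "__main__":
    # Sanity check in the Gaussian special case (beta = 1), where the KLD has the
    # well-known closed form 0.5 * (tr(Sq^-1 Sp) - m + ln(|Sq| / |Sp|)).
    Sp = np.array([[1.0, 0.3], [0.3, 1.0]])
    Sq = np.array([[2.0, 0.0], [0.0, 1.0]])
    mc = kld_monte_carlo(Sp, 1.0, Sq, 1.0, n=200_000)
    exact = 0.5 * (np.trace(np.linalg.inv(Sq) @ Sp) - 2
                   + np.linalg.slogdet(Sq)[1] - np.linalg.slogdet(Sp)[1])
    print(f"Monte-Carlo: {mc:.4f}   Gaussian closed form: {exact:.4f}")
```

With beta = 1 both densities reduce to multivariate Gaussians, so the sanity check above compares the sampling estimate against the well-known Gaussian KLD; consistent with the letter's convergence result, the Monte-Carlo value approaches the reference as n grows.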

Dates and versions

hal-02304988, version 1 (03-10-2019)

Cite

Nizar Bouhlel, Ali Dziri. Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions. IEEE Signal Processing Letters, 2019, 26 (7), pp.1021-1025. ⟨10.1109/LSP.2019.2915000⟩. ⟨hal-02304988⟩