
Exact Expressions for Kullback–Leibler Divergence for Univariate Distributions

Authors :
Victor Nawa
Saralees Nadarajah
Source :
Entropy, Vol 26, Iss 11, p 959 (2024)
Publication Year :
2024
Publisher :
MDPI AG, 2024.

Abstract

The Kullback–Leibler divergence (KL divergence) is a statistical measure that quantifies the difference between two probability distributions: it assesses the amount of information lost when one distribution is used to approximate another. The concept is central to information theory, statistics, and machine learning, where it helps gauge how well a model represents the underlying data. In a recent study, Nawa and Nadarajah derived a comprehensive collection of exact expressions for the KL divergence for multivariate and matrix-variate distributions. The present work extends that effort to the univariate case, providing exact expressions for more than sixty univariate distributions, with the accuracy of each expression verified through numerical checks. The derived expressions involve a range of special functions, reflecting the mathematical richness of the topic, and contribute to a deeper understanding of KL divergence and its applications in statistical analysis and modeling.
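For context, the KL divergence between two continuous distributions with densities f and g is defined as

D_{\mathrm{KL}}(f \,\|\, g) = \int_{-\infty}^{\infty} f(x) \,\log\frac{f(x)}{g(x)} \, dx.

As a minimal sketch of the kind of exact expression the paper catalogs, and of the numerical checks the abstract mentions, the following Python snippet compares the standard closed form for the KL divergence between two univariate normal distributions against direct numerical integration. The normal-normal formula shown is the well-known textbook result, used here purely for illustration; the paper's own notation and distribution coverage may differ.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def kl_normal_exact(mu1, s1, mu2, s2):
    # Standard closed form for KL(N(mu1, s1^2) || N(mu2, s2^2))
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def kl_numeric(p, q, lo=-50.0, hi=50.0):
    # Numerical check: integrate p(x) * log(p(x)/q(x)) over the real line
    integrand = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    val, _ = quad(integrand, lo, hi)
    return val

mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
exact = kl_normal_exact(mu1, s1, mu2, s2)
numeric = kl_numeric(norm(mu1, s1), norm(mu2, s2))
print(exact, numeric)  # both approximately 0.4431

Agreement between the closed form and the quadrature result (to near machine precision here) mirrors the validation strategy the abstract describes for the paper's sixty-plus univariate expressions.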

Details

Language :
English
ISSN :
1099-4300
Volume :
26
Issue :
11
Database :
Directory of Open Access Journals
Journal :
Entropy
Publication Type :
Academic Journal
Accession number :
edsdoj.0c14f832e5424abda750136b710f533c
Document Type :
article
Full Text :
https://doi.org/10.3390/e26110959