Academic Journal
Exact Expressions for Kullback–Leibler Divergence for Univariate Distributions
Victor Nawa, Saralees Nadarajah
Entropy, Vol 26, Iss 11, p 959 (2024)
| Title | Exact Expressions for Kullback–Leibler Divergence for Univariate Distributions |
|---|---|
| Authors | Victor Nawa, Saralees Nadarajah |
| Publication Year | 2024 |
| Source | Entropy, Vol 26, Iss 11, p 959 (2024) |
| Description | The Kullback–Leibler divergence (KL divergence) is a statistical measure that quantifies the difference between two probability distributions. Specifically, it measures the amount of information lost when one distribution is used to approximate another. This concept is central to information theory, statistics, and machine learning, where it helps assess how well a model represents the underlying data. In this study, Nawa and Nadarajah derive a comprehensive collection of exact expressions for the Kullback–Leibler divergence for univariate distributions, extending existing knowledge by providing exact formulations for over sixty univariate distributions. The authors verify the accuracy of these expressions through numerical checks, adding a layer of validation to their findings. The derived expressions involve various special functions, reflecting the mathematical richness of the topic. This research contributes to a deeper understanding of KL divergence and its applications in statistical analysis and modeling. |
| Document Type | article |
| Language | English |
| Publisher Information | MDPI AG, 2024. |
| Subject Terms | |
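For context, the KL divergence of a distribution $Q$ from a distribution $P$, with densities $q$ and $p$, is defined as

$$ D_{\mathrm{KL}}(P \,\|\, Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx. $$

A standard example of the kind of exact expression the paper catalogs is the well-known closed form for two univariate normal distributions (shown here for illustration; it is not quoted from the record itself):

$$ D_{\mathrm{KL}}\big(\mathcal{N}(\mu_1, \sigma_1^2) \,\|\, \mathcal{N}(\mu_2, \sigma_2^2)\big) = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} - \frac{1}{2}. $$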