Divergence is a Python package to compute statistical measures of entropy and divergence from probability distributions and samples.
The following functionality is provided (see the illustrative sketch after this list):
- (Information) Entropy [1], [2]
- Cross Entropy [3]
- Relative Entropy or Kullback-Leibler (KL-) Divergence [4], [5]
- Jensen-Shannon Divergence [6]
- Joint Entropy [7]
- Conditional Entropy [8]
- Mutual Information [9]
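For reference, here is a minimal NumPy sketch of the quantities listed above for discrete distributions with strictly positive probabilities. It illustrates the definitions only and is not the package's own API:

```python
import numpy as np

def entropy(p, base=np.e):
    """(Information) entropy H(p) = -sum_i p_i log(p_i)."""
    p = np.asarray(p)
    return -np.sum(p * np.log(p)) / np.log(base)

def cross_entropy(p, q, base=np.e):
    """Cross entropy H(p, q) = -sum_i p_i log(q_i)."""
    p, q = np.asarray(p), np.asarray(q)
    return -np.sum(p * np.log(q)) / np.log(base)

def relative_entropy(p, q, base=np.e):
    """KL divergence D(p || q) = sum_i p_i log(p_i / q_i)."""
    p, q = np.asarray(p), np.asarray(q)
    return np.sum(p * np.log(p / q)) / np.log(base)

def jensen_shannon_divergence(p, q, base=np.e):
    """JSD(p, q) = 0.5 D(p || m) + 0.5 D(q || m), with m = (p + q) / 2."""
    p, q = np.asarray(p), np.asarray(q)
    m = 0.5 * (p + q)
    return 0.5 * relative_entropy(p, m, base) + 0.5 * relative_entropy(q, m, base)

def joint_entropy(pxy, base=np.e):
    """H(X, Y) = -sum_{x,y} p(x, y) log p(x, y) for a joint table pxy."""
    pxy = np.asarray(pxy)
    return -np.sum(pxy * np.log(pxy)) / np.log(base)

def conditional_entropy(pxy, base=np.e):
    """H(Y | X) = H(X, Y) - H(X), with the marginal p(x) = sum_y p(x, y)."""
    px = np.asarray(pxy).sum(axis=1)
    return joint_entropy(pxy, base) - entropy(px, base)

def mutual_information(pxy, base=np.e):
    """I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    pxy = np.asarray(pxy)
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return entropy(px, base) + entropy(py, base) - joint_entropy(pxy, base)

# Example: KL divergence between two Bernoulli distributions, in bits.
p, q = np.array([0.5, 0.5]), np.array([0.9, 0.1])
print(relative_entropy(p, q, base=2.0))
```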
The units in which these entropy and divergence measures are calculated can be specified by setting the argument `base` to 2.0 (bits), 10.0 (hartleys), or `np.e` (nats).
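As a sketch of the effect of `base` (shown here with `scipy.stats.entropy`, which uses the same keyword; this is not the package's own API):

```python
import numpy as np
from scipy.stats import entropy

p = np.array([0.5, 0.25, 0.25])

h_bits = entropy(p, base=2.0)   # 1.5 bits
h_nats = entropy(p, base=np.e)  # ~1.04 nats
h_hart = entropy(p, base=10.0)  # ~0.45 hartleys

# The choice of base only rescales the result: H_b(p) = H_e(p) / ln(b).
assert np.isclose(h_bits, h_nats / np.log(2.0))
```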
In a Bayesian context, relative entropy can be used as a measure of the information gained by moving from a prior distribution `q` to a posterior distribution `p`.
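For example, this information gain is D(p || q) and can be computed with `scipy.stats.entropy` (the prior and posterior values below are purely illustrative):

```python
import numpy as np
from scipy.stats import entropy

# Prior: uniform belief over three hypotheses.
q = np.array([1/3, 1/3, 1/3])
# Posterior after observing some data (illustrative values).
p = np.array([0.7, 0.2, 0.1])

# Information gained by the update, in bits: D(p || q).
information_gain = entropy(p, q, base=2.0)
print(f"{information_gain:.4f} bits")
```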
Install from PyPI: `pip install divergence`
For examples, see the Jupyter notebook `Divergence`.