Estimating differential entropy and mutual information for continuous random variables.
Before relying on differential entropy, note the following caveats:
- It is not invariant under monotonic changes of variables (applied to the individual random variables), and is therefore most useful with dimensionless variables. The corresponding invariance for discrete entropy holds under bijective (relabelling) transformations of the individual random variables.
- It can be negative (see the short numerical example below).

See also the limiting density of discrete points for why the original definition of differential entropy is not even dimensionally correct.
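
A small numerical illustration of both caveats, using the closed-form entropy of a Gaussian, h = 0.5 * log(2 * pi * e * sigma^2), rather than any estimator from this package:

```python
import numpy as np

def gaussian_differential_entropy(sigma):
    """Closed-form differential entropy of N(0, sigma^2), in nats."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

# Not invariant under monotonic changes of variables:
# rescaling X -> 10 * X is monotonic, yet adds log(10) to the entropy.
print(gaussian_differential_entropy(1.0))    # ~ 1.42
print(gaussian_differential_entropy(10.0))   # ~ 3.72 (= 1.42 + log 10)

# It can be negative: a tightly concentrated density goes below zero.
print(gaussian_differential_entropy(0.01))   # ~ -3.19
```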
...
python setup.py install

or

pip install mutual-info

See the Makefile for example operations, and https://pypi.org/project/mutual-info for the released package.
Do not pin packages for now. Let's surf latest and find out when things break.
For development:

python setup.py develop
make test
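
Once installed, usage looks roughly like the sketch below. The module path and the `entropy` / `mutual_information` names are assumptions based on the adapted gist; check the package source for the actual signatures.

```python
import numpy as np
from mutual_info import mutual_info  # module layout assumed; see the package source

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 1))
y = x + 0.1 * rng.normal(size=(1000, 1))  # y is strongly dependent on x

# k-nearest-neighbor estimates (the `k` keyword is assumed here).
print(mutual_info.entropy(x, k=3))
print(mutual_info.mutual_information((x, y), k=3))
```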
TODO:

- Apply a rank transform (or similar) to the data so that the estimates are invariant to monotonic transforms of the data (a sketch follows this list).
- Implement equations 3 and 9 from the 2008 NIPS paper (Perez-Cruz, see references below).
- Add tests.
- Add clear documentation and reminders about mutual information and the pitfalls of continuous random variables.
- Compare to scikit-learn's _mutual_info.py, as per #2.
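
One way the rank-transform item might look (a standalone sketch using scipy, not code from this package):

```python
import numpy as np
from scipy.stats import rankdata

def rank_transform(X):
    """Map each column of X to its normalized ranks in (0, 1).

    Strictly monotonic transforms of a column leave its ranks unchanged,
    so any estimator run on the transformed data becomes invariant to
    such transforms.
    """
    X = np.asarray(X, dtype=float)
    ranks = np.apply_along_axis(rankdata, 0, X)
    return ranks / (X.shape[0] + 1)

# exp() is strictly monotonic, so the transformed data are identical.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
assert np.allclose(rank_transform(x), rank_transform(np.exp(x)))
```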
Originally adapted by G. Varoquaux in a gist, from code created by R. Brette, itself based on several papers (see references in the code). These computations rely on nearest-neighbor (radial density) statistics; a minimal sketch of the underlying estimator appears after the references below.
- Kozachenko, L. F. & Leonenko, N. N. (1987). Sample estimate of the entropy of a random vector. Probl. Inf. Transm. 23, 95-101. In particular see eq. (20).
- Evans, D. (2008). A computationally efficient estimator for mutual information. Proc. R. Soc. A 464(2093), 1203-1215.
- Kraskov, A., Stögbauer, H. & Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E 69(6 Pt 2), 066138.
- Pérez-Cruz, F. (2008). Estimation of information theoretic measures for continuous random variables. Advances in Neural Information Processing Systems 21 (NIPS). Vancouver, Canada, December.
- Lombardi, D. & Pant, S. (2016). A non-parametric k-nearest neighbor entropy estimator.
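
For orientation, here is a minimal standalone sketch of the Kozachenko-Leonenko nearest-neighbor entropy estimator in its standard form (not this package's actual implementation):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(X, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate, in nats.

    H_hat = psi(N) - psi(k) + log(V_d) + (d / N) * sum_i log(r_i),
    where r_i is the Euclidean distance from sample i to its k-th
    nearest neighbor and V_d is the volume of the d-dimensional unit ball.
    """
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    # Distance to the k-th neighbor; query k+1 because each point is its own 0th neighbor.
    r = cKDTree(X).query(X, k=k + 1)[0][:, -1]
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r))

# Sanity check: the true entropy of N(0, 1) is 0.5 * log(2 * pi * e) ~ 1.42 nats.
rng = np.random.default_rng(0)
print(kl_entropy(rng.normal(size=(5000, 1)), k=3))
```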