Statistical Validation of Mutual Information Calculations: Comparisons of Alternative Numerical Algorithms
Technical report, 2002-2003
NAVAL MEDICAL RESEARCH CENTER SILVER SPRING MD
Given two time series X and Y, their mutual information, I(X,Y) = I(Y,X), is the average number of bits of X that can be predicted by measuring Y, and vice versa. In the analysis of observational data, calculation of mutual information occurs in three contexts: identification of nonlinear correlation; determination of an optimal sampling interval, particularly when embedding data; and investigation of causal relationships with directed mutual information. In this report a minimum description length argument is used to determine the optimal number of elements to use when characterizing the distributions of X and Y. However, even when using partitions of the X and Y axes indicated by minimum description length, mutual information calculations performed with a uniform partition of the XY plane can give misleading results. This motivated the construction of an algorithm for calculating mutual information that uses an adaptive partition. This algorithm also incorporates an explicit test of the statistical independence of X and Y, returning an assessment of the corresponding null hypothesis.
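The report's adaptive-partition algorithm is not reproduced here, but the general idea can be sketched as follows. This simplified example (not the authors' implementation) adapts the partition by placing bin edges at empirical quantiles of each axis, so every marginal bin holds roughly the same number of points; it then computes the plug-in mutual information estimate in bits and a chi-square statistic for the null hypothesis that X and Y are independent. The function name, the choice of eight bins, and the equal-occupancy scheme are all illustrative assumptions.

```python
import numpy as np

def adaptive_mutual_information(x, y, n_bins=8):
    """Estimate I(X;Y) in bits with an adaptive (equal-occupancy) partition.

    Illustrative sketch only: each axis is cut at empirical quantiles so the
    marginal bins are equally populated, then the joint histogram gives a
    plug-in MI estimate and a chi-square independence statistic.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)

    # Adaptive partition: bin edges at empirical quantiles of each axis.
    qs = np.linspace(0.0, 1.0, n_bins + 1)
    x_edges = np.quantile(x, qs)
    y_edges = np.quantile(y, qs)
    xi = np.clip(np.searchsorted(x_edges, x, side="right") - 1, 0, n_bins - 1)
    yi = np.clip(np.searchsorted(y_edges, y, side="right") - 1, 0, n_bins - 1)

    # Joint contingency table and empirical probabilities.
    joint = np.zeros((n_bins, n_bins))
    np.add.at(joint, (xi, yi), 1)
    pxy = joint / n
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (n_bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, n_bins)

    # Plug-in mutual information in bits (sum over occupied cells only).
    nz = pxy > 0
    mi = float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

    # Chi-square statistic against the independence null hypothesis.
    expected = n * (px @ py)
    chi2 = float(np.sum((joint - expected) ** 2
                        / np.where(expected > 0, expected, 1.0)))
    dof = (n_bins - 1) ** 2
    return mi, chi2, dof
```

For independent series the MI estimate stays near zero (apart from a small positive finite-sample bias) and the chi-square statistic is comparable to its degrees of freedom; for strongly dependent series (e.g. y = x) the estimate approaches log2(n_bins).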
- Numerical Mathematics
- Statistics and Probability