Differential Metrics in Probability Spaces Based on Entropy and Divergence Measures.
University of Pittsburgh, PA, Center for Multivariate Analysis
This paper discusses some general methods of metrizing probability spaces through the introduction of a quadratic differential metric on the parameter manifold of a set of probability distributions. These methods extend the investigation made in Rao (1945), where the Fisher information matrix was used to construct the metric and the geodesic distance was proposed as a measure of dissimilarity between probability distributions. The basic approach in this paper is first to construct a divergence or dissimilarity measure between any two probability distributions, and then to derive a differential metric from it by considering two distributions whose characterizing parameters are close to each other. One measure of divergence considered is the Jensen difference based on an entropy functional, as defined in Rao (1982). Another is the f-divergence measure studied by Csiszar; the latter class leads to the differential metric based on the Fisher information matrix. The geodesic distances based on this metric, as computed by various authors, are listed. Additional keywords: cross entropy, quadratic entropy.
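The local expansion described in the abstract — an f-divergence between two distributions with nearby parameters reducing to a quadratic form in the Fisher information — can be checked numerically. The sketch below (not part of the report; all function names are illustrative) uses the one-parameter Bernoulli family, for which the Fisher information is 1/(θ(1−θ)) and the geodesic distance under the Fisher metric has the well-known closed form 2|arcsin√q − arcsin√p|.

```python
import math

def kl_bernoulli(p, q):
    """Kullback-Leibler divergence (an f-divergence) between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def fisher_info_bernoulli(p):
    """Fisher information of the Bernoulli family at parameter p: 1 / (p(1-p))."""
    return 1.0 / (p * (1.0 - p))

def geodesic_bernoulli(p, q):
    """Geodesic distance between Bernoulli(p) and Bernoulli(q) under the
    Fisher-information metric: 2 |arcsin(sqrt(q)) - arcsin(sqrt(p))|."""
    return 2.0 * abs(math.asin(math.sqrt(q)) - math.asin(math.sqrt(p)))

theta, d = 0.3, 1e-4

# For nearby parameters, the f-divergence is approximately the quadratic
# differential metric (1/2) * I(theta) * d^2 built from the Fisher information.
kl = kl_bernoulli(theta, theta + d)
quadratic = 0.5 * fisher_info_bernoulli(theta) * d ** 2

# Locally, the geodesic distance agrees with sqrt(I(theta)) * d.
geo = geodesic_bernoulli(theta, theta + d)
local = math.sqrt(fisher_info_bernoulli(theta)) * d
```

For small d, `kl` and `quadratic` agree to leading order, and `geo` matches `local` up to second-order terms, which is exactly the sense in which the divergence measures in the abstract induce the Fisher-information metric.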
- Statistics and Probability