Accession Number:

ADA160301

Title:

Differential Metrics in Probability Spaces Based on Entropy and Divergence Measures.

Descriptive Note:

Technical rept.

Corporate Author:

PITTSBURGH UNIV PA CENTER FOR MULTIVARIATE ANALYSIS

Personal Author(s):

Report Date:

1985-04-01

Pagination or Media Count:

23

Abstract:

This paper discusses some general methods of metrizing probability spaces through the introduction of a quadratic differential metric in the parameter manifold of a set of probability distributions. These methods extend the investigation made in Rao (1945), where the Fisher information matrix was used to construct the metric and the geodesic distance was suggested as a measure of dissimilarity between probability distributions. The basic approach in this paper is first to construct a divergence or dissimilarity measure between any two probability distributions, and then to derive a differential metric from it by considering two distributions whose characterizing parameters are close to each other. One measure of divergence considered is the Jensen difference based on an entropy functional, as defined in Rao (1982). Another is the f-divergence measure studied by Csiszar. The latter class leads to the differential metric based on the Fisher information matrix. The geodesic distances based on this metric computed by various authors are listed. Additional keywords: cross entropy, quadratic entropy.
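The abstract's key construction — that an f-divergence between two nearby distributions reduces, to second order in the parameter difference, to a quadratic form given by the Fisher information matrix — can be checked numerically. The sketch below is illustrative and not taken from the report: it uses the Kullback-Leibler divergence (an f-divergence) on the one-parameter Bernoulli family, where the Fisher information is known in closed form as I(p) = 1/(p(1-p)).

```python
import math

def bernoulli_kl(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def fisher_info(p):
    """Fisher information of the Bernoulli family at parameter p."""
    return 1.0 / (p * (1 - p))

# For a small parameter displacement dp, the second-order expansion
# KL(p || p + dp) ~ (1/2) * I(p) * dp^2 should hold, so 2*KL/dp^2
# should approximate the Fisher information at p.
p, dp = 0.3, 1e-4
kl = bernoulli_kl(p, p + dp)
metric_estimate = 2 * kl / dp**2

print(metric_estimate)   # close to fisher_info(0.3) = 1/(0.3*0.7)
print(fisher_info(p))
```

Here the quadratic differential metric recovered from the divergence agrees with the Fisher information to within terms of order dp, which is the mechanism the abstract describes for the multivariate case.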

Subject Categories:

  • Statistics and Probability

Distribution Statement:

APPROVED FOR PUBLIC RELEASE