Accession Number:

AD0686819

Title:

ON THE CALCULATION OF MUTUAL INFORMATION

Descriptive Note:

Corporate Author:

MICHIGAN UNIV ANN ARBOR COMPUTER INFORMATION AND CONTROL ENGINEERING PROGRAM

Personal Author(s):

Report Date:

1968-01-01

Pagination or Media Count:

13

Abstract:

Calculating the amount of information about a random function contained in another random function has important uses in communication theory. An expression for the mutual information of continuous-time random processes has been given by Gelfand and Yaglom, Chiang, and Perez by generalizing Shannon's result in a natural way. Under a condition of absolute continuity of measures, the continuous-time expression has the same form as Shannon's result. For two Gaussian processes, Gelfand and Yaglom express the mutual information in terms of a mean-square estimation error. We generalize this result to diffusion processes and express the solution in a different form which is more naturally related to a corresponding filtering problem. We also use these results to calculate some information rates.
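
For orientation, the two kinds of expression named in the abstract can be sketched in LaTeX. This is an illustrative reconstruction in standard notation, not text quoted from the report: the symbols (the joint measure \mu_{XY}, the product measure \mu_X \otimes \mu_Y, and the causal estimate \hat{X}_t) are notational assumptions, and the second display assumes the usual additive white Gaussian noise observation model.

% Gelfand--Yaglom form of the mutual information, valid when the joint
% measure \mu_{XY} is absolutely continuous with respect to the product
% measure \mu_X \otimes \mu_Y (the natural generalization of Shannon's
% discrete expression mentioned in the abstract):
\[
  I(X;Y) \;=\; \int \log \frac{d\mu_{XY}}{d\left(\mu_X \otimes \mu_Y\right)} \, d\mu_{XY} .
\]

% Filtering-type form, under the assumed observation model
%   dY_t = X_t\,dt + dW_t,  0 \le t \le T,
% with \hat{X}_t = E[\,X_t \mid Y_s,\ s \le t\,] the causal mean-square
% estimate; the mutual information is then expressed through the
% integrated mean-square filtering error:
\[
  I(X;Y) \;=\; \frac{1}{2}\int_0^T E\!\left[\left(X_t - \hat{X}_t\right)^2\right] dt .
\]

The second display is the general shape of the relation the abstract describes, tying the mutual information to the error of a corresponding filtering problem; the report's exact statement and conditions for diffusion processes should be taken from the report itself.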

Subject Categories:

  • Cybernetics

Distribution Statement:

APPROVED FOR PUBLIC RELEASE