ON THE CALCULATION OF MUTUAL INFORMATION
University of Michigan, Ann Arbor, Computer, Information and Control Engineering Program
Calculating the amount of information that one random function contains about another has important uses in communication theory. An expression for the mutual information between continuous-time random processes was given by Gelfand and Yaglom, Chiang, and Perez, who generalized Shannon's result in a natural way. Under a condition of absolute continuity of measures, the continuous-time expression has the same form as Shannon's. For two Gaussian processes, Gelfand and Yaglom express the mutual information in terms of a mean-square estimation error. We generalize this result to diffusion processes and express the solution in a different form that relates more naturally to a corresponding filtering problem. We also use these results to calculate some information rates.
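The connection between mutual information and mean-square estimation error can be illustrated in the simplest setting: two jointly Gaussian random variables rather than processes. The sketch below is not the report's method; it only checks the well-known scalar identity I(X;Y) = -(1/2) ln(1 - rho^2), rewritten through the minimum mean-square error of estimating X from Y, which is the scalar analogue of the Gelfand-Yaglom relation mentioned above. The variable names and the choice of correlation value are illustrative assumptions.

```python
import math

def gaussian_mutual_information(rho):
    """Mutual information (in nats) between jointly Gaussian X, Y
    with correlation coefficient rho."""
    return -0.5 * math.log(1.0 - rho ** 2)

def mi_from_mmse(var_x, rho):
    """Same quantity expressed through the estimation error:
    the MMSE of the optimal (linear) estimate of X given Y is
    var_x * (1 - rho^2), and I = (1/2) ln(var_x / mmse)."""
    mmse = var_x * (1.0 - rho ** 2)
    return 0.5 * math.log(var_x / mmse)

if __name__ == "__main__":
    rho = 0.8  # illustrative correlation value
    i_direct = gaussian_mutual_information(rho)
    i_via_mmse = mi_from_mmse(1.0, rho)
    print(i_direct, i_via_mmse)
    # The two expressions agree: information gained equals the log
    # reduction in estimation error variance.
    assert abs(i_direct - i_via_mmse) < 1e-12
```

In the continuous-time Gaussian case treated by Gelfand and Yaglom, the same idea holds with the scalar variance replaced by the error of the optimal causal or smoothing estimate, which is what ties the mutual information to a filtering problem.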