# Accession Number:

## ADA149192

# Title:

## Relative-Entropy Minimization with Uncertain Constraints--Theory and Application to Spectrum Analysis.

# Descriptive Note:

## Memorandum report

# Corporate Author:

## NAVAL RESEARCH LAB WASHINGTON DC

# Personal Author(s):

# Report Date:

## 1984-12-31

# Pagination or Media Count:

## 16

# Abstract:

The relative-entropy principle (the principle of minimum cross-entropy) is a provably optimal information-theoretic method for inferring a probability density from an initial (prior) estimate together with constraint information that confines the density to a specified convex set. Typically the constraint information takes the form of linear equations that specify the expectation values of given functions. This paper discusses the effect of replacing such linear-equality constraints with quadratic constraints that require the linear constraints to hold only approximately, to within a specified error bound. The results are applied to the derivation of a new multisignal spectrum-analysis method that simultaneously estimates a number of power spectra given (1) an initial estimate of each, (2) imprecise values of the autocorrelation function of their sum, and (3) estimates of the error in measurement of the autocorrelation values. One application is the separate estimation of the spectra of a signal and of independent additive noise, based on imprecise measurements of the autocorrelations of the signal plus noise. The new method is an extension of multisignal relative-entropy spectrum analysis with exact autocorrelations. The two methods are compared, and connections with previous related work are indicated. Mathematical properties of the new method are discussed, and an illustrative numerical example is presented. Originator-supplied keywords include maximum entropy, cross entropy, relative entropy, information theory, prior estimates, and initial estimates.
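The exact-constraint setting the abstract starts from can be illustrated on a discrete alphabet: minimizing the relative entropy D(q‖p) subject to a single linear expectation constraint yields an exponential tilting of the prior, q_i ∝ p_i·exp(λ·f_i), with λ chosen so the constraint holds. The sketch below (a minimal illustration, not the paper's spectrum-analysis method; the function name and the bisection bounds are illustrative assumptions) finds λ by bisection, which works because E_q[f] is monotonically increasing in λ:

```python
import numpy as np

def min_cross_entropy(prior, f, target, lo=-50.0, hi=50.0, iters=200):
    """Minimize D(q || prior) subject to sum_i q_i * f_i = target.

    The minimizer has the exponential-tilting form q_i ∝ prior_i * exp(lam * f_i);
    we bisect on lam until the expectation constraint is met. The target must
    lie strictly between min(f) and max(f) for a solution to exist.
    """
    prior = np.asarray(prior, dtype=float)
    f = np.asarray(f, dtype=float)

    def tilt(lam):
        # Subtract the mean of f before exponentiating for numerical stability;
        # the shift cancels on normalization.
        w = prior * np.exp(lam * (f - f.mean()))
        return w / w.sum()

    # E_q[f] increases monotonically with lam, so bisection converges.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if float(tilt(mid) @ f) < target:
            lo = mid
        else:
            hi = mid
    return tilt(0.5 * (lo + hi))

# Example: uniform prior over a six-sided die, constrain the mean to 4.5.
prior = np.full(6, 1.0 / 6.0)
faces = np.arange(1, 7, dtype=float)
q = min_cross_entropy(prior, faces, 4.5)
```

The paper's contribution is to relax the exact equality used here into a quadratic (error-bound) constraint, so that `tilt(q) @ f` need only fall within a stated tolerance of the measured value rather than match it exactly.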

# Descriptors:

# Subject Categories:

- Statistics and Probability
- Cybernetics