Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy.
NAVAL RESEARCH LAB WASHINGTON D C
We prove that, in a well-defined sense, Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) provide uniquely correct, general methods of inductive inference when new information is given in the form of expected values. Previous justifications rely heavily on intuitive arguments and on the properties of entropy and cross-entropy as information measures. Our approach assumes that reasonable methods of inductive inference should lead to consistent results whenever there are different ways of taking the same information into account --- for example, in different coordinate systems. We formalize this requirement as four consistency axioms stated in terms of an abstract information operator; the axioms make no reference to information measures. We establish the uniqueness of the maximum-entropy distribution both directly and as a special case (uniform priors) of an analogous, more general result for the principle of minimum cross-entropy. We obtain results both for continuous probability densities and for discrete distributions.
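The setting the abstract describes can be illustrated numerically (this sketch is my own illustration, not part of the paper): given a prior q over a finite set of values and a constraint on the expected value, the minimum cross-entropy posterior has the exponential form p_i ∝ q_i · exp(λ x_i), with λ chosen to satisfy the constraint. With a uniform prior this reduces to the maximum-entropy distribution, mirroring the "special case (uniform priors)" relationship stated above. The function name and the bisection solver are illustrative choices.

```python
import math

def min_cross_entropy(xs, q, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Minimize the cross-entropy (directed divergence) of p relative to prior q
    over distributions on the points xs, subject to E_p[X] = target_mean.

    The minimizer has the form p_i proportional to q_i * exp(lam * x_i);
    since the resulting mean is monotone in lam, we find lam by bisection.
    """
    def mean_for(lam):
        w = [qi * math.exp(lam * x) for qi, x in zip(q, xs)]
        z = sum(w)
        return sum(wi * x for wi, x in zip(w, xs)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) < target_mean:
            lo = mid  # need a larger multiplier to raise the mean
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [qi * math.exp(lam * x) for qi, x in zip(q, xs)]
    z = sum(w)
    return [wi / z for wi in w]

# With a uniform prior, minimum cross-entropy coincides with maximum entropy:
xs = [1.0, 2.0, 3.0]
uniform = [1.0 / 3.0] * 3
p = min_cross_entropy(xs, uniform, target_mean=2.5)
```

The returned p sums to one, matches the expected-value constraint, and tilts probability toward larger x because the target mean (2.5) exceeds the prior mean (2.0).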