Accession Number:

AD0685735

Title:

BAYESIAN LEARNING IN MARKOV CHAINS WITH OBSERVABLE STATES

Descriptive Note:

Corporate Author:

MICHIGAN STATE UNIV EAST LANSING DIV OF ENGINEERING RESEARCH

Report Date:

1969-03-01

Pagination or Media Count:

34

Abstract:

Two practical and related problems concerning decision-making with observations from Markov chains are considered in this report. First, Bayesian learning theory is used to develop recursive relations for the densities of the unknown parameters of a Markov chain, based on classified observations of the chain's states. Computationally simple results are obtained using a matrix-beta distribution for the chain's parameters. For the case of unsupervised observations, the basic learning relations are derived and methods for implementing them are discussed. Second, the related problem of deciding which of a set of chains is active, based on state observations, is considered. Two data-generating models are proposed and decision rules are derived. A particularly useful result is obtained for one model using the matrix-beta distribution for the unknown parameters. The decision rule for the more difficult model is then derived and its implications are discussed. Simulation results for a specific example show the probability of error for different amounts of training data and demonstrate the practicality of the results. (Author)
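The two problems in the abstract can be illustrated with a minimal sketch, assuming (as is standard for this setup, though the report itself is not reproduced here) that the matrix-beta prior factors into an independent Dirichlet distribution over each row of the transition matrix. Under that assumption, the recursive learning relation reduces to adding observed transition counts to the prior parameters, and the chain-identification decision rule reduces to comparing sequential posterior-predictive likelihoods. The function names below are hypothetical, not from the report.

```python
import numpy as np

def update_matrix_beta(alpha, states):
    """Recursive Bayesian update of a matrix-beta (row-wise Dirichlet)
    prior over Markov transition probabilities, given a classified
    (supervised) sequence of observed states.

    alpha  : (n, n) array of Dirichlet parameters, one row per state
    states : sequence of observed state indices
    """
    alpha = alpha.astype(float).copy()
    for i, j in zip(states[:-1], states[1:]):
        alpha[i, j] += 1  # conjugate update: add one i -> j transition count
    return alpha

def posterior_mean(alpha):
    """Posterior mean estimate of each transition probability."""
    return alpha / alpha.sum(axis=1, keepdims=True)

def log_evidence(alpha, states):
    """Log posterior-predictive probability of a state sequence under a
    matrix-beta posterior (the sequential Polya-urn product).

    For the chain-identification problem, score each candidate chain's
    posterior with this function and pick the chain with the highest
    score (or compare against priors on which chain is active).
    """
    a = alpha.astype(float).copy()
    logp = 0.0
    for i, j in zip(states[:-1], states[1:]):
        logp += np.log(a[i, j] / a[i].sum())
        a[i, j] += 1  # predictive distribution also updates recursively
    return logp
```

For example, starting from a uniform prior `np.ones((2, 2))` and observing the state sequence `[0, 1, 0, 1]`, the posterior counts become 3 for the 0 to 1 transition and 2 for the 1 to 0 transition, giving a posterior-mean estimate of 0.75 for the 0 to 1 transition probability.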

Subject Categories:

  • Personnel Management and Labor Relations
  • Cybernetics
  • Bionics

Distribution Statement:

APPROVED FOR PUBLIC RELEASE