Accession Number:

ADA270762

Title:

Equivalence and Reduction of Hidden Markov Models

Descriptive Note:

Technical report

Corporate Author:

MASSACHUSETTS INST OF TECH CAMBRIDGE ARTIFICIAL INTELLIGENCE LAB

Personal Author(s):

Report Date:

1993-01-01

Pagination or Media Count:

89

Abstract:

Hidden Markov Models (HMMs) are one of the most popular and successful techniques used in statistical pattern recognition. However, they are not well understood on a fundamental level; for example, we do not know how to characterize the class of processes that can be well approximated by HMMs. This thesis tries to uncover the source of the intrinsic expressiveness of HMMs by studying when and why two models may represent the same stochastic process. We define two statistical models to be equivalent if they model exactly the same process. Using the theorems proved in this thesis, we develop polynomial-time algorithms to detect equivalence of prior distributions on an HMM, equivalence of HMMs, and equivalence of HMMs with fixed priors. We characterize Hidden Markov Models in terms of equivalence classes whose elements represent exactly the same processes, and we describe an algorithm that reduces HMMs to essentially unique, minimal canonical representations. These canonical forms are essentially the smallest representatives of their equivalence classes, and the number of parameters describing them can be taken as a measure of the complexity of the stochastic process they model. On the way to developing our reduction algorithm, we define Generalized Markov Models, which relax the positivity constraint on HMM parameters. This generalization reflects the view that interpreting model parameters as probabilities is less important than obtaining a parsimonious representation of stochastic processes.
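
For illustration only: one standard way to realize a polynomial-time equivalence check of the kind the abstract describes is a spanning-set test over the joint forward vectors of the two models, in the style of Tzeng's algorithm for probabilistic automata. The sketch below is not the thesis's specific construction; it assumes state-emitting HMMs given as NumPy arrays (initial distribution pi, transition matrix A, emission matrix B, with the destination state emitting the symbol), and all names (per_symbol_matrices, hmms_equivalent) are hypothetical.

```python
import numpy as np

def per_symbol_matrices(A, B):
    """Combine transition matrix A (n x n) and emission matrix B (n x k) into
    per-symbol matrices T[s] with T[s][i, j] = A[i, j] * B[j, s], so that
    P(first m symbols are w) = pi @ T[w_1] @ ... @ T[w_m] @ ones(n).
    (Convention assumed here: the destination state emits the symbol.)"""
    n, k = B.shape
    return [A * B[:, s][np.newaxis, :] for s in range(k)]

def hmms_equivalent(pi1, A1, B1, pi2, A2, B2, tol=1e-9):
    """Spanning-set equivalence test for two HMMs over the same alphabet.
    Returns True iff the two models assign (numerically) equal probability
    to every finite observation string."""
    T1 = per_symbol_matrices(A1, B1)
    T2 = per_symbol_matrices(A2, B2)
    k = len(T1)
    n1, n2 = len(pi1), len(pi2)
    # v(w) = (pi1 @ T1_w, pi2 @ T2_w); the models agree on prefix w exactly
    # when v(w) @ d == 0 with d = (+1, ..., +1, -1, ..., -1).
    d = np.concatenate([np.ones(n1), -np.ones(n2)])

    basis = []                               # orthonormal span of seen v(w)
    queue = [np.concatenate([pi1, pi2])]     # start from the empty prefix
    while queue:
        v = queue.pop()
        if abs(v @ d) > tol:
            return False                     # some string probability differs
        # Gram-Schmidt: keep v only if it adds a new direction to the span.
        r = v.copy()
        for b in basis:
            r -= (r @ b) * b
        norm = np.linalg.norm(r)
        if norm > tol:
            basis.append(r / norm)
            # Extend the prefix w by every symbol s: compute v(ws) from v(w).
            for s in range(k):
                queue.append(np.concatenate([v[:n1] @ T1[s], v[n1:] @ T2[s]]))
    return True

# Example: a 2-state model and a state-permuted copy of it are equivalent.
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3], [0.2, 0.8]])
B  = np.array([[0.9, 0.1], [0.4, 0.6]])
P  = np.array([[0.0, 1.0], [1.0, 0.0]])      # permutation of the two states
print(hmms_equivalent(pi, A, B, pi @ P, P.T @ A @ P, P.T @ B))   # True
```

The loop terminates because the basis of forward vectors can contain at most n1 + n2 elements, which bounds the number of prefix extensions and keeps the test polynomial in the sizes of the two models; the dimension of the span these vectors generate is also the quantity a reduction to a minimal canonical form would aim to expose.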

Subject Categories:

  • Statistics and Probability
  • Cybernetics

Distribution Statement:

APPROVED FOR PUBLIC RELEASE