Accession Number:
ADA276516
Title:
Hierarchical Mixtures of Experts and the EM Algorithm
Descriptive Note:
Memorandum rept.
Corporate Author:
MASSACHUSETTS INST OF TECH CAMBRIDGE ARTIFICIAL INTELLIGENCE LAB
Personal Author(s):
Michael I. Jordan; Robert A. Jacobs
Report Date:
1993-08-06
Pagination or Media Count:
31
Abstract:
We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIMs). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
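The EM fit described in the abstract can be illustrated with a much-simplified sketch: a single-level mixture of linear-regression experts with input-independent mixing coefficients, rather than the hierarchical gating GLIMs of the paper. All data, initial values, and variable names below are illustrative assumptions, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: two linear regimes (not from the report)
n = 400
x = rng.uniform(-1, 1, size=(n, 1))
X = np.hstack([x, np.ones((n, 1))])           # design matrix with bias column
regime = (x[:, 0] > 0).astype(int)            # hidden regime label
true_w = np.array([[2.0, 1.0], [-3.0, 0.5]])  # (slope, intercept) per expert
y = (X @ true_w.T)[np.arange(n), regime] + 0.05 * rng.standard_normal(n)

K = 2
W = np.array([[1.0, 0.0], [-1.0, 0.0]])  # asymmetric init to break symmetry
pi = np.full(K, 1.0 / K)                 # mixing coefficients (constant here)
sigma2 = np.ones(K)                      # per-expert noise variances

for _ in range(50):
    # E-step: posterior responsibility of each expert for each data point
    mu = X @ W.T                                        # (n, K) expert means
    log_lik = -0.5 * ((y[:, None] - mu) ** 2 / sigma2
                      + np.log(2 * np.pi * sigma2))
    log_post = np.log(pi) + log_lik
    log_post -= log_post.max(axis=1, keepdims=True)     # numerical stability
    r = np.exp(log_post)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: responsibility-weighted least squares per expert,
    # then closed-form updates for pi and sigma2
    for k in range(K):
        Rk = r[:, k]
        A = X.T @ (Rk[:, None] * X)
        b = X.T @ (Rk * y)
        W[k] = np.linalg.solve(A, b)
        resid = y - X @ W[k]
        sigma2[k] = (Rk * resid ** 2).sum() / Rk.sum()
    pi = r.mean(axis=0)
```

In the full architecture of the report, the mixing coefficients `pi` would themselves be the output of a gating GLIM evaluated at each input, and the E- and M-steps would be applied recursively down the tree.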
Descriptors:
Subject Categories:
- Statistics and Probability
- Cybernetics