Accession Number:

ADA461913

Title:

Adaptive Importance Sampling for Uniformly Recurrent Markov Chains

Descriptive Note:

Corporate Author:

BROWN UNIV PROVIDENCE RI LEFSCHETZ CENTER FOR DYNAMICAL SYSTEMS

Personal Author(s):

Report Date:

2003-01-01

Pagination or Media Count:

39

Abstract:

Importance sampling is a variance reduction technique for the efficient estimation of rare-event probabilities by Monte Carlo simulation. In standard importance sampling schemes, the system is simulated under an a priori fixed change of measure suggested by a large deviation lower bound analysis. Recent work, however, has shown that such schemes do not perform well in many situations. In this paper, we consider adaptive importance sampling in the setting of uniformly recurrent Markov chains. By adaptive, we mean that the change of measure depends on the history of the samples. Based on a control-theoretic approach to large deviations, the existence of asymptotically optimal adaptive schemes is demonstrated in great generality. In this framework, the difference between a static change of measure and an adaptive change of measure is analogous to the difference between an open-loop control and a feedback control. The implementation of the adaptive schemes is carried out with the help of a limiting Bellman equation. Also presented are numerical examples contrasting the adaptive and standard schemes.
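To illustrate the standard (static) approach the abstract contrasts with, here is a minimal sketch of importance sampling for a Gaussian tail probability, using the classical exponentially tilted proposal suggested by large-deviation analysis. This toy example is not from the report; the event, proposal, and parameter choices are illustrative assumptions.

```python
import math
import random

def is_estimate(a=4.0, n=100_000, seed=0):
    """Static importance-sampling estimate of p = P(Z >= a), Z ~ N(0, 1).

    A fixed (a priori) change of measure is used: sample from the tilted
    proposal N(a, 1), under which the rare event {x >= a} is no longer rare,
    and reweight each sample by the likelihood ratio dN(0,1)/dN(a,1).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(a, 1.0)  # draw under the proposal N(a, 1)
        if x >= a:
            # likelihood ratio phi(x) / phi_a(x) = exp(-a*x + a^2/2)
            total += math.exp(-a * x + a * a / 2)
    return total / n

def normal_tail(a):
    """Exact tail probability P(Z >= a) for comparison, via erfc."""
    return 0.5 * math.erfc(a / math.sqrt(2))
```

With a = 4 the event has probability about 3.2e-5, so naive Monte Carlo with 100,000 samples would see only a handful of hits, while the tilted estimator achieves a relative error well under one percent. An adaptive scheme in the sense of the paper would instead update the tilting parameter as the simulation proceeds, playing the role of a feedback control.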

Subject Categories:

  • Statistics and Probability

Distribution Statement:

APPROVED FOR PUBLIC RELEASE