Accession Number:

AD0489159

Title:

OPTIMAL CONTROL OF CONTINUOUS-TIME STOCHASTIC SYSTEMS.

Descriptive Note:

Research rept.

Corporate Author:

CALIFORNIA UNIV BERKELEY ELECTRONICS RESEARCH LAB

Personal Author(s):

Report Date:

1966-08-19

Pagination or Media Count:

108

Abstract:

This report is concerned with determining the optimal feedback control for continuous-time, continuous-state, stochastic, nonlinear, dynamic systems when only noisy observations of the state are available. At each instant of time, the current value of the control is a functional of the entire past history of the observations. The principal mathematical apparatus used in this investigation is the following: (1) the theory of probability measures and integration on infinite-dimensional function spaces; (2) the Itô stochastic calculus for differentiation and integration of random functions; (3) the Fréchet derivative of a functional on an infinite-dimensional function space; and (4) dynamic programming. In Sections I and II, items (1) and (2) above are used to establish rigorous sufficient conditions for the existence of a conditional probability density for the current state of the system given the entire past history of the observations. A rigorous derivation is then given of a stochastic integral equation which is obeyed by an unnormalized version of the desired conditional density. In Section III, items (3) and (4) above are used heuristically to obtain a stochastic Hamilton-Jacobi equation in function space. It is shown that the solution of this equation would yield the desired feedback control. (Author)
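The stochastic integral equation for the unnormalized conditional density described in the abstract is, in modern terminology, of Zakai type. A minimal sketch in present-day notation, under standard assumptions (independent Wiener noises, unit observation-noise covariance); all symbols here are ours, not the report's:

```latex
% Assumed state and observation models (notation not from the report):
\[
dx_t = f(x_t, u_t)\,dt + G(x_t)\,dw_t, \qquad
dz_t = h(x_t)\,dt + dv_t,
\]
% with $w_t$, $v_t$ independent Wiener processes. An unnormalized
% conditional density $q(x,t)$ of the state $x_t$ given the observation
% history $\{z_s : s \le t\}$ then satisfies a stochastic integral
% equation, written here in differential form:
\[
dq(x,t) = \mathcal{L}^{*} q(x,t)\,dt + q(x,t)\,h(x)^{\mathsf T}\,dz_t,
\]
% where $\mathcal{L}^{*}$ is the forward Kolmogorov (Fokker--Planck)
% operator of the controlled state diffusion. The normalized conditional
% density is recovered as $q(x,t) / \int q(\xi,t)\,d\xi$.
```

Working with the unnormalized density is what makes the evolution equation linear in $q$; normalization at the end recovers the true conditional density without changing the optimal control it induces.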

Subject Categories:

  • Statistics and Probability

Distribution Statement:

APPROVED FOR PUBLIC RELEASE