# Accession Number:

## AD0668749

# Title:

## OPTIMAL CONTROL OF A DISCRETE TIME STOCHASTIC SYSTEM LINEAR IN THE STATE

# Corporate Author:

## RAND Corp., Santa Monica, California

# Report Date:

## 1968-04-01

# Pagination or Media Count:

## 12

# Abstract:

Consideration is given to a discrete time, finite horizon stochastic control problem whose dynamic equations and loss function are linear in the state vector with random coefficients, but which may vary in a nonlinear, random manner with the control variables. The controls are constrained to lie in a given set. For this system we have the rather surprising result that the optimal control, or policy, is independent of the value of the state. This fact follows from a simple dynamic programming argument concerning the form of the optimal return function. Under suitable restrictions on the functions, the dynamic programming approach leads to efficient computational methods for obtaining the controls via a sequence of mathematical programming problems in fewer variables than the number of controls in the entire process. The result extends the notion of certainty equivalence for a sequential stochastic decision problem. The expectations of the random functions play the role of certainty equivalents in the sense that the optimal control can be found by solving a deterministic problem in which expectations replace random quantities. (Author)
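The core claims of the abstract can be sketched numerically. The example below is a minimal single-stage illustration, not taken from the report: the coefficients, the two-point distribution, and the quadratic-in-control cost are all assumptions chosen for clarity. Because the cost is linear in the state x (with random coefficient), the state contributes only an additive constant to the minimization over the control u, so the minimizer is the same for every state; and because the cost is linear in the random coefficients, replacing them by their expectations yields the same optimal control.

```python
import numpy as np

# One-stage cost (illustrative): cost(x, u, w) = c(w)*x + d(w)*u**2 + e(w)*u
# Linear in the state x, nonlinear (quadratic) in the control u, with
# random coefficients drawn from a two-point distribution.
scenarios = [  # (probability, c, d, e) -- hypothetical values
    (0.5,  2.0, 1.0, -4.0),
    (0.5, -1.0, 3.0,  2.0),
]

def expected_cost(x, u):
    """Exact expectation of the cost over the discrete scenarios."""
    return sum(p * (c * x + d * u**2 + e * u) for p, c, d, e in scenarios)

# Admissible control set, discretized to a grid for brute-force search.
controls = np.linspace(-3.0, 3.0, 601)

def best_control(x):
    """Minimize the expected cost over the control grid for a given state."""
    return controls[np.argmin([expected_cost(x, u) for u in controls])]

# The optimal control is the same from very different states: the
# state-dependent term E[c]*x is constant with respect to u.
u_star_at_0 = best_control(0.0)
u_star_at_100 = best_control(100.0)

# Certainty-equivalent problem: replace the random coefficients by their
# expectations and solve the resulting deterministic problem.
d_bar = sum(p * d for p, _, d, _ in scenarios)  # E[d] = 2.0
e_bar = sum(p * e for p, _, _, e in scenarios)  # E[e] = -1.0
u_ce = controls[np.argmin([d_bar * u**2 + e_bar * u for u in controls])]

print(u_star_at_0, u_star_at_100, u_ce)  # all three coincide
```

Here the deterministic minimizer of E[d]·u² + E[e]·u is u* = −E[e]/(2·E[d]) = 0.25, and the grid search over the full stochastic problem recovers the same control regardless of the state, consistent with the certainty-equivalence interpretation in the abstract.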

# Subject Categories:

- Operations Research