Accession Number:

AD0431055

Title:

APPLICATION OF DYNAMIC PROGRAMMING TO STOCHASTIC TIME OPTIMAL CONTROL,

Descriptive Note:

Corporate Author:

SYSTEM DEVELOPMENT CORP SANTA MONICA CALIF

Personal Author(s):

Report Date:

1964-01-31

Pagination or Media Count:

13

Abstract:

A non-linear control process is discussed in which the control is bounded in absolute value. Random noise is assumed to enter additively as part of the control variable. The performance criterion is that of driving the system from its present perturbed state back to equilibrium in minimum expected time, the expectation arising from the presence of the random noise. The principle of optimality of dynamic programming is applied to derive a novel partial differential equation in the minimum expected time. Solutions of this equation yield the optimal control policy, which is bang-bang. Specifically, far from the origin of the corresponding phase space, the control is set, once only, to drive the system into the linear region near the origin. In the linear region, the control switching sequence corresponding to Bushaw's theorem ensues. (Author)
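The report itself is not reproduced here, but the bang-bang switching behavior it describes can be illustrated on the classic noiseless example underlying Bushaw's theorem: time-optimal control of a double integrator x1' = x2, x2' = u with |u| <= 1. The switching-curve formula and the simulation below are a minimal illustrative sketch of that standard textbook problem, not code or notation from the report; the function names and tolerances are assumptions.

```python
import math

# Illustrative sketch (not from the report): bang-bang time-optimal control
# of a noiseless double integrator x1' = x2, x2' = u, with |u| <= 1.
# The classic switching curve is x1 = -x2*|x2|/2; the optimal policy is
# u = -sign(x1 + x2*|x2|/2), sliding along the curve into the origin.

def bang_bang_u(x1: float, x2: float) -> float:
    """Time-optimal control for the double integrator, saturated at +/-1."""
    s = x1 + x2 * abs(x2) / 2.0  # switching function
    if s > 0.0:
        return -1.0
    if s < 0.0:
        return 1.0
    # On the switching curve, drive x2 toward zero.
    return -math.copysign(1.0, x2) if x2 != 0.0 else 0.0

def simulate(x1: float, x2: float, dt: float = 1e-3, t_max: float = 20.0):
    """Euler-integrate the closed loop until the state nears the origin."""
    t = 0.0
    while t < t_max and (abs(x1) > 1e-3 or abs(x2) > 1e-3):
        u = bang_bang_u(x1, x2)
        x1 += x2 * dt
        x2 += u * dt
        t += dt
    return t, x1, x2

if __name__ == "__main__":
    t, x1f, x2f = simulate(2.0, 0.0)
    print(f"reached origin region in t = {t:.2f} (analytic minimum 2*sqrt(2))")
```

Starting from (2, 0), the control switches exactly once, matching the abstract's description of a single switch followed by the Bushaw-type sequence near the origin; the stochastic version treated in the report replaces this deterministic time with a minimum *expected* time.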

Subject Categories:

Distribution Statement:

APPROVED FOR PUBLIC RELEASE