OPTIMAL REGULATION OF NONLINEAR DYNAMICAL SYSTEMS.
Technical summary report.
WISCONSIN UNIV MADISON MATHEMATICS RESEARCH CENTER
The paper develops a theory of optimal control for processes described by autonomous systems of nonlinear ordinary differential equations. The admissible controls are feedback devices that operate on the sensed state of the system to automatically generate control signals returning the system to a prescribed equilibrium state whenever an impulsive disturbance occurs. An optimal control is defined by means of a performance integral that provides a basis of comparison between certain feedback controls. Under the assumption that the process is stabilizable, the existence and uniqueness of an optimal control are proved. Both C^ω and C^2 systems are treated, and several examples are discussed.
- Operations Research
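The abstract's notion of ranking feedback controls by a performance integral can be illustrated with a small numerical sketch. The scalar system x' = x - x^3 + u, the linear feedback laws u = -kx, and the quadratic integrand x^2 + u^2 below are illustrative assumptions, not taken from the report; the sketch only shows how two stabilizing feedback controls can be compared by the value of such an integral after an impulsive disturbance sets x(0) = 1.

```python
# Illustrative sketch (assumed example system, not from the report):
# compare two feedback laws u = -k*x for the nonlinear system
# x' = x - x^3 + u via the performance integral J = ∫ (x^2 + u^2) dt,
# evaluated after an impulsive disturbance moves the state to x(0) = 1.

def performance_integral(gain, x0=1.0, dt=1e-3, t_final=10.0):
    """Euler-integrate the closed loop x' = x - x^3 - gain*x and
    accumulate the running cost x^2 + u^2; returns the approximate J."""
    x, J = x0, 0.0
    steps = int(t_final / dt)
    for _ in range(steps):
        u = -gain * x              # linear feedback law u = -k x
        J += (x * x + u * u) * dt  # performance integrand
        x += (x - x**3 + u) * dt   # Euler step of the nonlinear dynamics
    return J

# Any gain k > 1 stabilizes the origin (the linearization is x' = (1-k)x),
# so both controls return the system to equilibrium; the performance
# integral then provides the basis of comparison between them.
J2 = performance_integral(2.0)
J5 = performance_integral(5.0)
```

Here the smaller value of J identifies the preferred feedback law; a larger gain drives the state to equilibrium faster but spends more control effort, and the integral weighs the two effects against each other.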