Accession Number:

AD0678055

Title:

OPTIMUM SYSTEMS CONTROL

Descriptive Note:

Corporate Author:

SOUTHERN METHODIST UNIV DALLAS TX INFORMATION AND CONTROL SCIENCES CENTER

Personal Author(s):

Report Date:

1968-06-01

Pagination or Media Count:

575

Abstract:

The book contains a comprehensive, up-to-date introduction to the basic concepts and principles employed in the optimization, estimation, and control of dynamic systems. The text contains fifteen chapters: (1) Introduction; (2) Calculus of extrema and single-stage decision processes; (3) Variational calculus and continuous optimal control; (4) The maximum principle and Hamilton-Jacobi theory; (5) Optimum systems control examples; (6) Discrete variational calculus and the discrete maximum principle; (7) Optimum control of distributed-parameter systems; (8) Optimum state estimation in linear stationary systems; (9) Optimum filtering for nonstationary continuous systems; (10) Least-squares curve fitting and state estimation in discrete linear systems; (11) Controllability and observability: the separation theorem; (12) Sensitivity in optimum systems control; (13) Direct computational methods in optimum systems control; (14) Quasilinearization; (15) Invariant imbedding. (Author)
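As a point of reference for the estimation chapters, least-squares curve fitting (the technique named in chapter 10) reduces to solving the normal equations A^T A x = A^T y for a linear-in-parameters model. The following is a minimal illustrative sketch in NumPy; the polynomial model and sample data are assumptions for demonstration and are not taken from the book.

```python
import numpy as np

def least_squares_fit(t, y, degree):
    """Fit a polynomial of the given degree to samples (t, y)
    in the least-squares sense, i.e. minimize ||A x - y||^2."""
    # Design matrix with columns 1, t, t^2, ..., t^degree.
    A = np.vander(t, degree + 1, increasing=True)
    # lstsq solves the least-squares problem via SVD, which is
    # numerically preferable to forming A^T A explicitly.
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x

# Noise-free samples of y = 2 + 3t recover the coefficients exactly.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * t
coeffs = least_squares_fit(t, y, degree=1)
```

With noisy measurements the same call returns the best linear unbiased estimate under the usual independent, equal-variance error assumptions, which is the bridge to the discrete state-estimation material.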

Subject Categories:

  • Statistics and Probability
  • Operations Research
  • Cybernetics

Distribution Statement:

APPROVED FOR PUBLIC RELEASE