Accession Number:

AD0692476

Title:

SINGULAR TIME OPTIMAL CONTROL.

Descriptive Note:

Technical rept.

Corporate Author:

PURDUE UNIV LAFAYETTE IND SCHOOL OF MECHANICAL ENGINEERING

Personal Author(s):

Report Date:

1969-09-01

Pagination or Media Count:

33

Abstract:

A definition of singular time-optimal control that differs slightly from the usual sense is given here: a system is said to be singular if more than one optimal control exists for a given boundary condition. The characteristics of the newly defined singular control problem are studied for systems that are linear, time-invariant, and expressed by vector differential equations. The controls are Lebesgue-measurable functions belonging to a compact convex set. It is demonstrated through an example that Pontryagin's Minimum Principle is useful for the singular control problem; the minimum principle contains implicit information regarding the singular system. A standard procedure for solving the singular problem is presented, and methods of linear algebra are used to analyze the system. (Author)
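For orientation, a minimal sketch of the time-optimal setting the abstract refers to, assuming the conventional linear time-invariant formulation (the symbols A, B, U, p, and the Hamiltonian H below are standard notation, not taken from the report itself):

\[
\dot{x}(t) = A x(t) + B u(t), \qquad u(t) \in U \ \text{(compact, convex)}, \qquad x(0) = x_0, \quad x(t_f) = x_f,
\]
with the objective of minimizing the final time \(t_f\). Pontryagin's Minimum Principle introduces an adjoint \(p(t)\) satisfying
\[
H(x,u,p) = 1 + p^{\mathsf{T}}(A x + B u), \qquad \dot{p}(t) = -A^{\mathsf{T}} p(t), \qquad u^*(t) \in \arg\min_{u \in U} \; p(t)^{\mathsf{T}} B\, u \quad \text{a.e.}
\]

In the usual terminology, a singular arc arises when \(p(t)^{\mathsf{T}} B\) vanishes on an interval, so the pointwise minimization fails to determine \(u^*\) uniquely; the report's definition instead calls the system singular when more than one optimal control satisfies the given boundary condition.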

Subject Categories:

  • Theoretical Mathematics
  • Test Facilities, Equipment and Methods

Distribution Statement:

APPROVED FOR PUBLIC RELEASE