SINGULAR TIME OPTIMAL CONTROL.
PURDUE UNIV LAFAYETTE IND SCHOOL OF MECHANICAL ENGINEERING
A definition of singular time-optimal control that differs slightly from the usual one is given here. A system is said to be singular if more than one optimal control exists for a given boundary condition. The characteristics of the singular control problem so defined are studied under the assumptions that the system is linear, time-invariant, and expressed by vector differential equations, and that the controls are Lebesgue-measurable functions taking values in a compact convex set. It is demonstrated through examples that Pontryagin's Minimum Principle is useful for the singular control problem; the minimum principle contains implicit information about the singular system. A standard procedure for solving the singular problem is presented. Methods of linear algebra are used to analyze the system. (Author)
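The following is a minimal sketch, not taken from the report, illustrating the definition of singularity used above: a linear time-invariant system for which more than one time-optimal control exists for the same boundary condition. For the system x1' = u1, x2' = u2 with controls constrained to the compact convex set [-1, 1] x [-1, 1], steering (2, 1) to the origin takes at least T = max(2, 1) = 2, since |x1'| <= 1; two distinct admissible controls both achieve T = 2. The minimum principle (Hamiltonian H = 1 + p1*u1 + p2*u2, minimized by u_i = -sign(p_i)) leaves u2 undetermined when p2 vanishes, which is the implicit information the abstract refers to. The function names below are illustrative, not from the report.

```python
def simulate(control, x0=(2.0, 1.0), T=2.0, dt=0.01):
    """Forward-Euler integration of x' = u(t); exact for piecewise-constant u."""
    x1, x2 = x0
    steps = int(round(T / dt))
    for k in range(steps):
        u1, u2 = control(k * dt)
        x1 += dt * u1
        x2 += dt * u2
    return x1, x2

# Control A: constant u = (-1, -0.5) on [0, 2].
control_a = lambda t: (-1.0, -0.5)

# Control B: u = (-1, -1) on [0, 1), then u = (-1, 0) on [1, 2].
control_b = lambda t: (-1.0, -1.0 if t < 1.0 else 0.0)

# Both controls steer (2, 1) to (approximately) the origin at t = 2,
# so the time-optimal control is not unique: the problem is singular
# in the sense defined in the abstract.
xa = simulate(control_a)
xb = simulate(control_b)
print(xa, xb)
```

Since no control can beat T = 2 (the x1 coordinate alone needs 2 seconds), both trajectories above are time-optimal, exhibiting the non-uniqueness that the report's definition of singularity captures.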
- Theoretical Mathematics
- Test Facilities, Equipment and Methods