Differential Games, Optimal Control and Directional Derivatives of Viscosity Solutions of Bellman's and Isaacs' Equations.
Technical summary report
WISCONSIN UNIV-MADISON MATHEMATICS RESEARCH CENTER
Recent work by the authors and others has demonstrated the connection between the dynamic programming approach to optimal control and to two-person, zero-sum differential game problems on the one hand, and the notion of viscosity solutions of Hamilton-Jacobi PDEs introduced by M. G. Crandall and P. L. Lions on the other. In particular, it has been proved that the dynamic programming principle implies that the value function is the viscosity solution of the associated Hamilton-Jacobi-Bellman and Isaacs equations. In the present work, it is shown that viscosity super- and subsolutions of these equations must satisfy inequalities called the super- and subdynamic programming principles, respectively. This is then used to prove the equivalence between the notion of viscosity solutions and conditions, introduced by A. Subbotin, concerning the sign of certain generalized directional derivatives.
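For reference, a sketch of the standard Crandall-Lions definitions the abstract relies on, stated for a generic first-order equation; the Hamiltonian H and domain Ω here are placeholders, not the specific Bellman or Isaacs Hamiltonians of the report:

```latex
% Viscosity sub- and supersolutions of H(x, u, Du) = 0 on an open set \Omega.
% u \in C(\Omega) is a \emph{viscosity subsolution} if, for every
% \varphi \in C^1(\Omega) such that u - \varphi attains a local maximum
% at x_0 \in \Omega,
\[
  H\bigl(x_0,\, u(x_0),\, D\varphi(x_0)\bigr) \;\le\; 0 .
\]
% u is a \emph{viscosity supersolution} if, for every \varphi \in C^1(\Omega)
% such that u - \varphi attains a local minimum at x_0 \in \Omega,
\[
  H\bigl(x_0,\, u(x_0),\, D\varphi(x_0)\bigr) \;\ge\; 0 .
\]
% u is a \emph{viscosity solution} if it is both a sub- and a supersolution.
```

The report's result is that subsolutions in this sense satisfy a one-sided (sub)dynamic programming inequality and supersolutions the opposite one-sided (super)inequality, which together recover the full dynamic programming principle for the value function.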
- Numerical Mathematics