The Theory of Stochastic Games with Zero Stop Probabilities.
Abstract:
The authors study two-person, zero-sum stochastic games with zero stop probabilities. Two distinct formulations are emphasized: (1) the infinite-stage game with payoffs discounted at an interest rate close to zero, and (2) the game with a large but finite number of stages. The authors give a complete theory of such games. The work implies all known existence theorems for optimal policies in Markov decision processes, and it generalizes all previous existence theorems for the value of a stochastic game. The approach differs from previous work in that it is algebraic and makes no use of the theory of Markov chains. The essential idea is to apply Tarski's principle on real closed fields to a field of asymptotic expansions, which the authors term the field of real Puiseux series.
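As background (not part of the original abstract), the field of real Puiseux series referred to above is the standard one: formal series in fractional powers of a variable $t$ with a common ramification index. A minimal sketch of the definition and its role:

```latex
% The field of real Puiseux series in t:
%   F = { sum_{i >= i_0} a_i t^{i/M} : i_0 in Z, M in N, a_i in R },
% i.e. Laurent-type series in t^{1/M} for some positive integer M.
\[
  F \;=\; \Bigl\{\, \sum_{i \ge i_0} a_i\, t^{i/M}
     \;:\; i_0 \in \mathbb{Z},\; M \in \mathbb{N},\; a_i \in \mathbb{R} \,\Bigr\}.
\]
% F is a real closed field. By Tarski's principle, every first-order
% sentence in the language of ordered fields that holds over R also
% holds over F; statements about the game with a fixed discount rate
% can therefore be transferred to F, yielding expansions of the
% discounted value in fractional powers of the discount rate.
```

Interpreting the formal variable $t$ as the discount (interest) rate makes the connection to formulation (1): the value of the discounted game admits an expansion in fractional powers of the rate near zero.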