Planning Under Uncertainty: Moving Forward
Abstract:
Reasoning about uncertainty is an essential component of many real-world planning problems, such as robotic and space applications, military operations planning, air and ground traffic control, and manufacturing systems. Planning under uncertainty focuses on how to generate plans that will be executed in environments where actions have nondeterministic effects (i.e., actions may have more than one possible outcome) and the states of the world are not always fully observable. The two predominant approaches for planning under uncertainty are based on Markov Decision Processes (MDPs) and Symbolic Model Checking. Despite recent advances in these approaches, the problem of how to plan under uncertainty remains very hard: the planning algorithms must reason about more than one possible execution path in the world, and the sizes of the solution plans may grow exponentially. In planning environments that do not admit full observability, the complexity of planning increases by an additional exponential factor, since the planner does not know the exact state of the world and therefore must reason over the set of all states that it believes it may be in. This dissertation describes a suite of new planning algorithms for planning under uncertainty with the assumption of full observability. The new algorithms are much more efficient than the previous techniques; in some cases, they find solutions exponentially faster than the previous ones.
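To make the challenge concrete, the following is a minimal sketch (not taken from the dissertation) of value iteration on a tiny, hypothetical fully observable MDP. It illustrates the point made above: at every state the planner must average over all nondeterministic outcomes of each action, so the reasoning effort grows with the number of possible execution paths. The states, transition probabilities, and rewards below are invented for illustration.

```python
# Hypothetical nondeterministic MDP:
# T[state][action] = list of (probability, next_state) outcome pairs.
T = {
    "s0":   {"go": [(0.8, "goal"), (0.2, "s1")], "wait": [(1.0, "s0")]},
    "s1":   {"go": [(0.5, "goal"), (0.5, "s0")], "wait": [(1.0, "s1")]},
    "goal": {},  # absorbing goal state, no actions
}
R = {"goal": 1.0}   # reward received only at the goal
GAMMA = 0.9         # discount factor

def value_iteration(eps=1e-6):
    """Compute state values by repeated Bellman backups until convergence."""
    V = {s: 0.0 for s in T}
    while True:
        delta = 0.0
        for s in T:
            if not T[s]:  # absorbing state: its value is just its reward
                new_v = R.get(s, 0.0)
            else:
                # Bellman backup: best action, where each action's value is
                # the expectation over its nondeterministic outcomes.
                new_v = R.get(s, 0.0) + GAMMA * max(
                    sum(p * V[ns] for p, ns in outcomes)
                    for outcomes in T[s].values()
                )
            delta = max(delta, abs(new_v - V[s]))
            V[s] = new_v
        if delta < eps:
            return V

values = value_iteration()
# Extract a greedy policy: pick the action with the best expected value.
policy = {
    s: max(T[s], key=lambda a: sum(p * values[ns] for p, ns in T[s][a]))
    for s in T if T[s]
}
```

In this toy example both non-goal states prefer "go", since its expected value dominates waiting. Under partial observability, as the abstract notes, the same backup would have to be performed over *belief states* (probability distributions or sets of possible world states), which is what adds the extra exponential factor.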