Accession Number:

AD1053963

Title:

Determination of Fire Control Policies via Approximate Dynamic Programming

Descriptive Note:

Technical Report, 01 Sep 2014 to 24 Mar 2016

Corporate Author:

AIR FORCE INSTITUTE OF TECHNOLOGY, WRIGHT-PATTERSON AFB, OH, United States

Personal Author(s):

Report Date:

2016-03-24

Pagination or Media Count:

71

Abstract:

Given the ubiquitous nature of offensive and defensive missile systems, the catastrophe-causing potential they represent, and the limited resources available to countries for missile defense, optimizing the response to a missile attack is a necessary endeavor. For a single salvo of offensive missiles launched at a set of targets, a missile defense system must decide how many interceptors to fire at each missile. Since such missile engagements often involve the firing of more than one attack salvo, we develop a Markov decision process (MDP) model to examine the optimal fire control policy for the defender. Because exact solution methods are computationally intractable for all but the smallest instances, we utilize an approximate dynamic programming (ADP) approach to explore the efficacy of applying approximate methods. We obtain policy insights by analyzing subsets of the state space that reflect a range of defender interceptor inventories. Testing of four scenarios demonstrates that the ADP policy provides high-quality decisions for a majority of the state space, achieving a 7.74% mean optimality gap. Moreover, the ADP algorithm requires only a few minutes of computation versus 12 hours for the exact dynamic programming (DP) algorithm, providing a method to address more complex and realistically sized instances.
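
The report itself is not reproduced in this record, but the MDP/ADP setup described in the abstract can be illustrated with a minimal, hypothetical sketch: a toy salvo-defense MDP (state = interceptors remaining and salvos remaining; action = interceptors fired per incoming missile) solved by approximate value iteration with a linear value-function approximation. All parameters (kill probability, salvo size, leaker penalty, basis functions) are illustrative assumptions, not the report's actual formulation.

```python
# Hedged sketch (assumed parameters, not the report's model): toy salvo-defense MDP
# solved with approximate value iteration over a linear value-function approximation.
import numpy as np

rng = np.random.default_rng(0)

P_KILL = 0.7        # single-interceptor kill probability (assumed)
SALVO_SIZE = 4      # missiles per attack salvo (assumed)
LEAKER_COST = 10.0  # penalty per missile that reaches its target (assumed)
MAX_SHOTS = 3       # max interceptors fired at each incoming missile (assumed)

def features(inv, salvos):
    """Basis functions for the linear value-function approximation (illustrative)."""
    return np.array([1.0, inv, salvos, inv * salvos, min(inv, SALVO_SIZE * salvos)])

def sample_transition(inv, salvos, shots):
    """Simulate one salvo: fire `shots` interceptors at each of SALVO_SIZE missiles."""
    fired = min(inv, shots * SALVO_SIZE)
    per_missile = fired // SALVO_SIZE
    p_leak = (1 - P_KILL) ** per_missile      # probability a missile survives its engagement
    leakers = rng.binomial(SALVO_SIZE, p_leak)
    return inv - fired, salvos - 1, -LEAKER_COST * leakers

def approx_value_iteration(max_inv=30, n_salvos=5, sweeps=50, samples=20):
    theta = np.zeros(5)                       # weights of the linear approximation
    for _ in range(sweeps):
        X, y = [], []
        for _ in range(200):                  # sample states instead of full enumeration
            inv = rng.integers(0, max_inv + 1)
            salvos = rng.integers(1, n_salvos + 1)
            best = -np.inf
            for shots in range(MAX_SHOTS + 1):
                q = 0.0
                for _ in range(samples):      # Monte Carlo estimate of the action value
                    ninv, nsal, r = sample_transition(inv, salvos, shots)
                    v_next = 0.0 if nsal == 0 else features(ninv, nsal) @ theta
                    q += (r + v_next) / samples
                best = max(best, q)
            X.append(features(inv, salvos))
            y.append(best)
        theta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
    return theta

theta = approx_value_iteration()
print("fitted value-function weights:", np.round(theta, 2))
```

A greedy policy with respect to the fitted value function then gives an approximate fire control rule for any inventory/salvo state, which is the general flavor of the ADP approach the abstract describes.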

Subject Categories:

  • Operations Research
  • Antimissile Defense Systems

Distribution Statement:

APPROVED FOR PUBLIC RELEASE