Airpower: The Ethical Consequences of Autonomous Military Aviation

Descriptive Note:

Technical Report

Corporate Author:

Naval War College, Newport, United States


Abstract:

Simultaneous advances in remotely piloted aircraft and artificial intelligence (AI) are converging on a likely new military capability: fully autonomous flying weapons, capable of selecting and engaging targets without direct human control. This may bring many advantages: such systems could outperform human pilots, making faster decisions and fewer errors. They might also distinguish enemies from bystanders more effectively than current weapons, reducing collateral damage and civilian casualties. If these capabilities become reality, a nation might be considered irresponsible if it failed to implement them. Yet alongside the utilitarian benefits, potential disadvantages demand study. AI decision-making cannot always be fully explained: who could be held responsible for "acts of code" when AI-powered weapons make bad or indecipherable decisions? Might the mere existence of AI weaponry affect decision-makers' calculus, lowering the threshold for war? Would the humans who wield AI weapons develop an unhealthy relationship with violence? Could nations employing these weapons, in an effort to lower the cost of war in military lives, raise the cost in civilian lives when enemies resort to terrorism? Consequentialism, deontology, and virtue ethics, along with Just War theory, provide philosophical grounding for considering these questions. Ultimately, and inescapably, war is a human endeavor, and those who wage it must preserve the appropriate level of human involvement. Above all, the principles of Just War must continue to supersede any technological considerations in warfare.

Subject Categories:

  • Pilotless Aircraft
