Lawrence Lewis
Download full report

Significant advances in artificial intelligence (AI) over the past decade have changed our way of life, and the impacts of AI are only expected to accelerate. At the same time, the idea of adapting AI, and the related attribute of autonomy, to military applications has created considerable controversy. There are strong concerns about these technologies, including speculation that they could lead to the end of the world. Two important questions follow: how do the actual risks of weaponizing this technology compare to those commonly discussed, and are states and the international community effectively managing those risks?

This report examines commonly held concerns about AI and autonomy in war, as reported in the media or voiced in international venues. We find that the overall premises for these concerns are either out of step with the current state of the technology or fail to consider how military systems are actually used: as part of a larger process for delivering the use of force. These concerns are not spurious—they can prompt much-needed debate and discussion regarding the ethical issues of this emerging technology. However, the real risk posed by these common concerns in a military context, measured in operational outcomes such as civilian casualties and fratricide, is low.

We also examine factors related to the operational use of AI and autonomy. We identify factors associated with the current and near-future state of the technology that could introduce operational risk if not mitigated, and we identify ways to mitigate them. These factors can be blind spots for militaries, which may tend to focus on developing a capability without considering the enablers necessary for the safe and effective use of AI and autonomy. We also present a framework for international and domestic discussions about the primary applicable risks of AI and autonomy in war. Finally, we note that AI and autonomy provide opportunities, not just risks. States should look for opportunities to reduce risk and improve the conduct of war.

  • Publication Date: 8/28/2018