For Immediate Release
Contact: Fiona Gettinger, Communications Associate
Building a Safety Net for Autonomous Weapons
New CNA report recommends comprehensive approach to limit civilian casualties
Arlington, Va. — As nations continue to develop lethal autonomous weapon systems (LAWS), the international community’s main concern is how to mitigate the risk these systems pose to civilians. A growing consensus within the U.N. Convention on Certain Conventional Weapons (CCW) holds that human control over the targeting process is the solution to the risks posed by LAWS.
A new CNA report by Dr. Larry Lewis, director of the Center for Autonomy and Artificial Intelligence at CNA, argues that this approach is “too narrow.” Instead, the report recommends a shift in focus from process considerations, such as human control at the point of final engagement, to outcome considerations, using a comprehensive approach to limit civilian casualties. Drawing upon CNA’s long history of analyzing military operations, the report examines recent battlefield experiences to yield insights into protecting civilians from LAWS.
The findings show that human judgment in final engagement decisions is fallible; for example, misidentifications accounted for about half of all U.S.-caused civilian casualties in Afghanistan.
“We have seen how the fallibility of human judgment in real-world operations has placed civilians at risk. Rather than focusing on human control in the final engagement decision, the U.N. should develop a comprehensive safety net woven from existing best practices,” said Lewis. “These could include a longer targeting process that considers pattern of life, collateral damage estimates and effects on civilian infrastructure.”
Lewis also said machine learning could contribute to this process by providing more accurate data and by better anticipating unintended consequences.
To aid the U.N., the report recommends the following set of best practices for evaluating the risks of LAWS:
1. Specific measures to help avoid inadvertent engagements, such as careful attention to possible misidentification of civilians and pattern-of-life determinations
2. Standardized policies to strengthen oversight of procurement and use of LAWS
3. Test and evaluation practices for LAWS that change and adapt with the technology
As Lewis notes, “These steps can more comprehensively address risks posed by autonomous weapons, collectively forming a safety net that represents the application of human judgment over these emerging weapons.”
CNA is a nonprofit research and analysis organization dedicated to developing actionable solutions to complex problems of national importance. With nearly 700 scientists, analysts and professional staff, CNA takes a real-world approach to gathering data. Its one-of-a-kind field program places analysts on carriers and military bases, in squad rooms and classrooms, and working side-by-side with a wide array of government decision-makers around the world. In addition to defense-related matters for the U.S. Department of the Navy, CNA’s research portfolio includes criminal justice, homeland security, energy security, water resources, enterprise systems and data analysis, and education.
Note to writers and editors: CNA is not an acronym and is correctly referenced as "CNA, a research organization in Arlington, VA."