Larry Lewis, Ph.D.
Director, Center for Autonomy and AI
Larry Lewis is the Director of the Center for Autonomy and Artificial Intelligence at CNA. His areas of expertise include lethal autonomy, reducing civilian casualties, identifying lessons from current operations, security assistance, and counterterrorism.
Lewis spent a decade analyzing real world operations as the project lead and primary author for many of DOD's Joint Lessons Learned studies. For example, he was the lead analyst and co-author (with Dr. Sarah Sewall) for the Joint Civilian Casualty Study (JCCS) in support of GEN Petraeus, GEN McChrystal, and ADM Olson (SOCOM); GEN Petraeus described the study as "the first comprehensive assessment of the problem of civilian protection." His other areas of expertise include counterinsurgency and high value targeting in Iraq, Afghanistan, the Philippines, Colombia, and elsewhere. In addition, he authored the 2012 "Lessons from a Decade of War" report for CJCS. He also served as senior advisor to the Department of State's Assistant Secretary for Democracy, Human Rights, and Labor, and was on the U.S. delegation to the CCW regarding Lethal Autonomous Weapon Systems (LAWS).
Lewis has a Ph.D. in Physical Chemistry from Rice University.
RECENT NEWS
February 12, 2019
Larry Lewis says, “It's coming out with a very strong stance saying there are virtuous uses of AI”
Axios: “The Pentagon's Alluring AI Pitch to Silicon Valley”
February 4, 2019
Larry Lewis provides his expert opinion on reducing civilian casualties.
Washington Post: “After Bloody Insurgent Wars, Pentagon Launches Effort to Prevent Civilian Deaths”
Larry Lewis says, “The non-combatant casualty value, often called the NCV, that is kind of a cap on the acceptable number of civilian casualties. That is done in addition to the legal considerations that are made during a strike. In late 2016 that number was increased, so as you say the willingness to take risks to civilians, that risk threshold was increased.”
PBS News Hour: “The U.S. Military Has a Number of Civilian Casualties That Is Deemed Acceptable. Has That Number Changed?”
November 30, 2018
Larry Lewis says, “Destroying 60 buildings in a month means that twice a day, international forces are conducting the riskiest kind of strikes for civilians: structures where there is uncertainty of who may be inside them.”
Just Security: “Uptick in U.S. Air Strikes on Buildings in Afghanistan Raises Questions”
November 6, 2018
Larry Lewis says, “What does human control mean? If it means that humans pull the trigger, that's not always going to have the best outcome because humans make mistakes. For the most humanitarian outcome, you want to leverage both human and machine strengths.”
American University Washington College of Law: “The Fusion of Drones and Artificial Intelligence”
September 20, 2018
Larry Lewis says, “The actions that [the State Department is] talking about in the memo are not the kinds of things that actually help reduce civilian casualties.”
PBS News Hour: “Yemen War’s Civilian Casualties Trigger Questions on Capitol Hill About U.S. Support Role”
CNA Field Program
Command and Control
Drones and Unmanned Vehicles
Autonomy and Artificial Intelligence
Center for Autonomy and Artificial Intelligence
Redefining Human Control: Lessons from the Battlefield for Autonomous Weapons
Insights for the Third Offset: Addressing Challenges of Autonomy and Artificial Intelligence in Military Operations
Decade of War: Applying Past Lessons to the Counter-ISIS Campaign
Summary Report: U.S.-UK Integration in Helmand
Improving Lethal Action Learning and Adapting in U.S. Counterterrorism Operations (U)