News

Latest News

February 12, 2019

Larry Lewis says, “It's coming out with a very strong stance saying there are virtuous uses of AI.”

Axios: “The Pentagon's Alluring AI Pitch to Silicon Valley”

December 31, 2018

Larry Lewis says, “The non-combatant casualty value, often called the NCV, that is kind of a cap on the acceptable number of civilian casualties. That is done in addition to the legal considerations that are made during a strike. In late 2016 that number was increased, so as you say, the willingness to take risks to civilians, that risk threshold was increased.”

PBS NewsHour: “The U.S. Military Has a Number of Civilian Casualties That Is Deemed Acceptable. Has That Number Changed?”

November 6, 2018

Larry Lewis says, “What does human control mean? If it means that humans pull the trigger, that's not always going to have the best outcome because humans make mistakes. For the most humanitarian outcome, you want to leverage both human and machine strengths.”

American University Washington College of Law: “The Fusion of Drones and Artificial Intelligence”

August 31, 2018

Larry Lewis says, “We have not had more progress in the past few years because we have not sufficiently defined the problem. States and other groups are still talking past one another.”

CAAI Blog: “CNA Statement to UN Group of Governmental Experts on Lethal Autonomous Weapon Systems, August 29, 2018”

August 27, 2018

Samuel Bendett writes, “Currently, the Russian military is working on incorporating elements of AI in its electronic warfare, missile, aircraft and unmanned systems technologies, with the aim of making battlefield decision-making and targeting faster and more precise.”

CAAI Blog: “Efforts to Develop AI in the Russian Military”

July 18, 2018

Samuel Bendett writes, “The Russian government has long expressed concern that their reliance on imported IT products creates major security vulnerabilities.”

CAAI Blog: “Russian Kryptonite to Western Hi-Tech Dominance”

June 29, 2018

Larry Lewis says, “Harnessing the strengths of industry and academia [is crucial] and that is explicitly called out. It’s not necessarily going to be easy, but including the discussion about ethics and AI safety [so prominently] is going to be an important piece of that.”

Defense & Aerospace Report: “CNA’s Bendett on Russia’s Hunter UAV, Uran-9 UGV, Submarine and UUV Innovation Outlook”

June 29, 2018

Larry Lewis writes, “Even though many civilians encounter the humanitarian tolls of war, there is no public conversation on how applying artificial intelligence to waging war could help ease its tragedies.”

Breaking Defense: “AI for Good in War; Beyond Google’s ‘Don’t Be Evil’”

May 22, 2018

Larry Lewis says, "When people think about AI and autonomy as it applies to war, I think it's helpful to see what's actually out there as opposed to what we might see in Hollywood."

Salem Radio Seattle: "Live From Seattle, Interview with Dr. Larry Lewis"

May 15, 2018

Larry Lewis writes, "It would be worthwhile to think more deeply about how to use AI to reduce the humanitarian tolls of warfare."

Just Security: "AI-4-Good in Warfare"

April 12, 2018

Larry Lewis says, "You can actually create better humanitarian outcomes with AI."

CNA: "CNA Expert Speaks to U.N. Representatives About Lethal Autonomous Weapons"

April 12, 2018

At the U.N. Convention on Certain Conventional Weapons (CCW), Dr. Larry Lewis warned that human intervention at the "trigger pull" will not eliminate the risks of autonomous weapons.

April 10, 2018

Larry Lewis says, "Rather than focusing on the human control in the final engagement decision, the U.N. should develop a comprehensive safety net woven from existing best practices."

CNA: "Building a Safety Net for Autonomous Weapons"

News Releases

April 12, 2018 – At this week’s meeting of the U.N. Convention on Certain Conventional Weapons (CCW), Dr. Larry Lewis, director of the Center for Autonomy and Artificial Intelligence at CNA, warned that human intervention at the “trigger pull” will not eliminate the risks of autonomous weapons. The convention’s purpose is to “ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.” In recent years, the subject of lethal autonomous weapons systems (LAWS) has been a major focus of the CCW.

April 10, 2018 – As nations continue to develop lethal autonomous weapon systems (LAWS), the international community’s main concern is how to mitigate the risk these systems pose to civilians. A growing consensus in the U.N. Convention on Certain Conventional Weapons (CCW) argues that human control over the targeting process is a solution to the risks posed by LAWS. A new CNA report by Dr. Larry Lewis, director of the Center for Autonomy and Artificial Intelligence at CNA, argues that this approach is “too narrow.” Instead, the report recommends a shift in focus from process considerations, such as human control at the point of final engagement, to outcome considerations, using a comprehensive approach to limit civilian casualties.

October 20, 2017 – The launch of CNA’s Center for Autonomy and Artificial Intelligence (CAAI) will bring CNA’s unique "scientists on the front lines" research techniques to tomorrow’s battlefield. CAAI officials said advancements in AI-related technologies will present challenges and unexpected opportunities, particularly for the Department of Defense (DOD), that CNA has the distinct experience and capabilities to address.