1
Lethal Autonomous Weapons in the International Community
Future Warfare Conference, University of New South Wales, April 8, 2019
Ian MacLeod & Erin Hahn, The Johns Hopkins University Applied Physics Laboratory
APPROVED FOR PUBLIC RELEASE
2
The decision to kill… We want to start with a few points that continually confuse the topic and the debate. Is the machine making a decision, or is it reacting, generating a response to input data in accordance with its prescribed goals? The former is easier to grasp, and sophisticated systems can give the appearance of agency and consciousness, but most AI researchers would agree that this is not what is occurring. It is simply far easier for most people to use the term "decision," however.
3
The decision to kill… The rhetoric surrounding the debate anchors on THE decision to kill, which we maintain is not an accurate way to look at the issue. Modern warfare distributes decision making about combat widely in time, space, echelon, and authority. There is no singular decision, only chains of derivative and conditional decisions.
5
Lethal Decisions in Modern Warfare
Political Discourse, Values, & Authorization → Doctrine, Strategy & Operational Planning → Deliberate Targeting → Selection & Engagement → Terminal Guidance

As part of our work, we have been involved with an independent panel conducting analysis of issues surrounding LAWS for the United Nations Group of Governmental Experts on LAWS. Where do decisions occur in modern warfare? This is the crux of the debate over LAWS. The debate tends to muddle the decision making that does occur, but there is a stark divide in ethical outlooks that helps explain the majority opinions in the debate.

First, we need to break down where decisions occur in modern war. Decisions over life and death are the end result of a long sequence of derivative decisions. Even the tactical steps of "select and engage" within a sortie or mission are bounded by many prior decisions about lawful targets, military necessity, and proportionality. The arrow boxes start large and get progressively smaller to illustrate this derivative nature.

If you accept this concept of decisions in warfare, then you can discuss the major ethical interpretations of those decisions. The first is the deontological camp, which is most closely associated with the human dignity argument. This camp maintains that any decision in this chain that does not abide by the tenets of human dignity, namely that care and consideration were taken in the decision to take a life, is a violation. This leads to the contention that autonomous systems cannot comply with human dignity and thus their use in any of these decisions cannot be allowed. This is the basis of the argument made by groups like the Campaign to Stop Killer Robots and by states that have signed on to their goal of banning LAWS.

The second could be called the consequentialist camp, which views autonomy as a tool, a means to get closer to the desired end state. This camp tends to frame autonomy in terms of optimizing outcomes to increase precision and to extend the control, or rather the intent, of decision makers.

Where is there general agreement? There is general agreement that the application of autonomy to weapon systems should not feature prominently in the left three arrows. Clearly the deontological camp would prefer it not feature in any of the boxes, but the consequentialist camp maintains that autonomy in weapons is meant to be controlled, and that self-tasking or self-initialization is not desirable. There is also general agreement that autonomy in weapon systems is not without risk, and that testing and understanding are essential to maintaining operational, legal, and ethical alignment.
6
Autonomy In Weapon Systems
Political Discourse, Values, & Authorization → Doctrine, Strategy & Operational Planning → Deliberate Targeting → Selection & Engagement → Terminal Guidance

Where do we tend to see autonomy in existing weapon systems and lethal decision making? The greatest application of autonomy occurs within terminal guidance and the engagement portions of weapon systems; think of point defense systems like CIWS, some advanced missiles, and classes of sensor-fused weapons. There are emerging applications of autonomous capability, particularly ML-backed applications, to assist human decision makers in the deliberate targeting process. Beyond that, there are no consequential applications at the upper levels of lethal decision making.
7
Thought Experiment Main Points:
A sequential thought experiment in which variation could be controlled to determine the actual sticking points: an analog weapon, a GPS-enabled weapon, and a GPS-enabled weapon with automatic target recognition (ATR). The point of divide is the addition of ATR. One camp sees it as an extension of intent, a more restrictive addition that decreases risk and ensures intent. The other camp sees it as a loss of control that increases risk and violates the tenets of human dignity.
8
Selection & Engagement
Precision or Decision? Selection & Engagement → Terminal Guidance
Is this a level of precision that meets the intent of the human decisions about targeting, achieving a better outcome? Or is it a machine decision that violates human dignity despite the chain of decisions leading up to it?
9
Way Forward
Ban? Or thoughtful adaptation and adoption?
10
Ian MacLeod, Ian.macleod@jhuapl.edu
Erin Hahn, Erin.hahn@jhuapl.edu
APPROVED FOR PUBLIC RELEASE