The ethics and governance of lethal autonomous weapons systems (LAWS)—robots that can kill without direct human intervention or oversight—are the subject of active international discussions. It is therefore imperative that we critically examine the role and nature of public engagement intended to inform decision makers. The Martens Clause, included in the additional protocols of the Geneva Conventions, makes explicit room for the public to have a say on what is deemed permissible in matters of armed conflict, especially where new technologies are concerned. However, many measures of public opinion, using methods such as surveys and polls, have been designed in ways that are subject to bias. For example, some only consider specific drone use cases rather than general ethical aspects of the technology. This paper surveys studies that have been conducted to gauge public opinion on the use of military drones (autonomous and remotely operated), including the recent international poll conducted by the Open Roboethics initiative. Drawing on evidence from moral psychology, the authors highlight the effects that particular question framings have on measured outcomes, and outline considerations that should be taken into account when designing public opinion measures and determining their applicability to questions of the governance of LAWS. Such considerations can help public engagement efforts live up to the spirit of the Martens Clause.
Military drones have recently emerged as one of the most controversial new military technologies. Unmanned and often weaponised, these robotic systems can be relatively inexpensive, can patrol the skies continuously, and have the potential to do much of the work of traditional manned military aircraft without putting pilots at risk. For these reasons and others, drones have come to occupy a central role in the overall military strategy of those nations that have them.
Currently, military drones, as well as other robotic weapons systems such as ground-based platforms, are remotely operated and are sometimes referred to as Remotely Operated Weapons Systems (ROWS). With ROWS, the decision to use lethal force remains a human decision. However, technology that could allow military drones to make the decision to use lethal force autonomously is under development. That is, military robots could eventually kill without human intervention. The very real prospect of such Lethal Autonomous Weapons Systems (LAWS) raises important ethical questions that have been taken up by the public media, governments, civil society, and the United Nations.
Decisions about whether or not to build or use LAWS are matters of democratic and humanitarian concern. International law underscores the importance of public engagement in such matters. The Martens Clause, included in the additional protocols of the Geneva Conventions, makes explicit room for the public to have a say on what is, and is not, deemed permissible in matters of armed conflict, especially where new technologies are concerned. It reads:
“Recalling that, in cases not covered by the law in force, the human person remains under the protection of the principles of humanity and the dictates of the public conscience.” (Additional Protocol II to the Geneva Conventions)
Though legal scholars often disagree on how best to interpret and implement the Martens Clause, the Clause nonetheless calls upon the public to help shape international laws of armed conflict that have yet to be established. Public engagement is one way to meet the requirements set out in the Clause.
Public opinion polls have been conducted to gauge people’s attitudes towards drones. Most recently, the Open Roboethics initiative conducted one such international public opinion poll in the spirit of the Martens Clause. Prior to that poll, public surveys on the topic were mostly limited to English-speaking, often US-based, perspectives. In addition, most of that polling focused on very specific drone use cases, often asking whether or not people supported the use of drones to fight terrorists.
This paper examines the various drone-related public opinion polls that have been conducted to date and the kinds of framings used in their question design. Drawing on evidence from moral psychology, we critique the applicability of certain question types for supporting the kind of governance objectives intended by the Martens Clause. If we are to understand where the “public conscience” stands on the issue of LAWS, we need to design questions that probe individuals’ opinions about the nature of LAWS rather than about its specific use cases.
This paper sets out some general considerations that policymakers can use when designing or interpreting public opinion polls related to drones. As such, it is intended for audiences working in ethics, law, and policy.
Jason Millar and AJung Moon will present How to Engage the Public on the Ethics and Governance of Lethal Autonomous Weapons on Friday, April 1st at 11:45 AM with discussant Peter Asaro at the University of Miami Newman Alumni Center in Coral Gables, Florida.