6
Armed Military Drones
The Ethics behind Various Degrees of Autonomy
6.1 Focus on Teleoperated and Autonomous Armed Military Robots
Military robot technology has gained momentum. All over the world, military robots are currently being developed, and thousands of military robots are already deployed during military operations. According to Peter Singer, this development forms the new "revolution in military affairs" (Singer, 2009b). The U.S. Future Combat Systems, a program for future weapons and communications systems commissioned by the Pentagon and costing in excess of U.S. $200 billion, has had a major impact. Military robots are a focal point in this program, and it has inspired many industrialized countries to follow suit.
What stands out in many defense policy documents on military robots, such as the Unmanned Systems Roadmap 2009–2034 (U.S. Department of Defense, 2009), is that scant attention is paid to the social and ethical aspects of military robotics. As is often the case with new technological innovations, the emphasis is on promoting the technology itself, not on reflection on possible, unwanted side effects. The developers are not inclined to anticipate any negative aspects, probably from a fear that this would hamper further developments.
The use of military robots and, in particular, armed military robots will, however, entail a large number of ethical issues. Now that military robots are increasingly used in military operations and we seem to be heading for the next military revolution, many scientific publications have appeared that discuss the possible negative consequences and the ethical aspects of military robots (see, e.g., the special issue on armed military robots of the journal Ethics and Information Technology [14(2) (2013)], and the edited book Killing by Remote Control: The Ethics of an Unmanned Military (Strawser, 2013)). These publications show that armed robots present us with very different social and ethical questions compared to unarmed military robots. In order to study the ethical and social consequences of military robots (see Section 6.5), it is important to distinguish between armed and unarmed military robots in this chapter.
6.1.1 Unarmed Military Robots
Unarmed military robots have an added value because they perform dirty, dangerous, and dull tasks, solving operational problems and enabling the effective and efficient performance of tasks. Examples include "around-the-clock" performance, detecting roadside bombs, conducting reconnaissance sorties, guarding military compounds or camps, improving situational awareness, and even carrying out potentially offensive tasks such as raiding a hostile building, where a robot forces open a door and explores who may be hiding inside. In particular, these tasks should contribute to increasing the safety of the military staff. Nowadays, these operational bottlenecks mainly occur in peace and reconstruction missions, when the armed forces deal with insurgents carrying out asymmetric operations, such as the use of roadside bombs and urban warfighting. In urban warfighting, it is difficult to distinguish the fighters from citizens, with the result that civilians are inextricably part of the battle space and, at the same time, the insurgents are shielded from the threats and dangers of war (Johnson, 2013). Robotics technology may offer a possible solution in response to these operations by insurgents, since robots such as unmanned aerial vehicles (UAVs), for example, the Micro Air Vehicle or the Global Hawk (see Box 6.1), can provide military personnel with crucial information at a distance, information that facilitates better decisions that are more consistent with the principles of proportionality and discrimination.
Unarmed military robots make a positive contribution to the completion of soldiers' tasks. Few objections can be made against deploying these robots. A possible objection could be that the privacy of citizens would be affected (see also Chapter 5).
6.1.2 Armed Military Robots
In contrast, armed military robots raise many urgent and significant social and ethical issues (see, e.g., Krishnan, 2009; Sharkey, 2008b). We, therefore, focus on the ethical and social issues that are raised in the deployment and development of armed military robots, especially armed military drones, or unmanned combat aerial vehicles (UCAVs).
BOX 6.1 UNARMED UNMANNED AERIAL VEHICLES
An example of an unarmed UAV is the unmanned reconnaissance helicopter, the Micro Air Vehicle. It is a remote-controlled propeller plane as small as a model airplane, with a weight of about 20 grams (or 0.7 ounce) to a few hundred grams, and equipped with powerful regular or infrared cameras for autonomous observation tasks. The cameras' images are so sharp that people planting parcel bombs or roadside bombs can be detected and monitored, alerting local forces to act. Also, these aircraft can search for targets and communicate the position for conventional bombing. At the end of 2001, the United States deployed about 10 unmanned reconnaissance aircraft in Afghanistan, but by 2008 this number had already grown to more than 7000 (Singer, 2009b). In addition to these small aircraft, there is the reconnaissance Global Hawk, with a wingspan of nearly 40 meters (or 130 feet) and a maximum altitude of almost 20 kilometers (or 13 miles). The Global Hawk can survey large geographic areas with pinpoint accuracy, giving military decision makers real-time information regarding enemy locations, resources, and personnel.*
* http://www.northropgrumman.com/Capabilities/RQ4Block20GlobalHawk/Documents/HALE_Factsheet.pdf.
The United States expects a great deal from UCAVs in the coming years, considering the huge investment that has been made in this type of robot compared to other armed robots.* Armed military robots raise three broad questions:
1. Are armed military robots in breach of the humanitarian laws of war? (Section 6.3)
2. Who can be held responsible when an autonomous armed
robot is involved in an act of violence? (Section 6.4)
3. Does the proliferation of armed military robots cause irre-
sponsible risks? (Section 6.5)
The answers to these questions often depend on the degree of the armed military robot's autonomy. In military robotics, the degree of autonomy of unmanned vehicles is gradually being extended: from the limited autonomy of teleoperation to fully independent or autonomously functioning systems and, in the future, even to self-learning systems (see Section 6.2). Sharkey (2010) phrases this shift as going "from in-the-loop to on-the-loop to out-of-the-loop." We distinguish three types of military robots with regard to their degree of autonomy: tele-guided, autonomous, and self-learning. This chapter focuses on teleoperated and autonomous armed military robots, and especially on drones. The debate over the deployment and development of drones is a very timely one, since drones are linked to existing questions about the appropriateness, legitimacy, and potential illegality of drone strikes. Furthermore, the United States is the major player in the development of military robots, and more than 90% of its budget for the development of military robots for the coming five years (2014–2018) has been made available for the development of drones. For the United States, drones have become one of the key elements of military operations abroad, since drones deliver precision strikes without the need for more intrusive military action. Finally, the question about the level of autonomy of military robots is most relevant for drones, since drones suffer the least from the frame problem: they fly, and the sky is not a very complex environment, in contrast to, for example, a mountainous environment for ground robots (see Section 1.3.4). As a consequence, drones could become fully independent or autonomously functioning systems in the near future.
Most military robots are still remotely controlled by human operators and thus are tele-guided systems. Based on input from sensors, these robots are programmed to perform or not to perform certain tasks, with an operator controlling activities to varying degrees. Nonautonomous robots require humans to authorize any decision to use lethal force; that is, they require a "man-in-the-loop."

* http://robotenomics.com/2014/01/07/us-military-to-spend-23-9-billion-on-drones-and-unmanned-systems/.
When an armed military robot performs tasks and makes decisions about whether to destroy military targets completely independently, that is, without human intervention, we speak of autonomous systems. These autonomous systems have explicit task programming and act according to a fixed algorithm. This means that the acts of the autonomous military robot are predictable and can be traced afterward.
Learning military robots, based on neural networks, genetic algorithms, and agent architectures, are able to decide on a course of action and to act without human intervention. The rules by which they act are not fixed during the production process, but can be changed during the operation of the robot by the robot itself (Matthias, 2004). The problem with these robots is that there will be a class of actions for which no one is capable of predicting the robots' future behavior. These robots would thus become a "black box" for difficult moral decisions, preventing any second-guessing of their decisions. Control is then transferred to the robot itself, but it is nonsensical to hold the robot responsible at that moment, since robots that will be built in the next two decades will not possess anything like intentionality or a real capability for agency. The deployment of armed learning military robots would therefore create a responsibility gap (Matthias, 2004): it would be unjust to hold people responsible for the actions of robots over which they could not have any control.* Although learning armed military robots appear high on the U.S. military agenda (Sharkey, 2008a), the deployment of these robots is, at least under present and near-term research developments, not likely to happen within the next two decades (Arkin, 2009a). We will not discuss this type of military robot in this chapter, because they will not be developed within the coming decades, and statements about these learning robots would be very speculative.
* For a discussion of possible mechanisms and principles for the assignment of moral responsibility to autonomous learning (intelligent) robots, we refer to Hellström (2013) and Sparrow (2007).
Barring some signicant breakthrough in articial intelligence research, situational
awareness cannot be incorporated in software for lethal military robots (Fitzsimonds
& Mahnken, 2007; Gulam, 2006; Kenyon, 2006; Sharkey, 2008a; Sparrow, 2007).