driving behavior. And Turkle’s (2011) fear of the antisocial influence
of social robots is based on her long-term research on the influence of
social media and mobile telephones on the communication between
young people. Turkle states that the youngest generation is much
less empathic than preceding generations, because intimacy can be
avoided and relations through the Internet or devices are much less
binding. e core concern about dehumanization actually forces us to
continually address the question of what it means to be human in a
robot society.
In debating this issue, we should consider the use of robots from
the view of the long-term process of the rationalization of society.
In Section 7.1, it was concluded that by enveloping our daily envi-
ronment in an ICT-friendly infosphere, we are laying the basis for
the application of service and social robots in various social practices.
This enveloping is a socio-technical process, which is about building
technical infrastructures, such as GPS, 4G mobile networks, and data
management systems, and also about social shaping. For example, the
eective use of a robot vacuum cleaner depends on “roombarization
(Sung etal., 2007): the rationalization of the home environment to
make it robot-friendly. As said earlier, rationalization presents all
kinds of opportunities. e concern about dehumanization reminds
us about the risk we are running, as Floridi (2014, p. 150) explains:
“[B]y enveloping the world, our technologies might shape our physi-
cal and conceptual environments and constrain us to adjust to them
because that is the best, or easiest, or indeed sometimes the only, way
to make things work.”
We should try to avoid building robotic systems that push us into
an irritating straitjacket of efficiency. This is certainly a topical subject,
especially because subtle forms of systematism easily escape our
attention (Van’t Hof, Van Est, & Daemen, 2011, pp. 142–143). For
example, facial recognition in digital passports compels every citizen
not to smile when having their passport photograph taken. In this way,
we quietly lost something beautiful: a moment of freedom of expres-
sion. e dehumanization perspective keeps us aware of the potential
cumulative eect of these easily unnoticed ways in which rationalized
robotic systems may constrain us. In particular, the eect of drones
on our privacy and autonomy calls for a timely public discussion.
Living under drones in war zones not only gives rise to anxiety and
psychological trauma among civilian communities, but may also have
a chilling eect on the public’s behavior in peaceful times. Purposely
echoing Bentham and Foucault, Canadian Information and Privacy
Commissioner Ann Cavoukian (2012, p. 1) states that “the increased
use of drones … has the potential to result in widespread deployment
of panoptic structures that may persist invisibly throughout society.”
In a society with both surveillance—Big Brother who is watching us
from above—and sousveillance—Little Brothers who are watching us
from below—people will anticipate that they can be observed
anywhere at any time, and autonomy is lost.
7.7 Governance of Robotics in Society
The discourse on how to enact democratic governance of innovation
properly is currently dominated by the notion of responsible inno-
vation. Responsible innovation aims to achieve ethically sustainable
and, from a societal point of view, acceptable types of innovation
(Von Schomberg, 2011). Achieving responsible innovation requires
the timely involvement of the opinions and capabilities of relevant
societal actors. With respect to the organization of the required inter-
active processes, we will restrict ourselves to two remarks. First of all,
it is important that the introduction of robotics in society is brought
to the attention of policy makers, politicians, and the broader public.
Governments have a responsibility to ensure that the debate on the
application of new robotics is started early. For example, the debate
on the utilization of military robots has clearly begun far too late.
Second, this section starts with a strong plea for user involvement.
The remainder of this section then reviews a number of political and
regulatory issues that need to be tackled in order to introduce robotics
into society in a responsible manner.
7.7.1 Putting Users at the Center
With innovation endeavors, it is important to bridge the gap between
the expert views of (technological) developers and the wishes of
users. erefore, the views, needs, and worries of future users must
be addressed as early as possible in the developmental process. In the
current R&D climate, users are often only consulted at a late stage,
or not at all (Van der Plas, Smits, & Wehrman, 2010). Every subject,
and thus also that of robot systems, is value-laden. It is important
that the various views of stakeholders are taken into consideration
during the design process. This can be done through approaches
known as “constructive technology assessment” (Schot & Rip, 1997)
and “value sensitive design.” With the latter approach, specific
attention is given to the ethical issues and the broad set of values that
play a role in the development and application of technology.
Users must be educated early in the process about the possibilities
and limitations of robotic systems. The technologies change the
various social practices, and this often happens stealthily. This is
a concern in all fields of application. In health care educa-
tion, serious attention must be given to telecare and the consequent
changing role of caregivers. A similar problem can be seen in the
shift fromairplane pilot to airplane operator. In the eld of mobility,
it appears important that drivers become familiar with, for example,
Adaptive Cruise Control. A rst step could be to make driving using
task-supporting systems an obligatory element of passing one’s driv-
ing test.
7.7.2 Political and Regulatory Issues
Worldwide investment in research on, and utilization of, military
robot technology has rapidly accelerated over the last decade. This novel
arms race calls for a broad international debate on the consequences of
using military robotics. An important goal for such a debate is to curb
the proliferation of armed military robots and to keep these machines
out of the hands of fundamentalists and terrorists. A second goal is to
develop common ethical and legal principles for a responsible utiliza-
tion of armed military robots. An essential condition of international
humanitarian law is that at all times someone can be held respon-
sible for an undesirable consequence. It is, therefore, important that
humans always stay “in the loop” and take decisions on life-and-death
issues. is principle is under threat because of the trend of “man-in-
the-loop” heading to “man-on-the-loop” and ultimately to “man-out-
of-the-loop.” erefore, an international ban on autonomous armed
military robots within the foreseeable future is desirable. e start
of a worldwide discussion on that topic was witnessed in May 2014,
when the UN Convention on Certain Conventional Weapons hosted
an informal meeting about autonomous killer robots.
The gradual but persistent robotization of traffic, too, calls
for timely governmental action and the formation of views on its
future. For robot cars, the trade-off between safety and
privacy plays a central role. In several countries, self-driving cars are
being promoted to increase traffic safety. Discussing safety in terms
of such a long-term option sounds like a clever maneuver to evade a
political debate about the use of technical measures that already exist.
A debate should already be taking place on the use and necessity of
introducing a system for automatic speed adaptation. The expected
safety gain, meaning the decrease in the number of traffic casual-
ties, makes a serious political assessment obligatory. As indicated in
Section 7.2, the core ethic at stake here is the principle of unnecessary
risk (cf. Strawser, 2010). In essence, this concerns the political ques-
tion of whether we, as a society, accept that traffic accidents cause
casualties. If, like the Swedish parliament with its Vision Zero, we
choose not to accept this, we thereby choose to make traffic systems
optimally safe by means of the best available (robotic) technologies.
Cooperative driving may begin to play a role in the medium term.
While it will still take years before cooperative driving is safe enough,
it is already high time that governments, industries, knowledge
institutes, and relevant societal organizations begin talking about the
technical and legal aspects that deserve attention because of its potential
effects, such as system safety and standardization, and liability in the
event of a malfunction. Good management of these aspects requires
a lot of time. It is also important to anticipate the gradual introduction
of autonomous cars and to think through different scenarios in order
to get to grips with the potentially radical consequences for public
transportation, car ownership, road usage, urban planning, and so on.
The autonomous car also forces regulators to rethink the relationship
between the car and the driver; in the near future, perhaps the car
itself will be obliged to take a driving test. Finally, autonomous cars
will encounter situations in which they have to make life-and-death
choices. Encoding moral decisions into software is a technological and
ethical challenge, but also a political and regulatory one.
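To give a concrete, if deliberately simplified, sense of what encoding a moral decision into software can look like, consider the following sketch. It is purely illustrative and not taken from any actual vehicle software: the scenario model, the harm estimates, and the parameter bystander_weight are all hypothetical assumptions introduced here to show how a value judgment can end up as an explicit, auditable number in code.

```python
# Purely illustrative sketch: how a moral trade-off could surface as an
# explicit, regulator-auditable parameter in autonomous-driving software.
# All names, weights, and the scenario model are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_harm_occupants: float   # estimated injury risk to car occupants (0..1)
    expected_harm_bystanders: float  # estimated injury risk to other road users (0..1)


def choose_maneuver(options: list[Maneuver], bystander_weight: float = 1.0) -> Maneuver:
    """Pick the option with the lowest weighted expected harm.

    The single number `bystander_weight` encodes a moral and political choice:
    how much harm to bystanders counts relative to harm to the car's occupants.
    """
    def weighted_harm(m: Maneuver) -> float:
        return m.expected_harm_occupants + bystander_weight * m.expected_harm_bystanders

    return min(options, key=weighted_harm)


if __name__ == "__main__":
    scenario = [
        Maneuver("brake hard in lane", expected_harm_occupants=0.3, expected_harm_bystanders=0.2),
        Maneuver("swerve onto sidewalk", expected_harm_occupants=0.1, expected_harm_bystanders=0.6),
    ]
    # With equal weighting the car brakes in its lane; a bystander weight below
    # 0.5 would flip the decision, making the value judgment visible in one line.
    print(choose_maneuver(scenario, bystander_weight=1.0).name)
```

Even in this toy form, the sketch illustrates why the issue is political and regulatory as well as technical: choosing such a weight is a moral decision that arguably should not be left to engineers or manufacturers alone.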
In the eld of home robots, societal issues that are raised by sex
robots deserve attention. is concerns ethical and legal questions
surrounding the possible use of child robots for sex and the question
of whether sex robots could eventually become an acceptable substi-
tute for human prostitution. In various countries, the solicitation or
possession of virtual child pornography is punishable. The legislation
used for that purpose, however, is not sufficient for making sex
with child robots punishable. If politicians wish to forbid this kind
of behavior, lawmakers must develop a legal framework for this. At
the moment, the prostitution industry involves very distressing
abuses, ranging from the trafficking of women to unpaid labor
(slavery). This justifies research on the question of whether sex robots
could be reasonable alternatives to human prostitutes in the future.
The subject of care, especially the expected breakthrough of telecare,
or home automation, calls for the formation of views. How can we
deal with privacy, patient autonomy, and informed consent? A first
challenge is to search for a good balance between privacy and health
and patient safety. Robotics and home automation could be utilized
to increase the autonomy and independence of patients. However, it
is important that such technological support is always assessed from
a patient perspective. In many cases, outside care is experienced as
an infringement of autonomy, but it does help to fight loneliness.
Loneliness is a major social problem, and we must ensure that home
automation does not make it worse.
The demand among law enforcement agencies, citizens, and busi-
nesses to use drones in the city is on the rise. The deployment of public
and civil drones, however, raises various safety, security, and privacy
issues that are in need of regulatory guidelines. Safety is the overrid-
ing concern among European and U.S. policy makers. Fragmented
rules exist to authorize the ad hoc use of drones. What is needed,
however, is a more comprehensive regulatory framework that safely
integrates the wide variety of drones and their uses into the aviation
system. Currently,
the progress in regulating the safe use of drones in both the United
States and Europe is slow. Although basic privacy rules apply to all
drone operators, a coherent way to address some key issues with
regard to privacy does not exist. In particular, the exploding use of
civilian drones is a threat to privacy. To deal with the privacy issue,
it would be a good idea to set up a baseline consumer protection law
that details permissible use of drones in domestic airspace by both law
enforcement agencies and private parties (Schlag, 2013).