7.7.3 Balancing Precaution and Proaction
Policymaking can be guided by both precautionary and proactive attitudes toward risk governance: precautionary policymakers focus their regulatory attention on preventing the worst outcomes, whereas proactive ones seek to promote the best available opportunities (Fuller, 2012). Regulation should aim to balance precaution and proaction.
Robotics is developing within a network of existing national and international legal frameworks. On the one hand, existing regulatory structures may pose unnecessary barriers to the utilization of robots and may hamper innovation. On the other hand, regulation is needed to introduce robotics into society in a responsible manner. Where privacy is concerned, think of the European Data Protection Directive; in the military domain, think of robots that must meet the requirements of international humanitarian law. However, we saw that new robotics often requires the adaptation of existing laws or even entirely new
types of regulatory frameworks. For example, many new robot sys-
tems lack a detailed legal–social system for dealing properly with the
issue of liability. is lack of regulation, too, can hamper innova-
tion. In such an uncertain situation, it is important to create space for
innovation, and at the same time to, step-by-step, build a legal–social
system. Our plea is to have experimental zones in which experiments
with robotics can take place as long as certain rules are met. e
government should take responsibility for various risks. is space
for experiments ought to be linked to the stimulation of the setting
of standards and the development of safety standards. ink of the
Japanese Robot Safety Center, which is working on a certication
system for service robots.
Finally, besides challenging us to make legal frameworks future-proof, robotics also challenges us to reflect on our existing norms and laws, and sometimes obliges us to reconsider what is currently seen as feasible and desirable. This in turn may stimulate debate and lead to the formation of new policy goals and the instruments to meet them. For example, the prospect of a robot car that makes autonomous ethical decisions raises complex questions about which types of ethical-decision profiles should be used, and whether the user should be free to select the car's ethics setting. Lin (2014) thinks that allowing cars to have an adjustable ethics setting is a bad idea. But that is exactly
how we have organized the current situation, in which users are free to buy a heavyweight car with a strong safety bumper that reduces, above all, the damage done to the car's own passengers. Thus, in an interesting manner, the difficult moral questions raised by the prospect of robotic systems may actually stimulate us to reflect on the norms and values underlying our existing socio-technical systems. Making such norms and values visible and debatable and, where deemed necessary, reconsidering them and adjusting current systems accordingly, may be the best way to prepare for the responsible introduction of robotic systems in our society.
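To make the notion of an "ethics setting" concrete, the following sketch shows, in deliberately simplified Python, how such a setting could be modeled as a configuration choice among predefined decision profiles. The profile names, the AutonomousCarConfig class, and the user_adjustable flag are our illustrative assumptions, not part of any existing vehicle software; Lin's (2014) position corresponds to permanently locking user_adjustable to False.

    from dataclasses import dataclass
    from enum import Enum

    class EthicsProfile(Enum):
        """Hypothetical ethical-decision profiles a robot car might offer."""
        MINIMIZE_TOTAL_HARM = "minimize_total_harm"    # utilitarian: fewest injuries overall
        PROTECT_OCCUPANTS = "protect_occupants"        # favor the car's own passengers
        MANUFACTURER_DEFAULT = "manufacturer_default"  # one fixed setting chosen by the maker

    @dataclass
    class AutonomousCarConfig:
        """Illustrative configuration object; not an existing vehicle API."""
        ethics_profile: EthicsProfile = EthicsProfile.MANUFACTURER_DEFAULT
        user_adjustable: bool = False  # Lin (2014) argues this should stay False

        def select_profile(self, requested: EthicsProfile) -> EthicsProfile:
            """Honor the user's requested profile only if adjustment is allowed."""
            if self.user_adjustable:
                self.ethics_profile = requested
            return self.ethics_profile

Seen this way, the regulatory question is less a coding problem than a governance one: who gets to set user_adjustable, and who is liable for the consequences of whichever profile is active.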
7.8 Epilogue
When I lost my faith in people, I put my trust in things
To avoid the disappointment trusting people brings…
I tried to do it all myself then, surrounded by my stuff
All I found were limitations I could not rise above
There are gadgets and contraptions, immaculate machines
There's a program you can download now that will even dream your dreams
It’ll even dream your dreams for a monthly fee
Clear up your complexion, you get a hundred hours free
Possessions cannot save you the way some body can
When I learned to care for others then the boy became a man
John Gorka (2001)
The introduction of robotics into society comes with an enormous human challenge. Making use of its opportunities and dealing with its social, legal, and ethical aspects calls for human wisdom. Trust in our technological capabilities is an important part of this. But let us hope that trust in technology will not get the upper hand, because that would be a perfect formula for creating a technocracy or, more aptly here, a robocracy. Trust in humans and the acceptance of human abilities, but also of human failures, ought to be the driving force behind our actions.
Like no other technology, new robotics is inspired by humans. This technology aims to copy and subsequently improve human
characteristics and capacities. It appears to be a race between machines and us. Bill Joy (2001) is afraid of this race, because he fears that humans will ultimately be worse off. His bleak worst-case scenario describes "a future that does not need us." In such a hyper-rationalized world, the robots and the elite who own them have come to control us or even replace us. Rationalization through robotization, however, should never be a goal in itself. The ultimate goal of robotics should not be to create an autonomous and socially and morally capable machine. That is an engineer's dream, and such a vision should not be the leading principle. Robotics, after all, is not about building the perfect machine, but about supporting the well-being of humans.
Our exploratory study shows that social practices strike a balance between "hard" and "soft" tasks. The police take care of law enforcement and criminal investigation, but also offer support to citizens. A war must be won, but so must the "hearts and minds" of the people. Care is about "taking care" (washing and feeding), but also about "caring for" through a kind word or a good conversation. We enjoy making love, but we especially want to give and receive love. Robotics can play a role in the functional side of social practices, but we must watch out that the focus on technology does not erase the "soft" human side of the practice in question. We should, therefore, take heed of insights from people like Nordholt, former chief superintendent of the Amsterdam police force, who states: "The current thinking in terms of control is strongly motivated by technology" (De Jong & Schuilenburg, 2007). Falling into this trap can easily lead to appalling practices: inhumane care, a repressive police force, a hardening of our sex culture, and war crimes.
Robotics does not exist for itself, but for society. Robotics ought to support humankind, not overshadow it. Our objective should not be to build a high-tech, robot-friendly world, but to create a human-friendly world. This begins with the realization that robotics offers numerous opportunities for improving our lives, and with the insight that how we envelop the world in a robot-friendly environment will determine whether we seize those chances or drift into a dehumanized society, guided solely by a strong belief in efficiency. This implies that sometimes there simply is no place for robots. A robot exists that can play the trumpet very well (see Figure 7.1).
And yet, it would be disgraceful if the daily performance of the "Last Post" in Ypres, Belgium (see Figure 7.2), in memory of the millions of victims of World War I, were to be outsourced to a robot. Furthermore, we must watch out that trust in technology does not lead to technological paternalism. Even if, in the very distant future, there are robots that are better at raising our children than we are, we must still do this ourselves. An important aspect of this is the notion of personal responsibility and the human right to make autonomous decisions and mistakes. Even if robots can carry out some tasks better than humans can, it might still make more sense for humans to carry on doing those tasks, despite doing them less well.

Figure 7.1 A robot that can play the trumpet, developed by Toyota. (Photo courtesy of Rinie van Est.)

Figure 7.2 In Ypres, the "Last Post" is played every day in honor of the liberators of Ypres. (Photo courtesy of Joost van den Broek/Hollandse Hoogte.)
References
Arkin, R. (2009). Governing lethal behavior in autonomous robots. Boca Raton, FL: CRC Press.
Arkin, R., Ulam, P., & Duncan, B. (2009). An ethical governor for constrain-
ing lethal action in an autonomous system (Technical Report No. GIT-
GVU-09-02). Atlanta, GA: Georgia Institute of Technology.
Borenstein, J., & Pearson, Y. (2010). Robot caregivers: Harbingers of expanded
freedom for all? Ethics and Information Technology, 12(3), 277–288.
Broadbent, S., Dewandre, N., Ess, C., Floridi, L., Ganascia, J.-G., Hildebrandt, M., … Verbeek, P.-P. (2013). The onlife manifesto: Being human in a hyperconnected era. Brussels, Belgium: European Commission.
Canning, J. S. (2006). A concept of operations for armed autonomous sys-
tems. Presented at the third annual disruptive technology conference,
September 6–7, Washington, DC. www.dtic.mil/ndia/2006disruptive_
tech/2006disruptive_tech.html (accessed January 23, 2014).
Cavoukian, A. (2012). Privacy and drones: Unmanned aerial vehicles. Toronto,
Ontario, Canada: Information and Privacy Commissioner. http://www.ipc.
on.ca/images/Resources/pbd-drones.pdf (accessed November 16, 2014).
Coeckelbergh, M. (2013). Drones, information technology, and distance:
Mapping the moral epistemology of remote fighting. Ethics and
Information Technology, 15(2), 87–98.
Cummings, M. L. (2006). Automation and accountability in decision support
system interface design. Journal of Technology Studies, 32(1), 23–31.
De Jong, A., & Schuilenburg, M. (2007). Een cultuur van controle: Interview met Eric Nordholt [A culture of control: Interview with Eric Nordholt]. Gonzo (circus), 79, 12–15.
Duffy, B. R. (2006). Fundamental issues in social robotics. International
Review of Information Ethics, 6, 31–36.
European Commission. (2010). Towards a European road safety area: Policy orienta-
tions on road safety 2011–2020. Brussels, Belgium: European Commission.
http://ec.europa.eu/transport/road_safety/pdf/road_safety_citizen/
road_safety_citizen_100924_en.pdf (accessed September 4, 2014).
Flatley, J. L. (2010). Robot safety center opens up in Japan: Crash test dummies still
an unfortunate name for a band. http://www.engadget.com/2010/12/28/
robot-safety-center-opens-up-in-japan-crash-test-dummies-still/
(accessed August 1, 2014).
Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human
reality. Oxford, UK: Oxford University Press.
Fogg, B. J. (2003). Persuasive technology: Using computers to change what we
think and do. San Francisco, CA: Morgan Kaufmann.