Just Ordinary Robots
they are taken to court. In this context, it is not surprising that there
are still so few commercial care robots about today, although there
is clearly a market for them. It is difficult for an individual company, or even a single country, to define acceptable safety regulations. In order to allow the industry to move forward, international discussion and consensus are needed; in February 2014, the International Organization for Standardization released ISO 13482 regarding safety requirements for personal care robots, based on the standardization efforts on robot safety of the European Union–funded project euRobotics.* The eagerly awaited standard gives researchers, manufacturers, and regulators a basis against which to measure and monitor products. ISO 13482:2014† specifies requirements and guidelines for the inherently safe design of, protective measures for, and information about the use of care robots.
The standard describes hazards associated with the use of these robots and sets down requirements to eliminate the risks associated with these hazards or to reduce them to an acceptable level. The scope of the standard is limited primarily to human care–related hazards but, where appropriate, it includes domestic animals and property (defined as safety-related objects) when the personal care robot is properly installed and maintained and used for its intended purpose or under conditions that can reasonably be foreseen.‡
There are, however, some concerns with regard to this standard. For example, Brian Scassellati, Associate Professor of Computer Science at Yale and an expert on socially assistive robotics and human–robot interaction (HRI) who is involved with the Socially Assistive Robotics project§ (a National Science Foundation–funded initiative designed to improve the performance of child-centered care robots), states that the ISO 13482 standard may give the false impression that care robots are ready for the mainstream and for widespread industry development and that this may cause more harm than good. According to Scassellati, we do not have a clear understanding of the basic science behind HRI: “Roboticists and other stakeholders don’t have a clear
* http://www.eurobotics-project.eu.
† https://www.iso.org/obp/ui/#iso:std:iso:13482:ed-1:v1:en.
‡ For a more detailed analysis of ISO 13482, we refer to Jacobs and Gurvinder (2014).
§ http://www.robotshelpingkids.com/index.php.
understanding of the roles that care robots should play, the kind of
support the robots should or should not provide, and the impact that
robot care will have on their users.”*
3.3.2.2 Designing Care Robots Despite the need for care robots for the rapidly aging population and the potential success of some care robots, other care robots have met with a poor response. Broadbent, Stafford, and MacDonald (2009) attribute this poor response to the fact that designers do not properly assess the needs of the human user—the caregiver as well as the care recipient—and then match the robot’s role, appearance, and behavior to these needs. The perceptual worlds of caregivers, care recipients, and designers vary due to differences in background and experience. For designers, the relevant effects involve technical goals; caregivers are mainly interested in effects on workload and quality of care; and care recipients are influenced by usability effects. In general, designers neglect the views
of the elderly people who are the users of the care robots. Frennert
and Östlund (2014) found that in most health care research, older
people are described as objects and that designers do not involve
older people as subjects in their research, and therefore not in the
development of robotic technologies either. In contrast to the stereo-
typical view of older people, evidence indicates that they are far from
passive consumers; instead, they are technogenarians: older individ-
uals who creatively adapt and utilize technological artifacts to suit
their needs (Joyce & Loe, 2010). Designers should take into account
the older user’s capabilities and limitations. As people age, motor behaviors change (coordination, for instance, becomes disrupted); sensory abilities, such as vision and hearing, diminish; aspects of memory decline; and so on. According to Rogers and Mynatt (2003), designers must recognize and accommodate those abilities that do decline while at the same time capitalizing on the abilities that remain intact. A perfect
single design method for every care robot is unlikely to exist, but in
any case, designers should take into consideration the wishes and
needs of caregivers as well as those of care recipients in their design
process. Both of these user groups should be involved as early as
* http://www.roboticsbusinessreview.com/article/new_international_standards_boon_to_personal_care_robotics.
possible in the design process in such a way that the technological
knowledge of the designers and the contextual knowledge of the
users are married in a design (Van der Plas, Smits, & Wehrmann,
2010). The European projects Mobiserv, CompanionAble, KSERA, and HOBBIT have shown how this can be done. In the KSERA project, the designers adopted a user-centered design framework to link the design with the needs and the context of the lives of the people to whom it is addressed (Johnson et al., 2014). This framework is explicitly intended to be a dynamic process in which the end users are involved from the beginning of the project, not as subjects but as active agents who influence decisions, development, and implementation.
To include the ethical aspects in a design process relating to care robots, one must first identify the significant moral values and then describe how to operationalize these values (Van Wynsberghe, 2013). This ethical evaluation ensures that the design and introduction of a care robot do not impede the promotion of moral values and the dignity of caregivers. For this ethical evaluation of care robots, Van Wynsberghe developed a framework, based on a value-sensitive design approach, that incorporates recognition of the specific context of use, the unique needs of users, and the tasks for which the robot will be used, as well as the technical capabilities of the robot. Such ethical evaluation provides guidance for robotic design and development on how to proceed and what to strive for in current and future work (Nylander, Ljungblad, & Villareal, 2012).
3.3.2.3 Physical Appearance Another aspect of designing care robots is physical appearance, since the robot’s appearance influences how people appraise the abilities of the robot and has profound effects on its acceptance (Wu, Fassert, & Rigaud, 2011). For example, participants in the study of Wu et al. (2011) were reluctant to interact with some humanoid robots that have inauthentic expressions and offer ersatz interactions and companionship. Acceptability further depends on the acceptance of and attitudes of others toward the robot (Salvini, Laschi, & Dario, 2010), facilitating conditions, perceived usefulness, perceived ease of use, and perceived enjoyment and trust (Heerink, Kröse, Evers, & Wielinga, 2010).
3.4 Specific Ethical Issues with Regard to the Role of Care Robots
In this section, we will deepen our ethical reflection on the use of health care robots for the elderly by distinguishing between three different roles: (1) the robot as companion for the care recipient; (2) the robot as cognitive assistant for the care recipient; and (3) the robot as (supporter of the) caregiver. Distinguishing these three types of care robots enables us to introduce various relevant ethical concerns in a contextualized manner, that is, in relation to the specific role a care robot is envisioned as playing in a certain social practice. In this way, the five following relevant ethical concerns will be discussed (cf. Vallor, 2011):
1. Deception: The potential for the “relationship” with the care robot to be inherently deceptive or infantilizing.
2. Autonomy: The potential of robots to enlarge or reduce the opportunities, freedom, autonomy, and/or dignity of the care recipients.
3. Dehumanization: The ethical issue of senior citizens being reduced to objects who are basically problematic and whose care problems can be solved with the use of robot technology.
4. Quality of care: The ethical discussion point about the quality of care that can be provided by care robots.
5. Human contact: The potential of robots to increase or reduce the care recipient’s human contact with family and caregivers.
Before we start discussing these ethical issues, it is important to realize that the actual deployment of these types of robots is not to be expected in the short term. Some companion robots are already being commercially produced, but the social interaction of these robots is very limited. This is because even the best robots are no match for the social and cognitive abilities that the average toddler brings to an interaction (Bringsjord, 2008). Breakthroughs in AI are needed in order to make the companion robot a success story (see also Chapter 2). In contrast to the robot as cognitive assistant for the care recipient, the final type of robot we will discuss is the one that is able to provide physical assistance by supporting basic activities such as eating, bathing, going to the toilet, and getting dressed, helping with basic household tasks, and providing assistance with walking. It will take a while
before these robots enter the field of care. As we have seen in Chapter 2, these tasks require very complex decisions, and designers also have to deal with the frame problem, which will probably not be solved within the next 10 years. The robot as cognitive assistant has been developed as part of an intelligent assistive environment for elderly people. It monitors and supervises the activities of daily living and monitors health and safety. A lot of research is being done in relation to these robots, such as the four robot projects mentioned in Box 3.3 in the previous section. Many of the cognitive assistant robots are still in the development and testing phase. Some have been commercially produced, but they turned out not to be successful (Broadbent et al., 2009). The lack of uptake of these care robots is due to a number of factors, such as users losing interest after a certain period of time, users not perceiving any benefits, and the commercial market proving to be a difficult area to break into. Not all care robots can be put in just one of the aforementioned three categories. A robot can, for example, be programmed to perform monitoring tasks and at the same time provide companionship.
3.4.1 The Robot as a Companion
3.4.1.1 Deception The robot as companion technology raises controversial images: lonely elderly people who only have contact with robot animals or humanoid robots. The ethical concerns about the pet robot focus on the degree of human contact that such technology brings about or, instead, on its depriving and deceiving patients with dementia, for example (Borenstein & Pearson, 2010; Coeckelbergh, 2010; Sharkey & Sharkey, 2012; Sparrow & Sparrow, 2006; Turkle, 2006). Sparrow and Sparrow (2006) describe care robots for the elderly as “simulacra” replacing real social interaction (see also Section 2.3). They argue that robots, which can be neglected, paused, or disabled, probably cannot build a meaningful relationship with the user. Characteristic of our relationship with another human being is that the other party has its own needs and wishes, regardless of our own needs and desires. “The demands that our friends … make on us are therefore unpredictable, sometimes unexpected and often inconvenient. This is an essential part of what makes relationships with other people, or animals, interesting, involving and