
54 IEEE Technology and Society Magazine / June 2016 · 1932-4529/16©2016IEEE

Andreea Peca, Mark Coeckelbergh, Ramona Simut, Cristina Costescu, Sebastian Pintea, Daniel David, and Bram Vanderborght


Digital Object Identifier 10.1109/MTS.2016.2554701 Date of publication: 2 June 2016

Measuring Ethical Acceptability

Robot Enhanced Therapy for Children with Autism Disorders

Children with autism spectrum disorders (ASD) have persistent deficits in social communication and social interaction, as well as restricted, repetitive patterns of behavior, interests, or activities [1]. The prevalence of autism is estimated at 1–2 per 1000, and close to 6 per 1000 for ASD [23]. ASD is a lifelong disorder, and many individuals need high levels of support throughout their lives [28]. Even though no cure has been found, early intervention is critical for a positive long-term outcome. The interventions that have received the most empirical support are early behavioral interventions. They usually involve one-on-one training provided by a therapist, in which children are trained to respond to environmental changes, understand and use language, and interact appropriately with others in social settings [8].

In the last two decades, an increasing number of studies have explored how robots can be used in the therapy of children with ASD. The goal is to use robots as a means to affect the social and communicative behavior of children with autism for either assessment or therapeutic purposes [33]. It is assumed that robots may represent an optimal social skill learning environment, because individuals with ASD 1) show strengths in understanding the physical, object-related world and weaknesses in understanding the social world, 2) are more responsive to feedback given by a computer than by a human, and 3) are more interested in treatment involving technology/robots [10]. This new research field is termed socially assistive robotics (SAR). Furthermore, social robots may be designed to reduce the workload of the therapists by monitoring multiple desirable and problematic behaviors and providing quantitative assessments.


Since our belief is that children with ASD, as end-users of social robots, should play an active role in the process of designing social robots, in a previous study we investigated the impact of the physical design of social robots on the perceptions and preferences of children, both typically developing (TD) children and children with autism spectrum disorders (ASD) [24]. A high diversity of preferences for different robots was revealed, but also a high preference for simplified designs with exaggerated facial features. This study provided an innovative instrument for studying children's perceptions of social robots that should be used in future studies investigating robotic designs for children.

Efficacy and Effectiveness Research

Efficacy and effectiveness research on the use of robots in ASD therapy is still in its infancy [10], but some research data indicate increased engagement, increased levels of attention (see Feil-Seifer and Mataric, Kozima et al., Kim et al., Stanton et al., Pioggia et al., and Robins et al. cited in [25], [28], [36]), and novel social behaviors such as joint attention, spontaneous imitation, and turn-taking when robots are part of the interaction (see Diehl et al., Ricks and Colton, Scassellati, Kozima et al., Ferrari et al., Feil-Seifer and Mataric, and Kozima et al. cited in [28], [35]). While some of these behaviors may be attributed to the novelty of the robot, other behaviors, such as triadic interaction, turn-taking with another child, and cooperation with peers (Robins et al., Feil-Seifer and Mataric, François et al., and Wainer et al. cited in [28]), suggest that robots may function, for children with ASD, as facilitators of the interaction with another person, bringing structure to the interaction and lowering the level of confusion and distress. Although RET has shown some success, there is little data about the conditions necessary for the generalization of the acquired skills to interactions with human partners, or about the parameters of these facilitative interactions [28]. Therefore, until rigorous clinical trials are conducted and replicated, the clinical use of robots for ASD should be considered an experimental approach [10].

The development of the SAR domain seems promising, but also raises many questions regarding the ethical implications of using social robots. As a recent Eurobarometer study of public attitudes towards robots shows [12], 66% of the respondents declared they are totally uncomfortable with the idea of having their children or elderly parents cared for by a robot. Only 4% of the EU sample considered that using robots for the care of children, the elderly, and the disabled is a priority. Sixty percent of EU citizens declare that robots should be banned from the care of children, the elderly, and people with disabilities. Portugal (35%), Bulgaria (40%), and Malta (49%) are the only countries where this view is not held by the majority of the sample population. There is also still considerable general opposition to using robots in other "human" areas: 34% of respondents say robots should be banned in education. The percentage is highest in Luxembourg (58%), France (56%), Belgium (51%), and the Netherlands (50%), where the majority of citizens believe that robots should be banned in the education area. This view is least widely expressed in Finland (14%), Slovenia (17%), and Slovakia (19%) [12]. A possible explanation for the rejection of robots in education and in the care of vulnerable populations is that people fear that robots might function as an isolating factor [30], [34] for people who already suffer from isolation, such as elderly people or children with ASD.

Contrary to this view, Shibata et al. [32] have found that the use of a social robot, Paro, with children increases the amount of communication with each other and with caretakers, when used as mediators in triadic interaction.

Another possible explanation may be related to a low level of human trust in robotic systems, which becomes manifest when robots are used outside controlled environments and reaches its highest values when it comes to the issue of taking care of the most vulnerable populations. But what does this concept of trust refer to? Hancock, Billings, Schaefer et al. [17] performed a meta-analysis of factors affecting trust in human-robot interaction. Factors related to robots were parsed into two subcategories: robot performance-based factors (e.g., reliability, false alarm rate, failure rate) and attribute-based factors (e.g., proximity, robot personality, and anthropomorphism). The results suggested that performance factors were more strongly associated with trust development and maintenance. This effect might be due to the fact that robots are still perceived more as tools [12] and less as intentional agents, capable of independent decision making and



possessing their own personality features. "However, we stand on the verge of a great change in robotic capabilities that will justify, given the exponential progress forecasted by Moore's Law, a change in our perception about robots and their individual motive force; at that juncture, the issue of trust in technological systems will be as influential in social development as it is in our own human-human relationships" (Epley et al., Moravec, cited in [17]). The level of trust in social robots would be particularly critical in RET and diagnosis of ASD, since trust would affect the willingness of therapists, teachers, and caregivers to accept robot-produced information and follow the data and indications provided by robots.

On the other hand, Arras and Cerqui [2] performed a large-scale survey in Switzerland investigating questions related to the image of robotics, the potential acceptance of robots in daily life, and appearance preferences. The results indicate a high acceptance rate for robots that share physical and psychological space with humans, named personal robots, with 71% yes-votes. Regarding the use of robots in healthcare, 83% of the respondents declared they would accept a robot to help them partially regain their independence in case they can no longer handle the tasks of daily life (e.g., due to age or disability).

Costescu and David [7] developed a survey in order to investigate the attitudes of children and adults toward using social robots in mental health services. Considering the effectiveness in children's psychotherapy, 54.2% of the parents agreed that social robots could enhance the therapeutic process and also the psycho-diagnosis process. When it comes to using social robots to take care of children, 37.4% of the participants disagree, while 33% neither disagree nor agree, and only 29.2% agree.

These results reveal an ambivalent attitude towards social robots. While social robots are considered beneficial for the effectiveness of the therapy as long as they are used as assistive tools for the therapist, there seems to be a concern regarding the use of social robots in the absence of the therapist.

The difference in results may also be partially due to the target audiences of the surveys. While the Eurobarometer included 26 751 participants from 27 EU countries, a sample representative of all social groups, the survey of Arras and Cerqui was completed by 2000 visitors to the "Robotics" pavilion at the Swiss National Exhibition. Furthermore, the sample of participants who completed the Arras and Cerqui survey overrepresented people with higher levels of education, of younger age, and with more interest in science, features that are associated with a higher acceptance of technology [29]. Therefore, more research should be conducted in order to clarify people's attitudes towards using robots in healthcare, and more specifically, in physical and psychological recovery processes.

Studies Investigating Ethical Issues of Using SAR in ASD

There has been theoretical work on ethical challenges in SAR, including ethical issues raised by working with autistic children [13], [30]. However, further research should investigate the underlying reasons why potential users decide to accept robots [9]. There is therefore a great need to perform ethical appraisal studies about robot acceptance by direct users, caregivers, therapists, and so on. A review of the existing literature reveals surveys about using robots in care, but most of them are about physical therapy [4], [20], [22]. Furthermore, the surveys performed on caregivers of children with autism investigate issues related to robotic design [20], [24], and not ethical issues. To our knowledge, no prior survey has specifically investigated ethical issues related to using RET with children with ASD.

Goal of the Present Study

This article examines the ethical challenges of socially assistive robotics. It investigates whether people believe it is ethically acceptable to use robots as assistive tools for children with ASD; whether they think it is acceptable to use robots to monitor and record child behaviors and facilitate the diagnosis process; whether they consider it ethically acceptable that children form bonding relationships with social robots; and whether they consider it ethically acceptable to design robots with a human or non-human appearance. We examine whether age, gender, education, previous experience with robots, or involvement with persons with autism spectrum disorder influences people's attitudes about the use of robots in RET. Unless these questions are addressed, robots cannot be developed and implemented for RET in an ethically responsible way, and policy makers will have too little information about how robots are perceived by people, and so will not be able to deal with ethical issues related to those perceptions [6].

Methodology

The survey was developed by a multidisciplinary team consisting of psychologists, therapists, engineers, and ethicists, and it was approved by the Ethical Commission of Twente University, The Netherlands. Participants were recruited based on databases of persons involved in past research, and messages were posted on relevant blogs, Facebook, newsletters, and websites of autism organizations from Romania, Belgium, the Netherlands, the United Kingdom, and the U.S.A. There were no


Figure 1. The robots presented in the video: Pleo (Innvo Labs), Romibo (Origami Robotics; Shick, 2013), Nao (Aldebaran Robotics; Gouaillier et al., 2009), Probo (Vrije Universiteit Brussel; Goris et al., 2011), Kaspar (University of Hertfordshire; Dautenhahn et al., 2009), Keepon (NICT; Kozima et al., 2009), and IROMEC (Ferrari, Robins, and Dautenhahn, 2009).

exclusion criteria for participation in this study and no compensation was offered to participants in the study. The time frame for survey completion was 10 months.

Participants were required to indicate their responses on a 5-point Likert-type scale: 1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree, and 5 = strongly agree.
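As an illustration of how such responses are typically coded for analysis, the sketch below maps the five verbal anchors to the numeric codes described above and computes an agreement rate. The function name and sample data are hypothetical, not part of the study.

```python
# Hypothetical sketch: coding 5-point Likert responses numerically and
# computing the share of respondents who agree or strongly agree.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neither agree nor disagree": 3,
    "agree": 4,
    "strongly agree": 5,
}

def agreement_rate(responses):
    """Fraction of responses coded 4 (agree) or 5 (strongly agree)."""
    codes = [LIKERT[r] for r in responses]
    return sum(1 for c in codes if c >= 4) / len(codes)

sample = ["agree", "strongly agree", "neither agree nor disagree", "agree"]
print(agreement_rate(sample))  # 0.75
```

Agreement percentages of this kind are how the per-item figures later in the article (e.g., 86.33% agreement) are typically derived.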

The initial version of the scale was developed in English. The translation into Romanian and Dutch was done by the authors of the paper, native speakers of the two languages.

The survey was completed online by the majority of the respondents. For 143 Romanian participants, the survey was completed in paper-and-pencil format. These participants were either students, or parents or professionals working in centers for autism intervention.

For the online version of the survey, the free and open source online survey application LimeSurvey was used. The application was installed on the VUB web server and was available in three languages: English, Romanian, and Dutch. Since robots exist in different shapes for a wide range of applications, while our survey focuses on social robots, we introduced a one-minute video presenting several social robots, which the respondents were required to view before completing the survey. The video contained short clips of a selection of social robots representing a large variety of physical designs. These varied designs included robots that possess anthropomorphic features [26], humanoid robots [16], robots that are designed as animals [18], robots that have cartoon-like features [15], [19], and robots that are not designed to resemble any biological species [3] (see Fig. 1). No children were shown in the video. Moreover, in the video, a neutral voice narration using lay terminology was used: http://www.youtube.com/watch?v=DSklqn49gD8. (Video used here with creator's permission.)

Participants

A total of 416 subjects agreed to participate in the survey. Data from 22 participants who completed only the demographic items were excluded. All other outputs, even if incomplete, were included in the analysis. The demographic data of the remaining 394 participants indicated that 82.19% were female and 17.81% were male (see Fig. 2(a)). The age range of the participants was between 16 and 64 years (M = 29.86 years, Sd = 11.54). The distribution of the responses indicates a predominance of younger participants in the sample (see Fig. 2(e)). As for education, the data indicate a high proportion of participants (87.52%) with higher education (see Fig. 2(c)). The nationality analysis indicates that 63.96% of the participants are Romanian (M = 24.38, Sd = 7.06), 18.53% are Belgian (M = 40.02, Sd = 12.20), 14.72% are Dutch (M = 39.46, Sd = 10.78), and 2.79% have another nationality (see Fig. 2(b)). Parents of children with an autism spectrum disorder constituted 22.59% of the participants, and 16.75% of the participants were therapists or teachers of children with autism spectrum disorder (see Fig. 2(d)).

Data Analysis

The data were quantitatively analyzed with the Statistical Package for the Social Sciences (IBM SPSS Statistics 20).

Analysis of Components

After the initial elaboration of an extended pool of items, preliminary analysis retained 12 items for data collection. Principal components analysis (PCA) was carried out on the 12 items of the Ethical Acceptability Scale. The analysis, using Varimax rotation, revealed three factors with eigenvalues of 5.33, 1.25, and 1.20, accounting for 64.95% of the total variance. Factor I was termed Ethical Acceptability for Use and accounted for 44.47% of the variance. Factor II was termed Ethical Acceptability of Human-Like Interaction and accounted for 10.44% of the variance. Factor III was termed Ethical Acceptability of Non-Human Appearance and accounted for 10.04% of the variance. The loadings of the items on each of the three factors are presented in Table 1.

Figure 2. Demographical data of the participants. (a) Gender distribution. (b) Nationality distribution. (c) Education distribution. (d) Role distribution. (e) Age distribution.
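The factor-retention logic described above, keeping components whose eigenvalues exceed 1 (the Kaiser criterion), can be sketched as follows. The data here are synthetic and the Varimax rotation step is omitted; this is an illustration of the eigenvalue computation, not a reproduction of the study's analysis.

```python
import numpy as np

# Illustrative sketch with synthetic data (not the study's responses):
# extract eigenvalues of the item correlation matrix and retain factors
# whose eigenvalues exceed 1 (Kaiser criterion).
rng = np.random.default_rng(0)
items = rng.normal(size=(200, 12))           # 200 respondents x 12 items
corr = np.corrcoef(items, rowvar=False)      # 12 x 12 correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending eigenvalues
retained = int((eigvals > 1.0).sum())        # number of factors to keep
explained = eigvals[:retained].sum() / eigvals.sum()
print(retained, round(float(explained), 3))
```

Because the correlation matrix of standardized items has a trace equal to the number of items, the eigenvalues sum to 12, and each eigenvalue divided by 12 gives that component's share of total variance, mirroring the 44.47%, 10.44%, and 10.04% figures reported above.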

Reliability

The reliability analysis using Cronbach's alpha revealed a good internal consistency for all subscales (ethical acceptability for use α = 0.86, ethical acceptability of human-like interaction α = 0.72, ethical acceptability of non-human appearance α = 0.76) and also for the total score (ethical acceptability α = 0.87).
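A minimal sketch of the standard Cronbach's alpha computation behind such reliability estimates, on toy data (the score matrix below is hypothetical, not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                        # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy matrix: 5 respondents x 4 items (hypothetical data)
scores = [
    [5, 4, 5, 4],
    [4, 4, 4, 5],
    [2, 1, 2, 2],
    [3, 3, 2, 3],
    [5, 5, 4, 4],
]
print(round(cronbach_alpha(scores), 2))  # 0.95
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, which is the criterion the subscale alphas above satisfy.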

Descriptive analysis of responses to items on the Ethical Acceptability for Use subscale

The analysis of the distribution of responses to the first two questions, "It is ethically acceptable that social robots are used in therapy for children with autism" (86.33% agreement) (see Fig. 3) and "It is ethically acceptable that social robots are used in healthcare" (84.89% agreement) (see Fig. 4), indicates that a significant majority of the respondents agree with using robots in the health care system and, specifically, in RET for children with ASD. This is good news for the robotics community.

Figure 3. The graphical representation of the responses to the item: It is ethically acceptable that social robots are used in therapy for children with autism.

Figure 4. The graphical representation of the responses to the item: It is ethically acceptable that social robots are used in healthcare.

Table 1. Items and Factor Loadings for the Three Ethical Acceptability Subscale Factors.

Ethical acceptability for use:
1. It is ethically acceptable that social robots are used in therapy for children with autism. (loading 0.800)
2. It is ethically acceptable that social robots are used in healthcare. (loading 0.785)
3. It is ethically acceptable that social robots are used to monitor the progress and help in the diagnosis process of a child with autism. (loading 0.780)
4. It is ethically acceptable that information is recorded and stored by a robot when it interacts with a child with autism. (loading 0.780)
5. It is ethically acceptable that social robots are used in therapy to support the interaction between the therapist and the child with autism. (loading 0.650)

Ethical acceptability of human-like interaction:
6. It is ethically acceptable that, as a result of their therapy, children with autism perceive social robots as friends. (loading 0.845)
7. It is ethically acceptable that children become attached to social robots. (loading 0.825)
8. It is ethically acceptable to make social robots that look like humans. (loading 0.604)
9. It is ethically acceptable to use social robots that replace therapists for teaching skills to children with autism (e.g., imitation, social skills). (loading 0.388)

Ethical acceptability of non-human appearance:
10. It is ethically acceptable to make social robots that look like objects. (loading 0.868)
11. It is ethically acceptable to make social robots that look like imaginary creatures. (loading 0.801)
12. It is ethically acceptable to make social robots that look like animals. (loading 0.579)

The same positive attitude towards using robots in RET for ASD can be observed when it comes to the

issue of using robots for recording and storing information related to the behavior of children with ASD (78.89% agreement) (see Fig. 6), to using robots for monitoring the progress in therapy and providing help in the diagnosis process (83.92% agreement) (see Fig. 5), and to using social robots as mediators of the interaction between therapists and children (84.01% agreement) (see Fig. 7).


Figure 5. The graphical representation of the responses to the item: It is ethically acceptable that social robots are used to monitor the progress and help in the diagnosis process of a child with autism.


Figure 6. The graphical representation of the responses to the item: It is ethically acceptable that information is recorded and stored by a robot when it interacts with a child with autism.


Figure 7. The graphical representation of the responses to the item: It is ethically acceptable that social robots are used in therapy to support the interaction between the therapist and the child with autism.


Figure 8. The graphical representation of the responses to the item: It is ethically acceptable that, as a result of their therapy, children with autism perceive social robots as friends.


Descriptive analysis of responses to items on the Ethical Acceptability of Human-Like Interaction subscale

The distribution of responses to the question, "It is ethically acceptable that, as a result of their therapy, children with autism perceive social robots as friends," was analyzed. The results showed that a significantly higher percentage of the respondents (46.92%) did not see an ethical problem with the children perceiving the robots as friends, compared with only 21.7% of the respondents who did see an ethical problem. A high percentage of the respondents (31.09%) were undecided on this question (see Fig. 8). Furthermore, a higher percentage of respondents (40.98%) declared not having a problem with the idea that children with ASD might develop an emotional bond with a social robot, compared to a lower percentage of respondents who considered that it is not ethically acceptable that children become attached to social robots (26.74%) (see Fig. 9).

Regarding the issue of replacing therapists with social robots for the teaching of specific skills, a significantly higher percentage of respondents (44.19%) opposed this idea, compared to only 27.32% of the respondents who agreed with using social robots to replace therapists for teaching specific skills, and 20.49% of the respondents who were undecided (see Fig. 10). For the question regarding the development of robots with a human appearance, 55.39% of the respondents considered that to be ethically acceptable, while 21.3% of the respondents thought robots should not have a human appearance. Again, a high percentage of respondents (23.32%) were undecided (see Fig. 11).

Descriptive analysis of responses to items on the Ethical Acceptability of Non-Human Appearance subscale

Regarding the non-human appearance of robots, the data indicate that a majority of respondents consider it ethically acceptable for social robots to have a non-human appearance, with 66.67% willing to accept an object-like appearance (see Fig. 12), 63.26% willing to accept an imaginary-creature appearance (see Fig. 13), and 74.85% willing to accept an animal-like appearance (see Fig. 14).

Correlates of ethical acceptability for use

In order to identify the predictors of response for the three subscale factors of the Ethical Acceptability scale, a series of statistical analyses were performed.

Since the survey was mainly completed online, there was little control over the behavior of the respondents, and as a consequence, some participants missed some questions or did not completely finish the survey. Due to the missing data, different numbers of participants are reported for different items of the survey.

Gender

An independent-samples t-test was conducted to compare the Acceptability for Use factor in female and male


Figure 9. The graphical representation of the responses to the item: It is ethically acceptable that children become attached to social robots.


Figure 10. The graphical representation of the responses to the item: It is ethically acceptable to use social robots that replace therapists for teaching skills to children with autism (e.g. imitation, social skills).


participants. The results indicate a significant difference in the scores of male participants (M = 4.30, Sd = 0.59) and female participants (M = 4.08, Sd = 0.72), t (338) = −2.26, p = 0.024, with a small effect size (d = −0.34). These results suggest that gender is related to the Acceptability for Use. Specifically, our results suggest that men manifest a higher level of Acceptability for Use compared to women.
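A sketch of this kind of comparison, computed from the pooled standard deviation. The group sizes and the synthetic scores below are assumptions chosen only to loosely mirror the reported descriptives, not the study's raw data.

```python
import numpy as np

def pooled_sd(a, b):
    """Pooled standard deviation of two independent samples."""
    na, nb = len(a), len(b)
    return np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                   / (na + nb - 2))

def t_and_d(a, b):
    """Independent-samples t statistic and Cohen's d (pooled SD)."""
    sp = pooled_sd(a, b)
    diff = a.mean() - b.mean()
    t = diff / (sp * np.sqrt(1 / len(a) + 1 / len(b)))
    return t, diff / sp

# Synthetic groups (hypothetical sizes and parameters, for illustration)
rng = np.random.default_rng(1)
male = rng.normal(4.30, 0.59, 60)
female = rng.normal(4.08, 0.72, 280)
t, d = t_and_d(male, female)
print(round(float(t), 2), round(float(d), 2))
```

Note that with a pooled standard deviation, d is simply t rescaled by the group sizes, which is why small mean differences on a 5-point scale yield the small effect size (d = −0.34) reported above.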

Age

A Pearson correlation coefficient was computed to assess the relationship between the age of the participants and Acceptability for Use. There was a negative correlation between the two variables, r = −0.191, N = 394, p < 0.001. The results indicate that an increase in age was associated with a decrease in the level of Acceptability for Use.
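The age analysis can be sketched the same way. The synthetic data below are generated with a mild negative slope purely for illustration; the slope and noise level are assumptions, not estimates from the study.

```python
import numpy as np

# Synthetic illustration (not the study's data): Pearson's r between
# age and an acceptability score built with a slight negative trend.
rng = np.random.default_rng(2)
age = rng.uniform(16, 64, 394)                      # ages 16-64, N = 394
score = 4.5 - 0.01 * age + rng.normal(0.0, 0.6, 394)
r = np.corrcoef(age, score)[0, 1]                   # Pearson correlation
print(round(float(r), 3))
```

The resulting r is small and negative, of the same order as the r = −0.191 reported above: older synthetic respondents score somewhat lower on acceptability.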


Figure 11. The graphical representation of the responses to the item: It is ethically acceptable to make social robots that look like humans.


Figure 12. The graphical representation of the responses to the item: It is ethically acceptable to make social robots that look like objects.


Figure 13. The graphical representation of the responses to the item: It is ethically acceptable to make social robots that look like imaginary creatures.

[Bar chart: percentage of responses on the five-point scale from Strongly Disagree to Strongly Agree; plotted values: 2.924%, 7.895%, 14.33%, 50.58%, 24.27%.]

Figure 14. The graphical representation of the responses to the item: It is ethically acceptable to make social robots that look like animals.


Education
A Spearman correlation coefficient was computed to assess the relationship between the education level of the participants and the Acceptability for Use. The correlation between the two variables was not significant, r = −0.002, N = 341, p = 0.967. The results indicate that the education level of the participants and the Acceptability for Use are not associated.
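The Age analyses in this article use Pearson's r for a continuous predictor, while the Education analyses use Spearman's rho for an ordinal one. A small Python illustration with hypothetical values (the ages and scores below are made up for demonstration):

```python
# Pearson vs. Spearman correlation on a small hypothetical sample.
from scipy import stats

age = [22, 25, 31, 38, 44, 52, 60]            # hypothetical ages
score = [4.6, 4.5, 4.2, 4.1, 3.9, 3.8, 3.5]   # hypothetical acceptability scores

r, p_r = stats.pearsonr(age, score)       # linear association (as for Age)
rho, p_s = stats.spearmanr(age, score)    # rank-based association (as for Education)
print(f"Pearson r = {r:.3f}, Spearman rho = {rho:.3f}")
```

Because these hypothetical scores decrease strictly monotonically with age, Spearman's rho is −1 here, while Pearson's r is negative but not exactly −1; Spearman's coefficient depends only on rank order, which makes it the usual choice for ordinal variables such as education level.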

Social role
For the Acceptability for Use, the analysis of variance revealed a significant difference between groups with different social roles, F(2, 394) = 3.42, p = 0.034. Post hoc analysis using the Scheffé criterion for significance indicated that the parent group manifested a lower level of Acceptability for Use (M = 3.95, Sd = 0.76) compared to the others group (M = 4.19, Sd = 0.66); the mean difference MD = −0.24 was significant at p = 0.04, with a small effect size (d = 0.33).

Experience with robots
For the Acceptability for Use, the analysis of variance revealed a significant difference between groups with different levels of experience with robots, F(2, 198) = 8.85, p < 0.001.

Post hoc analysis using the Scheffé criterion for significance indicated that the group with specific experience with robots manifested a higher level of Acceptability for Use (M = 4.40, Sd = 0.50) compared to the group with no experience with robots (M = 3.81, Sd = 0.76); the mean difference MD = 0.59 was significant, p < 0.01, with a large effect size (d = 0.905).
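The group comparisons above pair a one-way ANOVA with Scheffé post hoc contrasts. SciPy provides the omnibus ANOVA but no Scheffé routine, so the pairwise version is sketched from the ANOVA's within-group mean square below; `scheffe_pairwise_p` is our own illustrative helper, and the three groups are made-up data, not the study's scores.

```python
# One-way ANOVA plus a hand-rolled Scheffe pairwise contrast.
import numpy as np
from scipy import stats

def scheffe_pairwise_p(groups, i, j):
    """p-value for the Scheffe contrast between the means of groups i and j."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total sample size
    ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)
    msw = ssw / (n - k)                              # within-group mean square
    diff = groups[i].mean() - groups[j].mean()
    f = diff ** 2 / (msw * (1 / len(groups[i]) + 1 / len(groups[j])))
    # Scheffe refers f/(k-1) to the F(k-1, n-k) distribution
    return stats.f.sf(f / (k - 1), k - 1, n - k)

# three illustrative groups (think: parents, therapists, others)
groups = [np.array([3.0, 4.0, 5.0]),
          np.array([3.5, 4.5, 5.5]),
          np.array([1.0, 2.0, 3.0])]

F, p = stats.f_oneway(*groups)            # omnibus one-way ANOVA
p01 = scheffe_pairwise_p(groups, 0, 1)    # similar means -> large p
p02 = scheffe_pairwise_p(groups, 0, 2)    # distant means -> small p
print(f"F = {F:.2f}, p = {p:.3f}, p01 = {p01:.3f}, p02 = {p02:.3f}")
```

The Scheffé criterion is conservative: it controls the error rate over all possible contrasts, which is why its pairwise p-values are larger than those of an uncorrected t-test on the same means.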

Acceptability of human-like interaction

Gender
An independent-samples t-test was conducted to compare the Acceptability of Human Interaction in female and male participants. The results indicate a significant difference in the scores of male participants (M = 3.43, Sd = 0.87) and female participants (M = 3.12, Sd = 0.77), t(338) = −2.693, p = 0.007, with a small effect size (d = −0.33). These results suggest that gender is related to the Acceptability of Human Interaction: men manifest a higher level of Acceptability of Human Interaction compared to women.

Age
A Pearson correlation coefficient was computed to assess the relationship between the age of the participants and Acceptability of Human Interaction. The correlation between the two variables was not significant, r = −0.036, N = 394, p = 0.510. The results indicate that the age of the participants and the Acceptability of Human Interaction are not associated.

Education
A Spearman correlation coefficient was computed to assess the relationship between the education level of the participants and the Acceptability of Human Interaction. The correlation between the two variables was not significant, r = 0.070, N = 394, p = 0.196. The results indicate that the education level of the participants and the Acceptability of Human Interaction are not associated.

Social role
For the Acceptability of Human Interaction, the analysis of variance indicates that there are no significant differences between groups of parents, therapists, and others on the Acceptability of Human Interaction subscale, F(2, 394) = 2.00, p = 0.13.

Experience with robots
For the Acceptability of Human Interaction, the analysis of variance revealed a significant difference between groups with different levels of experience with robots, F(2, 198) = 5.17, p = 0.006.

Post hoc analysis using the Scheffé criterion for significance indicated that the group with specific experience with robots manifested a higher level of Acceptability of Human Interaction (M = 3.47, Sd = 0.72) compared to the group with no experience with robots (M = 2.98, Sd = 0.86); the mean difference MD = 0.49 was significant, p = 0.007, with a medium effect size (d = 0.61).

Acceptability of non-human appearance

Gender
An independent-samples t-test was conducted to compare the Acceptability of Non-Human Appearance in female and male participants. The results indicate a significant difference in the scores of male participants (M = 4.05, Sd = 0.73) and female participants (M = 3.65, Sd = 0.82), t(338) = −3.37, p = 0.001, with a moderate effect size (d = −0.50). These results suggest that gender is related to the Acceptability of Non-Human Appearance: men manifest a higher level of Acceptability of Non-Human Appearance compared to women.

Age
A Pearson correlation coefficient was computed to assess the relationship between the age of the participants and Acceptability of Non-Human Appearance. The correlation between the two variables was not significant, r = −0.068, N = 395, p = 0.209. The results indicate that the age of the participants and the Acceptability of Non-Human Appearance are not associated.

Education
A Spearman correlation coefficient was computed to assess the relationship between the education level of the participants and the Acceptability of Non-Human Appearance. The correlation between the two variables was not significant, r = 0.062, N = 394, p = 0.252. The results indicate that the education level of the participants and the Acceptability of Non-Human Appearance are not associated.

Social role
For the Acceptability of Non-Human Appearance, the analysis of variance indicates that there are no significant differences between groups of parents, therapists, and others on the Acceptability of Non-Human Appearance subscale, F(2, 394) = 1.38, p = 0.25.

Experience with robots
For the Acceptability of Non-Human Appearance, the analysis of variance revealed a significant difference between groups with different levels of experience with robots, F(2, 198) = 7.66, p = 0.001.

Post hoc analysis using the Scheffé criterion for significance indicated that the group with specific experience with robots manifested a higher level of Acceptability of Non-Human Appearance (M = 4.08, Sd = 0.70) compared to the group with no experience with robots (M = 3.45, Sd = 0.89); the mean difference MD = 0.63 was significant, p = 0.001, with a large effect size (d = 0.779).

Ethical Responsibility in Robotic Development
Overall, the data analysis indicates that the majority of the respondents have a positive view regarding the use of social robots in the RET domain for ASD. Furthermore, the majority of the respondents considered it ethically acceptable to use social robots as assistive tools for therapists, while opposing the use of the robots in the absence of the therapists. A positive attitude regarding the use of social robots for diagnosis, recording and storing information, and monitoring the evolution of the therapeutic process was also revealed.

Regarding the ethical acceptability of human-like interaction between children with ASD and social robots, a significantly higher percentage of the respondents consider it ethically acceptable to develop, and to use in RET, social robots that relate to children in ways similar to the way humans do. However, a high percentage of respondents seem to be undecided, indicating the lack of a defined attitude regarding human-like interaction between children with ASD and social robots. This effect might be due to the novelty of the social robotics domain, and to the fact that robots are still perceived more as tools [12] and less as intentional agents possessing their own personality features. Interestingly, this high percentage of undecided respondents may eventually tip the balance to one side or the other.

Regarding the non-human appearance of robots, the data indicate that a majority of respondents consider it ethically acceptable for social robots to have a non-human (animal, object, or imaginary creature) appearance.

Analysis of the relation between different socio-demographic predictors and ethical acceptability factors reveals a more nuanced image of public attitudes toward social robots. Overall, men manifest a higher level of acceptance of social robots in RET for children with ASD. These data are congruent with those obtained by previous studies [2], [12]. Moreover, the analysis of the relation between age and ethical acceptability suggests that younger adults are more accepting of the use of social robots in therapy for children with ASD, while this tendency decreases in older adults. These data are congruent with those obtained by previous studies [2], [5]. On the other hand, the acceptability of human-like interaction and the acceptability of non-human appearance do not seem to be influenced by age.

The data analysis of the relation between education and ethical acceptability indicates that the level of education is not associated with the acceptability of social robots for ASD on any of the three subscales.

The analysis of the relation between social role and ethical acceptability indicates that parents are open to the use of social robots in therapy for ASD, but are less accepting compared with the group of participants that is not involved with ASD. These data are extremely important and must be carefully taken into consideration by the developers and implementers of socially assistive robotics. Even if the mean difference between the two groups is small, the tendency observed in this study justifies investigating further whether the acceptability for use is influenced by affective involvement with the child with ASD. A difference in the level of affective involvement might also be responsible for the discrepancies found in the Eurobarometer [12] between the general positive view of robots and the strong general opposition of the respondents to having their child or elderly parent taken care of by a robot. This effect might be due to the higher responsibility level experienced by people who are more deeply involved with raising and educating children with ASD. But it might also be related to fears that parents might have about the possible isolating effect of using robots in therapy, and to a lack of trust therapists might have in the capabilities of social robots.

No difference was found between parents, therapists, and others for the other two factors, acceptability of human-like interaction and acceptability of non-human appearance. According to our results, the level of experience with robots has a significant effect on the level of acceptance of using robots in therapy. More specifically, adults with specific experience with robots are more willing to accept them in therapy for ASD, compared to adults who do not have any experience with robots. The same results were found for the acceptability of human-like interaction, and also for the acceptability of non-human appearance. Overall, specific experience with robots increases the acceptance of social robots for ASD. These data are congruent with those obtained in the Eurobarometer [12], where respondents with personal experience with robots also have a higher level of general acceptance of robots than those without any experience.

In conclusion, our data indicate a positive view of social robots and a high level of ethical acceptability on all three subscales in RET of ASD children. The survey shows that socio-demographic factors such as gender, age, previous experience with robots,

and social involvement with children with ASD are important determinants of views about social robots. Men seem to be more positive and have a higher level of ethical acceptability on all the measured dimensions, and younger respondents are more willing to accept the use of social robots in RET for ASD compared to older participants. Regarding the relation between involvement with children with ASD and ethical acceptability, the data show that parents generally agree with using robots in ASD therapy, but their agreement level is lower compared with the group of participants who are not involved directly with ASD children.

New developments in robotics suggest that we stand on the verge of a significant change in robotic capabilities. It is important that these changes occur in an ethical and socially responsible way. We recommend that roboticists who develop new robots for human-robot interaction, including for autism therapy, take into careful consideration people's perceptions and attitudes. We suggest that roboticists systematically use feedback information to minimize psychological barriers in human-robot interaction, and that they develop and implement social robots in an ethically responsible way. This means making sure that all kinds of stakeholders are included in the research, with a view to developing technology and therapeutic practices that are more inclusive and sensitive to social difference and inequality. One limitation of this study is the overrepresentation of young and highly educated participants in the sample; women were also overrepresented in our survey pool. Therefore, our results are limited to our sample characteristics, and future studies will need to investigate these aspects on a larger sample that is more representative of all social categories.

Author Information
A. Peca, C. Costescu, and D. David are with Babes-Bolyai University, Department of Clinical Psychology and Psychotherapy, Cluj-Napoca, Romania. Email: [email protected].

S. Pintea is with the Department of Psychology, and with the Department of Clinical Psychology and Psychotherapy, Babes-Bolyai University, Cluj-Napoca, Romania.

M. Coeckelbergh is with the Centre for Computing and Social Responsibility, De Montfort University, Leicester, U.K.

R. Simut is with Vrije Universiteit Brussel, Clinical and Life Span Psychology Department, Brussels, Belgium.

B. Vanderborght is with Vrije Universiteit Brussel, Robotics and Multibody Mechanics Research Group, Brussels, Belgium.


Acknowledgment
The authors thank the participants who made this study possible by completing the survey. This work has been supported by the CNCSIS-Bucharest, Romania project PN-II-IDPCE-2011-3-0484 - Exploring Robot-Assisted Therapy for children with ASD, and the European FP7 project DREAM (grant no. 611391).

References
[1] American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, 5th ed. Arlington, VA: Amer. Psychiatric Pub., 2014.
[2] K.O. Arras and D. Cerqui, "Do we want to share our lives and bodies with robots? A 2000 people survey," Tech. Rep. 0605-001, Autonomous Systems Lab, Swiss Federal Institute of Technology Lausanne (EPFL), 2005.
[3] T. Bernd, G.J. Gelderblom, S. Vanstipelen, and L.D. De Witte, "Short term effect evaluation of IROMEC involved therapy for children with intellectual disabilities," in Proc. Second Int. Conf. on Social Robotics (ICSR'10), 2010, pp. 259-264.
[4] E. Broadbent, I.H. Kuo, Y.I. Lee, J. Rabindran, N. Kerse et al., "Attitudes and reactions to a healthcare robot," Telemedicine and e-Health, vol. 16, no. 5, pp. 608-613, 2010.
[5] E. Broadbent, R. Stafford, and B. MacDonald, "Acceptance of healthcare robots for the older population: Review and future directions," Int. J. Social Robotics, vol. 1, no. 4, pp. 319-330, 2009.
[6] M. Coeckelbergh et al., "A survey of expectations about the role of social robots in robot-assisted therapy for children with ASD: Ethical acceptability, trust, sociability, appearance, and attachment," Science and Engineering Ethics, accepted for publication, 2015.
[7] C. Costescu and D. David, "Attitudes toward using social robots in psychotherapy," Transylvanian J. Psychology, vol. 15, no. 1, 2014.
[8] G. Dawson and J. Osterling, "Early intervention in autism: Effectiveness and common elements of current approaches," in The Effectiveness of Early Intervention: Second Generation Research, Guralnick, Ed. Baltimore, MD: Brookes, 1997, pp. 307-326.
[9] M. De Graaf and S. Ben Allouch, "Exploring influencing variables for the acceptance of social robots," Robotics and Autonomous Syst., vol. 61, no. 12, pp. 1476-1486, 2013.
[10] J.J. Diehl et al., "Clinical applications of robots in autism spectrum disorder diagnosis and treatment," in Comprehensive Guide to Autism. New York, NY: Springer, 2014, pp. 411-422.
[11] J.J. Diehl et al., "The clinical use of robots for individuals with autism spectrum disorders: A critical review," Res. Autism Spectrum Disorders, vol. 6, no. 1, pp. 249-262, 2012.
[12] European Commission, "Special Eurobarometer 382: Public attitudes towards robots," Rep., 2012; http://ec.europa.eu/public_opinion/archives/ebs/ebs_382_en.pdf, accessed Nov. 14, 2012.
[13] D.J. Feil-Seifer and M.J. Mataric, "Ethical principles for socially assistive robotics," IEEE Robotics and Automation Mag. (Special Issue on Roboethics), vol. 18, no. 1, pp. 24-31, 2011.
[14] E. Ferrari et al., "Therapeutic and educational objectives in robot assisted play for children with autism," presented at the 18th IEEE Int. Workshop on Robot and Human Interactive Communication (RO-MAN 09), Toyama, Japan, Sept. 2009.
[15] K. Goris et al., "Mechanical design of the huggable robot Probo," Int. J. Humanoid Robotics, vol. 8, no. 3, pp. 481-511, 2011.
[16] D. Gouaillier et al., "Mechatronic design of NAO humanoid," in Proc. IEEE Int. Conf. on Robotics and Automation (ICRA), Japan, May 12-17, 2009, pp. 769-774.
[17] P.A. Hancock, "A meta-analysis of factors affecting trust in human-robot interaction," Human Factors: The J. of the Human Factors and Ergonomics Society, vol. 53, p. 517, 2011.
[18] E. Kim et al., "Social robots as embedded reinforcers of social behavior in children with autism," J. Autism and Developmental Disorders, vol. 43, no. 5, pp. 1038-1049, 2013.
[19] H. Kozima et al., Int. J. Social Robotics, vol. 1, no. 1, pp. 3-18, 2009.
[20] J. Lee, H. Takehashi, C. Nagai, G. Obinata, and D. Stefanov, "Which robot features can stimulate better response to children with autism in robot-assisted therapy?," Int. J. Advanced Robotic Systems, vol. 9, p. 72, 2012.
[21] M. Lee, M. Rittenhouse, and H.A. Abdullah, "Design issues for therapeutic robot systems: Results from a survey of physiotherapists," J. Intelligent and Robotic Systems, vol. 42, no. 3, pp. 239-252, 2005.
[22] E.C. Lu, "The development of an upper limb stroke rehabilitation robot: Identification of clinical practices and design requirements through a survey of therapists," Disability and Rehabilitation: Assistive Technology, vol. 6, no. 5, pp. 420-431, 2011.
[23] C.J. Newschaffer, "The epidemiology of autism spectrum disorders," Ann. Rev. Public Health, vol. 28, pp. 235-258, 2007.
[24] A. Peca, R. Simut, S. Pintea, C. Pop, and B. Vanderborght, "How do typically developing children and children with autism perceive different social robots?," Computers in Human Behavior, vol. 41, pp. 268-277, 2014.
[25] C. Pop et al., "Enhancing play skills, engagement and social skills in a play task in ASD children by using robot-based interventions: A pilot study," Interaction Studies, vol. 15, no. 2, pp. 292-320, 2014.
[26] B. Robins, K. Dautenhahn, and P. Dickerson, "From isolation to communication: A case study evaluation of robot assisted play for children with autism with a minimally expressive humanoid robot," in Proc. Second Int. Conf. on Advances in Computer-Human Interactions (ACHI '09), Cancun, Mexico, 2009, pp. 205-211.
[27] B. Robins et al., "Robot-mediated joint attention in children with autism: A case study in robot-human interaction," Interaction Studies, vol. 5, no. 2, pp. 161-198, 2004.
[28] B. Scassellati et al., "Robots for use in autism research," Ann. Rev. Biomedical Engineering, vol. 14, pp. 275-294, 2012.
[29] M. Scopelliti et al., "If I had a robot… Peoples' representation of domestic robots," in Design for a More Inclusive World, S. Keates et al., Eds. London, U.K.: Springer-Verlag, 2004, pp. 257-266.
[30] A. Sharkey and N. Sharkey, "Children, the elderly and interactive robots," IEEE Robotics & Automation Mag., vol. 18, no. 1, pp. 32-38, 2011.
[31] N. Sharkey and A. Sharkey, "The crying shame of robot nannies: An ethical appraisal," Interaction Studies, vol. 11, no. 2, pp. 161-190, 2010.
[32] T. Shibata et al., "Mental commit robot and its application to therapy of children," in Proc. IEEE/ASME Int. Conf. on AIM'01, pap. 182 (CD-ROM), July 2001.
[33] K.M. Tsui et al., "Survey of domain-specific performance measures in assistive robotic technology," in Proc. 8th Workshop on Performance Metrics for Intelligent Systems, 2008, pp. 116-123.
[34] S. Turkle, "Relational artifacts/children/elders: The complexities of cybercompanions," in Proc. Toward Social Mechanisms of Android Science: COGSCI 2005 Workshop, Stresa, Italy, July 2005, p. 6273.
[35] A. Tapus, "Children with autism social engagement in interaction with Nao, an imitative robot – A series of single case experiments," Interaction Studies, vol. 13, no. 3, pp. 315-347, 2012.
[36] B. Vanderborght, "Using the social robot Probo as social story telling agent for children with ASD," Interaction Studies, vol. 13, no. 3, pp. 348-372, 2012.