
Paladyn, Journal of Behavioral Robotics

Editor-in-Chief: Schöner, Gregor

1 Issue per year

Open Access
Online ISSN: 2081-4836

How to Build a Supervised Autonomous System for Robot-Enhanced Therapy for Children with Autism Spectrum Disorder

Pablo G. Esteban (corresponding author)
  • Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
/ Paul Baxter
  • Centre for Robotics and Neural Systems, Plymouth University, Plymouth, United Kingdom
/ Tony Belpaeme
  • Centre for Robotics and Neural Systems, Plymouth University, Plymouth, United Kingdom
/ Erik Billing / Haibin Cai / Hoang-Long Cao
  • Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
/ Mark Coeckelbergh
  • Centre for Computing and Social Responsibility, Faculty of Technology, De Montfort University, Leicester, United Kingdom
/ Cristina Costescu
  • Department of Clinical Psychology and Psychotherapy, Babeş-Bolyai University, Cluj-Napoca, Romania
/ Daniel David
  • Department of Clinical Psychology and Psychotherapy, Babeş-Bolyai University, Cluj-Napoca, Romania
/ Albert De Beir
  • Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
/ Yinfeng Fang
  • School of Computing, University of Portsmouth, Portsmouth, United Kingdom
/ Zhaojie Ju
  • School of Computing, University of Portsmouth, Portsmouth, United Kingdom
/ James Kennedy
  • Centre for Robotics and Neural Systems, Plymouth University, Plymouth, United Kingdom
/ Honghai Liu
  • School of Computing, University of Portsmouth, Portsmouth, United Kingdom
/ Alexandre Mazel / Amit Pandey / Kathleen Richardson
  • Centre for Computing and Social Responsibility, Faculty of Technology, De Montfort University, Leicester, United Kingdom
/ Emmanuel Senft
  • Centre for Robotics and Neural Systems, Plymouth University, Plymouth, United Kingdom
/ Serge Thill / Greet Van de Perre
  • Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
/ Bram Vanderborght
  • Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
/ David Vernon / Hui Yu
  • School of Computing, University of Portsmouth, Portsmouth, United Kingdom
/ Tom Ziemke
Published Online: 2017-05-11 | DOI: https://doi.org/10.1515/pjbr-2017-0002

Abstract

Robot-Assisted Therapy (RAT) has been used successfully to improve social skills in children with autism spectrum disorder (ASD) through remote control of the robot in so-called Wizard of Oz (WoZ) paradigms. However, there is a need to increase the autonomy of the robot, both to lighten the burden on human therapists (who must remain in control and, importantly, supervise the robot) and to provide a consistent therapeutic experience. This paper seeks to provide insight into increasing the autonomy level of social robots in therapy so as to move beyond WoZ. With the final aim of improved human-human social interaction for the children, this multidisciplinary research seeks to facilitate the use of social robots as tools in clinical settings by addressing the challenge of increasing robot autonomy. We introduce the clinical framework in which the developments are tested, alongside initial data obtained from patients in a first phase of the project using a WoZ set-up that mimics the targeted supervised-autonomy behaviour. We further describe the implemented system architecture, which is capable of providing the robot with supervised autonomy.
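The supervised-autonomy idea summarised in the abstract can be pictured as an action-selection loop in which the robot proposes the next therapeutic action and a human supervisor may approve or override it before execution. The sketch below is purely illustrative and is not the paper's actual architecture; all function names, states, and actions are hypothetical.

```python
# Illustrative sketch of a supervised-autonomy loop (hypothetical names,
# not the system described in the paper).

def propose_action(state):
    # Placeholder autonomous policy: map the current interaction state
    # to a scripted robot action.
    policy = {"greet": "wave", "imitation": "raise_arm"}
    return policy.get(state, "idle")

def supervise(proposed, override=None):
    # The supervisor passively accepts the robot's proposal; supplying
    # an override replaces the proposal before it is executed.
    return override if override is not None else proposed

def step(state, override=None):
    # One interaction step: propose, then pass through supervision.
    return supervise(propose_action(state), override)

print(step("greet"))                    # approved proposal: wave
print(step("imitation", "point_left"))  # supervisor override: point_left
```

The point of the design is that the default path (no override) lets the robot act autonomously, while the therapist retains a veto at every step, which is what distinguishes supervised autonomy from full WoZ teleoperation.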

Keywords: Robot-Enhanced Therapy; Autism Spectrum Disorders; Supervised Autonomy; Multi-sensory Data; Cognitive Controller


About the article

Received: 2016-08-05

Accepted: 2017-04-09

Published Online: 2017-05-11

Published in Print: 2017-04-25


Citation Information: Paladyn, Journal of Behavioral Robotics, ISSN (Online) 2081-4836, DOI: https://doi.org/10.1515/pjbr-2017-0002.


© by Pablo G. Esteban. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).

Citing Articles

Here you can find all Crossref-listed publications in which this article is cited.

[1]
Xiongyi Liu, Qing Wu, Wenbing Zhao, and Xiong Luo
Applied Sciences, 2017, Volume 7, Number 10, Page 1051
