BY-NC-ND 4.0 license | Open Access | Published by De Gruyter, May 11, 2017

How to Build a Supervised Autonomous System for Robot-Enhanced Therapy for Children with Autism Spectrum Disorder

  • Pablo G. Esteban, Paul Baxter, Tony Belpaeme, Erik Billing, Haibin Cai, Hoang-Long Cao, Mark Coeckelbergh, Cristina Costescu, Daniel David, Albert De Beir, Yinfeng Fang, Zhaojie Ju, James Kennedy, Honghai Liu, Alexandre Mazel, Amit Pandey, Kathleen Richardson, Emmanuel Senft, Serge Thill, Greet Van de Perre, Bram Vanderborght, David Vernon, Hui Yu and Tom Ziemke


Robot-Assisted Therapy (RAT) has successfully been used to improve social skills in children with autism spectrum disorders (ASD) through remote control of the robot in so-called Wizard of Oz (WoZ) paradigms. However, there is a need to increase the autonomy of the robot, both to lighten the burden on human therapists (who must remain in control and, importantly, supervise the robot) and to provide a consistent therapeutic experience. This paper seeks to provide insight into increasing the autonomy level of social robots in therapy, moving beyond WoZ. With the final aim of improved human-human social interaction for the children, this multidisciplinary research seeks to facilitate the use of social robots as tools in clinical situations by addressing the challenge of increasing robot autonomy. We introduce the clinical framework in which the developments are tested, alongside initial data obtained from patients in a first phase of the project using a WoZ set-up mimicking the targeted supervised-autonomy behaviour. We further describe the implemented system architecture capable of providing the robot with supervised autonomy.
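The supervised-autonomy idea described above — the robot proposes its own therapeutic actions, but a human therapist retains a veto and can override before anything is executed — can be illustrated with a minimal sketch. All class and action names below are illustrative assumptions, not drawn from the paper's implementation:

```python
class SupervisedAutonomyController:
    """Minimal sketch of a supervised-autonomy loop: the robot proposes
    an action, and a human supervisor approves, overrides, or rejects it
    before execution. Names and the weight-update rule are illustrative."""

    def __init__(self, actions):
        self.actions = list(actions)
        # Simple per-action preference weights, nudged by supervisor feedback.
        self.preferences = {a: 1.0 for a in self.actions}

    def propose(self):
        # Suggest the currently highest-weighted action (ties -> list order).
        return max(self.actions, key=self.preferences.get)

    def step(self, supervisor_decision):
        """supervisor_decision is ('approve', None), ('override', action),
        or ('reject', None). Returns (proposal, executed_action)."""
        proposal = self.propose()
        verdict, replacement = supervisor_decision
        if verdict == 'approve':
            executed = proposal
            self.preferences[proposal] += 0.1   # reinforce accepted suggestion
        elif verdict == 'override':
            executed = replacement
            self.preferences[proposal] -= 0.1   # penalise rejected suggestion
            self.preferences[replacement] += 0.1
        else:  # 'reject': nothing is executed this turn
            executed = None
            self.preferences[proposal] -= 0.1
        return proposal, executed
```

Under this sketch, repeated approvals and overrides gradually shift the robot's suggestions toward the therapist's choices, so the supervisor's workload can decrease over sessions without ever removing the human from the loop.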



Received: 2016-08-05
Accepted: 2017-04-09
Published Online: 2017-05-11
Published in Print: 2017-04-25

© 2017 Pablo G. Esteban et al

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
