How to Build a Supervised Autonomous System for Robot-Enhanced Therapy for Children with Autism Spectrum Disorder

Pablo G. Esteban 1 , Paul Baxter 2 , Tony Belpaeme 2 , Erik Billing 3 , Haibin Cai 4 , Hoang-Long Cao 5 , Mark Coeckelbergh 6 , Cristina Costescu 7 , Daniel David 7 , Albert De Beir 5 , Yinfeng Fang 4 , Zhaojie Ju 4 , James Kennedy 2 , Honghai Liu 4 , Alexandre Mazel 8 , Amit Pandey 8 , Kathleen Richardson 6 , Emmanuel Senft 2 , Serge Thill 3 , Greet Van de Perre 5 , Bram Vanderborght 5 , David Vernon 3 , Hui Yu 4  and Tom Ziemke 3
  • 1 Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
  • 2 Centre for Robotics and Neural Systems, Plymouth University, Plymouth, United Kingdom
  • 3 Interaction Lab School of Informatics, University of Skövde, Skövde, Sweden
  • 4 School of Computing, University of Portsmouth, Portsmouth, United Kingdom
  • 5 Robotics and Multibody Mechanics Research Group, Agile & Human Centered Production and Robotic Systems Research Priority of Flanders Make, Vrije Universiteit Brussel, Brussels, Belgium
  • 6 Centre for Computing and Social Responsibility, Faculty of Technology, De Montfort University, Leicester, United Kingdom
  • 7 Department of Clinical Psychology and Psychotherapy, Babeş-Bolyai University, Cluj-Napoca, Romania
  • 8 Softbank Robotics Europe, Paris, France

Abstract

Robot-Assisted Therapy (RAT) has successfully been used to improve social skills in children with autism spectrum disorder (ASD) through remote control of the robot in so-called Wizard of Oz (WoZ) paradigms. However, there is a need to increase the autonomy of the robot, both to lighten the burden on human therapists (who must remain in control and, importantly, supervise the robot) and to provide a consistent therapeutic experience. This paper provides insight into increasing the autonomy level of social robots in therapy in order to move beyond WoZ. With the final aim of improving human-human social interaction for the children, this multidisciplinary research seeks to facilitate the use of social robots as tools in clinical situations by addressing the challenge of increasing robot autonomy. We introduce the clinical framework in which the developments are tested, alongside initial data obtained from patients in a first phase of the project using a WoZ set-up that mimics the targeted supervised-autonomy behaviour. We further describe the implemented system architecture, which is capable of providing the robot with supervised autonomy.
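The supervised-autonomy principle summarised above — the robot proposes actions autonomously while the therapist retains the power to approve, substitute, or veto them — can be sketched as a single decision cycle. The sketch below is a minimal illustration under stated assumptions; the class and function names are hypothetical and are not taken from the paper's implementation:

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class ProposedAction:
    name: str          # e.g. "wave" or "point_to_screen"
    confidence: float  # action-selection module's confidence in [0, 1]


def supervised_step(
    propose: Callable[[], ProposedAction],
    supervisor_review: Callable[[ProposedAction], Optional[str]],
    execute: Callable[[str], None],
    autonomy_threshold: float = 0.8,
) -> str:
    """One supervised-autonomy cycle.

    High-confidence proposals run autonomously; low-confidence ones
    are deferred to the human supervisor, who may approve the proposal,
    substitute another action, or cancel it by returning None.
    """
    proposal = propose()
    if proposal.confidence >= autonomy_threshold:
        chosen = proposal.name                   # autonomous execution
    else:
        decision = supervisor_review(proposal)   # therapist in the loop
        if decision is None:
            return "no-op"                       # supervisor vetoed
        chosen = decision
    execute(chosen)
    return chosen


# Example: a low-confidence proposal is overridden by the supervisor.
action = supervised_step(
    propose=lambda: ProposedAction("wave", 0.4),
    supervisor_review=lambda p: "point_to_screen",
    execute=lambda a: None,
)
```

The design choice here mirrors the motivation in the abstract: routing only low-confidence decisions to the human keeps the therapist in control while reducing their moment-to-moment workload.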


OPEN ACCESS

Paladyn. Journal of Behavioral Robotics is a new, peer-reviewed, electronic-only journal that publishes original, high-quality research on topics broadly related to neuronally and psychologically inspired robots and other behaving autonomous systems.
