
Paladyn, Journal of Behavioral Robotics


How does the robot feel? Perception of valence and arousal in emotional body language

Mina Marmpena (corresponding author)
  • Centre for Robotics and Neural Systems, University of Plymouth, UK; SoftBank Robotics Europe, Paris, France

Angelica Lim / Torbjørn S. Dahl
  • Centre for Robotics and Neural Systems, School of Computing, Electronics and Mathematics, University of Plymouth, UK
Published Online: 2018-07-25 | DOI: https://doi.org/10.1515/pjbr-2018-0012

Abstract

Human-robot interaction in social robotics applications could be greatly enhanced by robotic behaviors that incorporate emotional body language. Using as our starting point a set of pre-designed, emotion-conveying animations created by professional animators for the Pepper robot, we explore how humans perceive their affective content, and we increase their usability by annotating them with reliable labels of valence and arousal in a continuous interval space. We conducted an experiment in which 20 participants were presented with the animations and rated them in the two-dimensional affect space. An inter-rater reliability analysis was applied to support the aggregation of the ratings into the final labels. The resulting set of emotional body language animations, annotated with valence and arousal labels, is available and can be useful to other researchers as a ground truth for behavioral experiments on robotic expression of emotion, or for the automatic selection of robotic emotional behaviors with respect to valence and arousal. To further utilize the collected data, we analyzed it with an exploratory approach and present several trends in the human perception of Pepper's emotional body language that may be worth further investigation.

Keywords: social robots; human-robot interaction; dimensional affect; robot emotion expression
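The abstract outlines the annotation procedure: 20 participants rated each animation on continuous valence and arousal scales, an inter-rater reliability analysis supported aggregating those ratings, and the aggregated values became the final labels. As a rough illustration of that kind of pipeline, the sketch below (Python, with simulated data; the number of animations, the rating range, and all variable names are assumptions, not taken from the paper) computes Cronbach's alpha over raters and then averages the ratings into one (valence, arousal) label per animation. The paper's actual analysis may differ, for example in the reliability coefficient used or how the ratings were scaled.

```python
# Minimal, hypothetical sketch (not the authors' code): check inter-rater
# agreement with Cronbach's alpha, then aggregate per-animation valence/arousal
# ratings into labels. Data shapes and variable names are assumptions.
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha treating raters as items.
    ratings: array of shape (n_animations, n_raters) for one affect dimension."""
    n_raters = ratings.shape[1]
    rater_vars = ratings.var(axis=0, ddof=1)       # variance of each rater's scores
    total_var = ratings.sum(axis=1).var(ddof=1)    # variance of the summed scores
    return (n_raters / (n_raters - 1)) * (1.0 - rater_vars.sum() / total_var)

# Simulated stand-in data: ratings in [-1, 1] from 20 participants (as in the
# study) for a hypothetical set of 30 animations.
rng = np.random.default_rng(42)
valence = rng.uniform(-1, 1, size=(30, 20))
arousal = rng.uniform(-1, 1, size=(30, 20))

print(f"alpha(valence) = {cronbach_alpha(valence):.2f}, "
      f"alpha(arousal) = {cronbach_alpha(arousal):.2f}")

# If reliability is acceptable, average across raters to obtain one
# (valence, arousal) label per animation.
labels = np.column_stack([valence.mean(axis=1), arousal.mean(axis=1)])
print("label of first animation (valence, arousal):", labels[0])
```

Averaging across raters is only one aggregation choice; the same layout accommodates intraclass correlation coefficients or median-based labels instead.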


About the article

Received: 2017-11-30

Accepted: 2018-05-29

Published Online: 2018-07-25


Citation Information: Paladyn, Journal of Behavioral Robotics, Volume 9, Issue 1, Pages 168–182, ISSN (Online) 2081-4836, DOI: https://doi.org/10.1515/pjbr-2018-0012.

© Mina Marmpena et al. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).
