Search Results

You are looking at 1–10 of 376 items for the search term "Human-Robot Interaction".

the 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’17), 2017, 225–226
[5] A. Sciutti, M. Mara, V. Tagliasco, G. Sandini, Humanizing human-robot interaction: On the importance of mutual understanding, IEEE Technology and Society Magazine, 2018, 37(1), 22–29
[6] R. Blake, M. Shiffrar, Perception of human motion, Annual Review of Psychology, 2007, 58, 47–73
[7] G. Sandini, A. Sciutti, F. Rea, Movement-based communication for humanoid-human interaction, In: A. Goswami, P. Vadakkepat (Eds.), Humanoid Robotics: A Reference, Springer, Dordrecht

References
[1] N. Otero, S. Knoop, C. L. Nehaniv, D. Syrdal, K. Dautenhahn and R. Dillmann. Distribution and recognition of gestures in human-robot interaction. IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN06), Hatfield, UK, September 6-8, 2006.
[2] B. Gates. A robot in every home. Scientific American, 296(1):58–65, 2007.
[3] K. Dautenhahn, The art of designing socially intelligent agents: Science, fiction, and the human in the loop, Applied Artificial Intelligence, vol. 12, pp. 573–617, 1998.
[4] M. M. Haque, D. Das, T. Onuki, Y. Kobayashi and

heart rate and skin conductance, are important control variables in cooperative human-robot interaction as measures of, e.g., mental stress, which is of particular relevance in work environments, e.g., production, and in long-term cooperation. Physiological responses deliver valuable objective measurement data that can be interpreted in the light of subjective user experience measurements, and vice versa. In this way, unconscious human reactions can also be captured. Considering the human-robot interaction system a closed-loop system in terms of technical coordination as
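To make the closed-loop reading concrete, here is a minimal Python sketch (not taken from the indexed article) in which hypothetical heart-rate and skin-conductance samples are mapped to a rough stress estimate that in turn modulates the robot's commanded speed; the signal names, baselines and scaling factors are illustrative assumptions only.

```python
# Illustrative sketch: physiological signals as control variables in a
# closed-loop human-robot cooperation. All names, baselines and thresholds
# are hypothetical placeholders, not values from the cited work.

from dataclasses import dataclass


@dataclass
class PhysioSample:
    heart_rate_bpm: float        # e.g. from a chest-strap sensor (hypothetical input)
    skin_conductance_us: float   # electrodermal activity in microsiemens


def estimate_stress(sample: PhysioSample,
                    hr_baseline: float = 70.0,
                    sc_baseline: float = 2.0) -> float:
    """Map raw physiological measurements to a rough stress score in [0, 1].

    Deliberately simplistic normalization; a deployed system would use
    person-specific calibration and validated psychophysiological models.
    """
    hr_term = max(0.0, (sample.heart_rate_bpm - hr_baseline) / 50.0)
    sc_term = max(0.0, (sample.skin_conductance_us - sc_baseline) / 10.0)
    return min(1.0, 0.5 * hr_term + 0.5 * sc_term)


def adapt_robot_speed(stress: float, max_speed_mps: float = 0.5) -> float:
    """Close the loop: reduce the robot's commanded speed as estimated stress rises."""
    return max_speed_mps * (1.0 - stress)


if __name__ == "__main__":
    sample = PhysioSample(heart_rate_bpm=95.0, skin_conductance_us=6.0)
    stress = estimate_stress(sample)
    print(f"stress={stress:.2f}, commanded speed={adapt_robot_speed(stress):.2f} m/s")
```

A real system would calibrate such baselines per person and validate the stress estimate against subjective user experience ratings, as the excerpt above suggests.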

Resolution in Natural Language Understanding, Active Vision, Memory Retrieval, and Robot Reasoning and Actuation, in Proc. of the IEEE International Conference on Systems, Man, and Cybernetics, 1999, pp. 988-993.
[5] H. Yanco and J. Drury, Classifying Human-Robot Interaction: An Updated Taxonomy, in Proc. of the IEEE SMC 2004 International Conference on Systems, Man and Cybernetics, 2004, pp. 2841-2846.
[6] H. Yanco, M. Baker, B. Keyes, and P. Thoren, Analysis of Human-Robot Interaction for Urban Search and Rescue, in Proc. of PERMIS, 2006.
[7] J. Casper and R. R. Murphy

References
[1] M. A. Goodrich, A. C. Schultz, Human-Robot Interaction: A Survey, Foundations and Trends in Human-Computer Interaction, 2007, 1, 203-275, http://dx.doi.org/10.1561/1100000005
[2] T. W. Fong, I. Nourbakhsh, K. Dautenhahn, A survey of socially interactive robots, Robotics and Autonomous Systems, 2003, 42, 143-166
[3] J. G. Trafton, L. M. Hiatt, A. M. Harrison, F. P. Tamborello, S. S. Khemlani, A. C. Schultz, ACT-R/E: An embodied cognitive architecture for human-robot interaction, Journal of Human-Robot Interaction, 2013, 2, 30-55
[4] P. Langley, J. E

References
[1] N. Mitsunaga, T. Miyashita, H. Ishiguro, K. Kogure, and N. Hagita, Robovie-IV: A Communication Robot Interacting with People Daily in an Office, In Proceedings of IROS 2006, pp. 5066-5072.
[2] C. D. Kidd, C. Breazeal, Robots at home: Understanding long-term human-robot interaction, In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 22-26 Sept. 2008, pp. 3230-3235.
[3] J. Sung, H. I. Christensen, and R. E. Grinter, Robots in the Wild: Understanding Long-Term Use, In Proceedings of the 4th ACM/IEEE international

Abstract

A system for recognizing emotions from speech analysis can have interesting applications in human-robot interaction. In this paper, we carry out an exploratory study on the possibility of using a proposed methodology to recognize basic emotions (sadness, surprise, happiness, anger, fear and disgust) from the phonetic and acoustic properties of emotive speech, with minimal use of signal processing algorithms. We set up an experimental test consisting of three types of speakers, namely: (i) five adult European speakers, (ii) five adult Asian (Middle East) speakers and (iii) five adult American speakers. The speakers had to repeat 6 sentences in English (with durations typically between 1 s and 3 s) in order to emphasize rising-falling intonation and pitch movement. Intensity, pitch peak and range, and speech rate were evaluated. The proposed methodology consists of generating and analyzing a graph of formants, pitch and intensity using the open-source PRAAT program. From the experimental results, it was possible to recognize the basic emotions in most of the cases
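The abstract describes inspecting formant, pitch and intensity graphs produced by PRAAT; as an illustration only, the sketch below extracts comparable prosodic features programmatically using the praat-parselmouth Python bindings to Praat. The file name, the voicing handling and the speech-rate proxy are assumptions, not the authors' actual procedure.

```python
# Minimal sketch, assuming the praat-parselmouth package is installed
# (pip install praat-parselmouth). Extracts pitch peak/range, mean intensity
# and a crude speech-rate proxy from a single recording.

import numpy as np
import parselmouth  # Python bindings to Praat


def prosodic_features(wav_path: str) -> dict:
    snd = parselmouth.Sound(wav_path)

    # Pitch contour (F0); Praat reports unvoiced frames as 0 Hz.
    pitch = snd.to_pitch()
    f0 = pitch.selected_array["frequency"]
    voiced = f0[f0 > 0]

    # Intensity contour in dB.
    intensity = snd.to_intensity()

    return {
        "pitch_peak_hz": float(voiced.max()) if voiced.size else 0.0,
        "pitch_range_hz": float(voiced.max() - voiced.min()) if voiced.size else 0.0,
        "mean_intensity_db": float(np.mean(intensity.values)),
        # Crude speech-rate proxy: voiced pitch frames per second of audio.
        "voiced_frames_per_s": float(voiced.size) / snd.duration,
    }


if __name__ == "__main__":
    print(prosodic_features("emotive_sentence.wav"))  # hypothetical recording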

, Washington: Pew Research Center, 2018
[6] A. Steinfeld, et al., Common metrics for human-robot interaction, In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction, ACM, 2006, 33–40
[7] C. Bartneck, D. Kulić, E. Croft, S. Zoghbi, Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots, International Journal of Social Robotics, 2009, 1(1), 71–81
[8] B. Reeves, C. I. Nass, The media equation: How people treat computers, television, and new media like real people and places

Open Access. © 2019 Cathrine Hasse et al., published by De Gruyter. This work is licensed under the Creative Commons Attribution alone 4.0 License. Paladyn, J. Behav. Robot. 2019; 10:180–181
Editorial
Cathrine Hasse*, Stine Trentemøller, and Jessica Sorenson
Special Issue on Ethnography in Human-Robot Interaction Research
https://doi.org/10.1515/pjbr-2019-0015
Received March 20, 2019; accepted March 20, 2019
This special issue builds upon the ideas raised in a workshop¹ on ethnography as an alternative methodology at the 2018 Human-Robot Interaction

/71078. Accessed Dec. 2014.
[4] J. Beer and L. Takayama. Mobile remote presence systems for older adults: Acceptance, benefits, and concerns. In Proc. of the 6th Intl. Conf. on Human-Robot Interaction, pages 19–26. ACM, 2011.
[5] E. Bergman and E. Johnson. Towards accessible human-computer interaction. Advances in Human-Computer Interaction, 5(1), 1995.
[6] R. Bevilacqua, A. Cesta, G. Cortellessa, A. Macchione, A. Orlandini, and L. Tiberio. Telepresence robot at home: A long-term case study. In Ambient Assisted Living: Italian Forum 2013, pages 73–85. Springer International