
Metrology and Measurement Systems

The Journal of Committee on Metrology and Scientific Instrumentation of Polish Academy of Sciences

4 Issues per year

IMPACT FACTOR 2016: 1.598

CiteScore 2016: 1.58

SCImago Journal Rank (SJR) 2016: 0.460
Source Normalized Impact per Paper (SNIP) 2016: 1.228

Open Access
Volume 21, Issue 4


Emotion Monitoring – Verification of Physiological Characteristics Measurement Procedures

Agnieszka Landowska
  • Corresponding author
  • Gdańsk University of Technology, Faculty of Electronics, Telecommunications and Informatics, Software Engineering Department, Narutowicza 11/12, 80-233 Gdańsk, Poland
Published Online: 2014-12-04 | DOI: https://doi.org/10.2478/mms-2014-0049


This paper concerns measurement procedures on an emotion monitoring stand designed for tracking human emotions during Human-Computer Interaction using physiological characteristics. The paper addresses the key problem of physiological measurements being disturbed by motions typical for human-computer interaction, such as keyboard typing or mouse movements. An original experiment is described that aimed at a practical evaluation of the measurement procedures performed at the emotion monitoring stand constructed at GUT. Different sensor locations were considered and evaluated for suitability and measurement precision in Human-Computer Interaction monitoring. Alternative locations (ear lobes and forearms) for skin conductance, blood volume pulse and temperature sensors were proposed and verified. Signals from the alternative locations correlated with those from the traditional locations and showed lower sensitivity to movements such as typing or mouse moving; the alternative locations can therefore be a better choice for monitoring Human-Computer Interaction.

Keywords: affective computing; emotion recognition; physiology; motion artifacts; sensor location
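The verification described in the abstract reduces to two quantitative checks: whether a signal from an alternative sensor location correlates with the signal from the traditional location, and whether it is less disturbed by interaction movements. The sketch below illustrates both checks on synthetic skin-conductance-like data; the function names, sampling rate, and artifact model are illustrative assumptions, not the authors' actual procedure:

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation coefficient between two equally long signals."""
    a = np.asarray(a, dtype=float) - np.mean(a)
    b = np.asarray(b, dtype=float) - np.mean(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def motion_sensitivity(signal, rest_slice, motion_slice):
    """Ratio of signal variability (std) during motion vs. at rest.

    Values close to 1 suggest the location is robust to the motion.
    """
    s = np.asarray(signal, dtype=float)
    return float(s[motion_slice].std() / s[rest_slice].std())

# Synthetic example: a slow 'tonic' skin-conductance trend plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)             # 60 s at an assumed 10 Hz
tonic = 5 + 0.02 * t                    # shared underlying signal (microsiemens)
finger = tonic + 0.05 * rng.standard_normal(t.size)   # traditional location
forearm = tonic + 0.08 * rng.standard_normal(t.size)  # alternative location
# Simulated typing artifact hits the finger sensor only, in the second half.
finger[300:] += 0.5 * np.sin(8 * t[300:])

r = pearson_r(finger, forearm)
ratio_finger = motion_sensitivity(finger, slice(0, 300), slice(300, 600))
ratio_forearm = motion_sensitivity(forearm, slice(0, 300), slice(300, 600))
```

Under this artifact model the forearm signal keeps a near-unit motion-sensitivity ratio while still correlating with the finger signal, which is the pattern the paper reports for its alternative locations.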


About the article

Received: 2014-02-10

Revised: 2014-12-15

Accepted: 2014-07-05

Published Online: 2014-12-04

Published in Print: 2014-12-01

Citation Information: Metrology and Measurement Systems, Volume 21, Issue 4, Pages 719–732, ISSN (Online) 2300-1941, DOI: https://doi.org/10.2478/mms-2014-0049.


© Polish Academy of Sciences. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License (CC BY-NC-ND 3.0).

Citing Articles


Fernando Bevilacqua, Henrik Engström, and Per Backlund, Entertainment Computing, 2017.
R. Piryani, D. Madhavi, and V.K. Singh, Information Processing & Management, 2017, Volume 53, Number 1, Page 122.
