Open Access article published by De Gruyter Open Access on December 13, 2019, under the CC BY 4.0 license

Measuring emotions during learning: lack of coherence between automated facial emotion recognition and emotional experience

Franziska Hirt, Egon Werlen, Ivan Moser and Per Bergamin
From the journal Open Computer Science

Abstract

Measuring emotions non-intrusively via affective computing provides a promising source of information for adaptive learning and intelligent tutoring systems. Using non-intrusive, simultaneous measures of emotions, such systems could steadily adapt to students' emotional states. One drawback, however, is the lack of evidence on how such modern measures of emotions relate to traditional self-reports. The aim of this study was to compare a prominent area of affective computing, facial emotion recognition, to students' self-reports of interest, boredom, and valence. We analyzed different types of aggregation of the simultaneous facial emotion recognition estimates and compared them to self-reports collected after reading a text. Analyses of 103 students revealed no relationship between the aggregated facial emotion recognition estimates of the software FaceReader and self-reports. Irrespective of the type of aggregation of the facial emotion recognition estimates, neither the epistemic emotions (i.e., boredom and interest) nor the estimates of valence predicted the respective self-report measure. We conclude that assumptions about the subjective experience of emotions cannot necessarily be transferred to other emotional components, such as those estimated by affective computing. We advise waiting for more comprehensive evidence on the predictive validity of facial emotion recognition for learning before relying on it in educational practice.
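
To make the aggregation step concrete, the following minimal sketch (in Python, using pandas) shows how frame-level emotion-recognition estimates could be collapsed into per-participant summaries such as the mean, median, or peak before being related to a one-off self-report. The file and column names (facereader_frames.csv, participant, valence, valence_self_report) and the 0.5 cutoff are hypothetical placeholders for illustration; this is neither the authors' analysis code nor the actual FaceReader export format.

import pandas as pd

def aggregate_estimates(frames: pd.DataFrame, emotion: str) -> pd.DataFrame:
    # Collapse each participant's time series of per-frame estimates
    # into several candidate summary statistics.
    grouped = frames.groupby("participant")[emotion]
    return pd.DataFrame({
        "mean": grouped.mean(),        # average intensity over the session
        "median": grouped.median(),    # robust central tendency
        "max": grouped.max(),          # peak expression
        "prop_high": grouped.apply(lambda s: (s > 0.5).mean()),  # share of frames above a cutoff
    })

# Hypothetical frame-level export and self-report file.
frames = pd.read_csv("facereader_frames.csv")
reports = pd.read_csv("self_reports.csv", index_col="participant")

# Relate each aggregate of estimated valence to the self-reported valence item.
aggregates = aggregate_estimates(frames, "valence")
print(aggregates.join(reports["valence_self_report"]).corr(method="spearman"))

Each aggregate embodies a different assumption about which part of the expressive time series should correspond to a retrospective self-report; in this study, none of the tested aggregates predicted the self-report measures.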

Received: 2019-05-29
Accepted: 2019-09-16
Published Online: 2019-12-13

© 2019 Franziska Hirt et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
