i-com: Journal of Interactive Media

Volume 14, Issue 2
Human Capacities for Emotion Recognition and their Implications for Computer Vision

Benny Liebold / René Richter / Michael Teichmann / Fred H. Hamker / Peter Ohler
Published Online: 2015-07-12 | DOI: https://doi.org/10.1515/icom-2015-0032

Abstract

Current models for automated emotion recognition are built on the assumption that emotion expressions are distinct expression patterns for basic emotions. As a result, these approaches fail to account for the emotional processes underlying emotion expressions. We review the literature on human emotion processing and suggest an alternative approach to affective computing. We postulate that the generalizability and robustness of these models can be greatly increased by three major steps: (1) modeling emotional processes as a necessary foundation of emotion recognition; (2) basing models of emotional processes on our knowledge of the human brain; (3) conceptualizing emotions in terms of appraisal processes and thus regarding emotion expressions as expressive behavior linked to these appraisals rather than as fixed neuro-motor patterns. Since modeling emotional processes after neurobiological processes is a long-term effort, we suggest that researchers focus on early appraisals, which evaluate intrinsic stimulus properties with little higher cortical involvement. With this goal in mind, we focus on the amygdala and its neural connectivity pattern as a promising structure for early emotional processing. We derive a model of the amygdala-visual cortex circuit from the current state of neuroscientific research. This model is capable of conditioning visual stimuli with body reactions, enabling rapid emotional processing consistent with the early stages of psychological appraisal theories. Additionally, amygdala activity can feed back to visual areas to modulate attention allocation according to the emotional relevance of a stimulus. The implications of the model for other approaches to automated emotion recognition are discussed.
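
The amygdala-visual cortex circuit described in the abstract reduces to two coupled steps: an association between a visual stimulus and a body reaction is learned through conditioning, and the resulting amygdala activity feeds back to amplify the visual representation of emotionally relevant stimuli. The following Python sketch is not taken from the article; it is a toy, rate-coded illustration of that loop, and all names, parameters, and the delta-rule learning update are illustrative assumptions rather than the authors' implementation.

```python
# Toy sketch (not from the article): a rate-coded loop in which an "amygdala"
# unit learns to associate visual features with a body reaction via a simple
# delta-rule (Rescorla-Wagner-style) update, and feeds its activity back as an
# attentional gain on the visual representation. All names and parameters are
# illustrative assumptions.
import numpy as np

n_features = 8               # size of the toy visual feature vector
w = np.zeros(n_features)     # visual feature -> amygdala association weights
alpha = 0.1                  # learning rate
feedback_gain = 0.5          # strength of amygdala -> visual cortex modulation

def trial(visual_input, body_reaction):
    """One conditioning trial: update the association weights and return the
    amygdala response plus the attention-modulated visual representation."""
    global w
    amygdala = float(w @ visual_input)                      # relevance signal
    w += alpha * (body_reaction - amygdala) * visual_input  # delta-rule update
    # Amygdala activity feeds back and amplifies the visual representation,
    # so emotionally relevant stimuli receive more attention.
    modulated = visual_input * (1.0 + feedback_gain * max(amygdala, 0.0))
    return amygdala, modulated

# Repeatedly pair one stimulus with a strong body reaction (e.g. startle = 1.0).
cs = np.zeros(n_features)
cs[2] = 1.0
for _ in range(20):
    trial(cs, body_reaction=1.0)

relevance, boosted = trial(cs, body_reaction=1.0)
print(f"learned relevance: {relevance:.2f}")            # approaches 1.0 over trials
print(f"attention-boosted feature: {boosted[2]:.2f}")   # amplified vs. input of 1.0
```

In the article's terms, the delta-rule pairing stands in for early appraisal of intrinsic stimulus relevance, and the multiplicative gain stands in for the amygdala-to-visual-cortex feedback that biases attention; a neurobiologically grounded model would replace both components with the circuitry the authors derive.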

Keywords: Emotions in HCI; Emotion Recognition; Neural Networks

About the article

Benny Liebold

Benny Liebold is a researcher at the Institute for Media Research at Chemnitz University of Technology. His research focuses on the cognitive and emotional processing of virtual environments, with an emphasis on the role of emotions in HCI, presence, game studies, and media effects in general, such as skill transfer and aggressive behavior.

René Richter

Dipl.-Inf. René Richter is a researcher in the Department of Computer Science at Chemnitz University of Technology. His research field is computational neuroscience, in particular the influence of emotions on attention and how this interaction can be used in the context of HCI.

Michael Teichmann

Dipl.-Inf. Michael Teichmann is a researcher in the Department of Computer Science at Chemnitz University of Technology. His research focuses on neuroscientifically grounded computational models of the human visual cortex, in particular self-organization via neural plasticity in recurrent models of the ventral visual stream.

Fred H. Hamker

Fred H. Hamker has been a professor of artificial intelligence in the Department of Computer Science at Chemnitz University of Technology since 2009. He received his diploma in electrical engineering from the University of Paderborn in 1994 and his Ph.D. in computer science from Ilmenau University of Technology in 1999. He was a postdoc at Goethe University Frankfurt and the California Institute of Technology (Pasadena, USA). In 2008 he received his venia legendi from the Department of Psychology at the University of Münster.

Peter Ohler

Peter Ohler has been a professor of media psychology at the Institute for Media Research at Chemnitz University of Technology since 2002. He studied psychology at Saarland University and received his Ph.D. from TU Berlin in 1991. He held postdoc positions at TU Berlin and the University of Passau. In 2000 he received his venia legendi in psychology from TU Berlin. His research interests include the psychology of film, evolutionary psychology, cognitive science, and the psychology of play.


Published Online: 2015-07-12

Published in Print: 2015-08-01


Citation Information: i-com, Volume 14, Issue 2, Pages 126–137, ISSN (Online) 2196-6826, ISSN (Print) 1618-162X, DOI: https://doi.org/10.1515/icom-2015-0032.

© 2015 Walter de Gruyter GmbH, Berlin/Boston.
