
Paladyn, Journal of Behavioral Robotics

Editor-in-Chief: Schöner, Gregor

1 Issue per year


CiteScore 2017: 0.33

SCImago Journal Rank (SJR) 2017: 0.104

Open Access

Online ISSN: 2081-4836

Real-time gaze estimation via pupil center tracking

Dario Cazzato (corresponding author)
  • Interdisciplinary Centre for Security, Reliability and Trust, University of Luxembourg, L-1359 Luxembourg, Luxembourg
/ Fabio Dominio / Roberto Manduchi / Silvia M. Castro
Published Online: 2018-02-07 | DOI: https://doi.org/10.1515/pjbr-2018-0002

Abstract

Automatic gaze estimation that does not rely on expensive commercial eye-tracking hardware can enable many applications in human-computer interaction (HCI) and human behavior analysis. It is therefore not surprising that many related techniques and methods have been investigated in recent years. However, very few camera-based systems proposed in the literature are both real-time and robust. In this work, we propose a real-time gaze estimation system that requires no person-dependent calibration, copes with illumination changes and head pose variations, and works over a wide range of distances from the camera. Our solution is based on a 3-D appearance-based method that processes images from a built-in laptop camera. Real-time performance is obtained by combining head pose information with geometric eye features to train a machine learning algorithm. Our method has been validated on a data set of images of users in natural environments and shows promising results. The real-time implementation, combined with the good quality of the gaze tracking, makes this system suitable for various HCI applications.

Keywords: gaze estimation; regression tree; appearance-based method; pupil tracking
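The pipeline described in the abstract (geometric eye features such as pupil-center offsets, combined with head pose angles, fed to a regression-tree learner that predicts gaze direction) can be illustrated with a minimal sketch. Everything below is an assumption made for illustration, not the authors' implementation: the feature layout, the synthetic training data, and the choice of scikit-learn's RandomForestRegressor as the regression-tree model.

```python
# Toy sketch of an appearance-based gaze regressor: map
# [head_yaw, head_pitch, pupil_dx, pupil_dy] features to gaze angles
# with a regression forest. Data and feature layout are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

n = 2000
head = rng.uniform(-30, 30, size=(n, 2))     # head yaw/pitch in degrees
pupil = rng.uniform(-1.0, 1.0, size=(n, 2))  # normalized pupil offsets
X = np.hstack([head, pupil])

# For this toy data, assume gaze = head pose plus a linear pupil term.
y = head + 20.0 * pupil + rng.normal(0, 0.5, size=(n, 2))

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Frontal head, pupil shifted halfway to the right:
pred = model.predict([[0.0, 0.0, 0.5, 0.0]])[0]
print(pred)  # yaw close to +10 degrees, pitch near 0
```

In a real system the four input features would come from a pupil-center tracker and a head pose estimator running on the camera frames; the sketch only shows why a regression forest is a natural fit for mapping such low-dimensional geometric features to gaze angles in real time.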


About the article

Received: 2017-09-20

Accepted: 2018-01-16

Published Online: 2018-02-07

Published in Print: 2018-02-01


Citation Information: Paladyn, Journal of Behavioral Robotics, Volume 9, Issue 1, Pages 6–18, ISSN (Online) 2081-4836, DOI: https://doi.org/10.1515/pjbr-2018-0002.


© by Dario Cazzato, published by Sciendo. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License. BY-NC-ND 4.0
