
Paladyn, Journal of Behavioral Robotics

Editor-in-Chief: Schöner, Gregor

CiteScore 2017: 0.33

SCImago Journal Rank (SJR) 2017: 0.104

ICV 2017: 99.90

Open Access

Human activity analysis: a personal robot integrating a framework for robust person detection and tracking and physical based motion analysis

Consuelo Granata
  • ISIR, UPMC-CNRS UMR , 4, place Jussieu, 75005 Paris, France
  • Imperial College London, Exhibition Rd, London SW7 2AZ, UK
Philippe Bidaud
  • ISIR, UPMC-CNRS UMR , 4, place Jussieu, 75005 Paris, France
  • ONERA Chemin de la Hunière, 91123 Palaiseau, France
Joseph Salini, Ragou Ady
Published Online: 2013-12-10 | DOI: https://doi.org/10.2478/pjbr-2013-0011


The analysis of certain parameters related to humans' cognitive and motor activities in everyday-life conditions can help detect potential behavioral disorders, establish diagnoses, and assess patients' progress after therapy. In this context, personal robots can provide an autonomous mobile platform for embedded sensors, allowing humans to be detected and tracked while ensuring optimal observability of the person's activity in complex, cluttered environments. This paper presents a framework combining a multimodal human detector, based on sensors embedded in a mobile robot, with a decisional engine that exploits fuzzy-logic mechanisms to make the robot track humans, maximize observability, and cope with losses of detection. The robustness of this framework is evaluated experimentally in home spaces through different scenarios. Such a mobile system provides an effective marker-less motion-capture means for sensing human activity in a non-invasive fashion. We present a physical-model-based method exploiting the features of the system and of the embedded Kinect. Its performance is evaluated first by comparing the results with those obtained from a precise marker-based 3D motion-capture system and from a dynamic posturography platform. An experiment in real-life conditions is then performed to assess the system's sensitivity to certain gait disturbances.
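To illustrate the kind of fuzzy-logic decision making the abstract describes, the following is a minimal sketch of a person-following rule base: fuzzify the tracked person's distance and bearing, fire simple rules, and defuzzify into velocity commands. All membership functions, set points, and velocity limits here are assumed values for illustration, not the authors' actual parameters or rule base.

```python
# Illustrative sketch of a fuzzy-logic person-following controller.
# Membership shapes, the 1.2 m following set point, and the velocity
# limits are assumptions made for this example.

def shoulder(x, a, b):
    """Ramp membership: 0 below a, 1 above b, linear in between."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def follow_command(distance, bearing):
    """Map target distance (m) and bearing (rad) to a (v, w) command
    using three fuzzy distance sets and two fuzzy bearing sets."""
    # Fuzzify distance around an assumed 1.2 m following set point.
    near = 1.0 - shoulder(distance, 0.6, 1.2)   # too close -> back off
    far = shoulder(distance, 1.2, 1.8)          # too far   -> speed up
    good = max(0.0, 1.0 - near - far)           # in range  -> hold

    # Defuzzify linear velocity as a weighted average of rule outputs.
    v = far * 0.6 + good * 0.0 - near * 0.3     # m/s, assumed limits

    # Turn toward the target: "target left" / "target right" sets.
    left = shoulder(bearing, 0.1, 0.5)
    right = shoulder(-bearing, 0.1, 0.5)
    w = 0.8 * (left - right)                    # rad/s, assumed limit
    return v, w
```

At the assumed set point with the person straight ahead, `follow_command(1.2, 0.0)` returns `(0.0, 0.0)`; a full decisional engine would layer further rules (observability maximization, recovery from detection loss) on the same fuzzify/defuzzify pattern.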

Keywords: Human activity analysis; human detection; human following; marker-less motion capture; dynamic human body modeling


About the article

Published in Print: 2013-12-01

Citation Information: Paladyn, Journal of Behavioral Robotics, Volume 4, Issue 2, Pages 131–146, ISSN (Print) 2081-4836, DOI: https://doi.org/10.2478/pjbr-2013-0011.


This content is open access.

Citing Articles


Consuelo Granata, Aurélien Ibanez, and Philippe Bidaud
International Journal of Advanced Robotic Systems, 2015, Volume 12, Number 7, Page 89
Razvan Gabriel Boboc, Adrian Iulian Dumitru, and Csaba Antonya
International Journal of Advanced Robotic Systems, 2015, Volume 12, Number 6, Page 75
