Open Access. Published by De Gruyter Open Access, December 10, 2013

Human activity analysis: a personal robot integrating a framework for robust person detection and tracking and physical based motion analysis

  • Consuelo Granata, Philippe Bidaud, Joseph Salini and Ragou Ady

Abstract

Analyzing parameters related to the cognitive and motor activities of humans under everyday-life conditions makes it possible to detect potential behavioral disorders, support diagnoses and assess patients' progress after therapy. In this context, personal robots can provide an autonomous mobile platform for embedded sensors, allowing people to be detected and tracked while ensuring optimal observability of the person's activity in complex and cluttered environments. This paper presents a framework combining a multimodal human detector, based on sensors embedded in a mobile robot, with a fuzzy-logic decisional engine that makes the robot track people, maximize observability and cope with detection losses. The robustness of this framework is evaluated experimentally in home spaces through different scenarios. Such a mobile system provides an effective marker-less motion capture means for sensing human activity in a non-invasive fashion. We present a physical-model-based method that exploits the features of the system and of the embedded Kinect. Its performance is evaluated first by comparing the results with those obtained from a precise marker-based 3D motion capture system and from a dynamic posturography platform. An experiment in real-life conditions is then performed to assess the system's sensitivity to some gait disturbances.
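To illustrate the kind of fuzzy-logic decision rule such a person-following engine relies on, the sketch below maps the tracked person's distance and bearing to robot velocity commands. It is a minimal, self-contained example: the membership-function breakpoints, gains and the `follow_command` helper are hypothetical and are not taken from the paper's decisional engine, which fuses multimodal detection cues and also handles detection losses.

```python
# Minimal sketch of a fuzzy person-following rule base (illustrative only).
# All breakpoints and gains are assumed values, not the paper's parameters.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def follow_command(distance_m, bearing_rad):
    """Map person distance/bearing to (linear, angular) velocity commands."""
    # Fuzzify the distance: too close / comfortable / too far.
    too_close = tri(distance_m, 0.0, 0.5, 1.2)
    comfortable = tri(distance_m, 0.8, 1.5, 2.2)
    too_far = tri(distance_m, 1.8, 3.0, 6.0)

    # Rule base with weighted-average defuzzification:
    # too close -> back off slightly, comfortable -> hold, too far -> speed up.
    candidates = {-0.2: too_close, 0.0: comfortable, 0.6: too_far}
    weight = sum(candidates.values()) or 1.0
    linear = sum(v * mu for v, mu in candidates.items()) / weight

    # Angular command: a proportional term keeps the person centered in the
    # sensor field of view, a crude stand-in for "maximizing observability".
    angular = 0.8 * bearing_rad
    return linear, angular

# Example: person detected 2.5 m ahead, 0.3 rad to the left.
print(follow_command(2.5, 0.3))
```

In the real framework the defuzzified outputs would feed the mobile base controller at each perception cycle, with the rule base extended to degrade gracefully when the detector temporarily loses the person.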


Published Online: 2013-12-10
Published in Print: 2013-12-01

This content is open access.
