Abstract
This paper describes an integrated system for a life-sized robot working in a kitchen. Cooking tasks involve a variety of tools and foods, and the kitchen table may have a reflective surface marked with blots and scratches; recognition functions must therefore be robust to the noise these conditions introduce. A further problem is that cooking behaviors impose motion sequences that use the robot's whole body. For instance, while cutting a vegetable, the robot has to hold the vegetable steady with one hand while moving the other hand, which grips the knife, to perform the cut. Such motions require considering the full articulation of the robot simultaneously. That is, difficulties arise in both recognition and motion generation. In this paper we propose recognition functions for detecting kitchen tools such as containers and cutting boards. These functions are improved to overcome the influence of reflective surfaces, and a shape model combined with task knowledge is also proposed. In addition, we point out the importance of using the torso joints during dual-arm manipulation; our approach enables the robot to maintain the manipulability of both arms and the viewing field of the head. Building on these components, we introduce an integrated system incorporating the recognition modules and motion generation modules. The effectiveness of the system was demonstrated through several cooking applications.