Published by Oldenbourg Wissenschaftsverlag, March 27, 2018

Foot Interaction Concepts to Support Radiological Interventions

Benjamin Hatscher, Maria Luz and Christian Hansen
From the journal i-com

Abstract

During neuroradiological interventions, physicians need to interact with medical image data, which is not possible while their hands are occupied. We propose foot input concepts with one degree of freedom, which matches a common interaction task in the operating room. We conducted a study comparing our concepts with regard to task completion time, subjective workload, and user experience. Relative input performed significantly better than absolute or rate-based input. Our findings may enable more effective computer interaction in the operating room and in similar domains where the hands are not available.
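To make the distinction between the three compared transfer functions concrete, the following is a minimal sketch, not the authors' implementation: function names, gains, and the image-stack scenario are illustrative assumptions. It shows how a single one-degree-of-freedom foot input (a normalized pedal deflection) could drive a controlled value, such as the slice index while scrolling through an image stack, under absolute, relative, and rate-based mappings.

```python
# Hypothetical sketch of the three 1-DOF mapping types compared in the study.
# `deflection` is a normalized foot input in [-1, 1]; `value` is the controlled
# parameter (e.g. the slice index of a medical image stack).

def absolute_mapping(deflection: float, value_range: tuple[float, float]) -> float:
    """Absolute: the pedal position maps directly to a position in the range."""
    lo, hi = value_range
    return lo + (deflection + 1.0) / 2.0 * (hi - lo)

def relative_mapping(value: float, delta_deflection: float, gain: float = 10.0) -> float:
    """Relative: changes of the pedal position move the value incrementally."""
    return value + gain * delta_deflection

def rate_based_mapping(value: float, deflection: float, dt: float,
                       max_speed: float = 20.0) -> float:
    """Rate-based: the pedal deflection sets the velocity of the value."""
    return value + deflection * max_speed * dt

if __name__ == "__main__":
    value = 50.0  # current slice in a 100-slice stack
    print(absolute_mapping(0.2, (0, 99)))         # position follows pedal directly
    print(relative_mapping(value, 0.05))          # small pedal movement nudges value
    print(rate_based_mapping(value, 0.5, 0.016))  # half deflection scrolls at half speed
```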

Funding source: Bundesministerium für Bildung und Forschung

Award Identifier / Grant number: 13GW0095A

Funding statement: This work is partially funded by the Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung) within the STIMULATE research campus (grant number 13GW0095A).


Published Online: 2018-03-27
Published in Print: 2018-04-25

© 2018 Walter de Gruyter GmbH, Berlin/Boston