Published by Oldenbourg Wissenschaftsverlag, March 27, 2018

Design and Evaluation of a Natural User Interface for Piloting an Unmanned Aerial Vehicle

Can gestural and speech interaction and an augmented reality application replace the conventional remote control for an unmanned aerial vehicle?

Roman Herrmann and Ludger Schmidt
From the journal i-com

Abstract

Controlling an unmanned aerial vehicle is challenging and requires intensive training. One cause is teleoperation with the conventional input device, the remote control, whose functions are complicated. This paper presents an alternative concept for teleoperation. Its realization includes a Thalmic Myo gesture-control wristband and a Microsoft HoloLens head-mounted display. These devices are used to implement an augmented reality interface, tactile feedback, and gesture and speech input. Finally, this implementation has been evaluated with 30 participants and compared with a conventional remote control. The results show that the proposed interface is a viable solution but does not reach the performance of the remote control.
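The core idea of such a multimodal interface is to map recognized gesture and speech events onto discrete flight commands, with speech available as an override channel. A minimal sketch of this mapping is given below; all names here (the `Gesture` enum, the command strings, the `command_for` dispatcher) are illustrative assumptions and not the authors' actual implementation.

```python
# Sketch of a gesture/speech-to-command mapping for UAV teleoperation.
# Gesture names loosely follow the poses a Myo-style wristband can
# report; the command vocabulary is hypothetical.
from enum import Enum


class Gesture(Enum):
    FIST = "fist"
    FINGERS_SPREAD = "fingers_spread"
    WAVE_IN = "wave_in"
    WAVE_OUT = "wave_out"


# Hypothetical mapping from recognized gestures to flight commands.
GESTURE_COMMANDS = {
    Gesture.FIST: "hold_position",
    Gesture.FINGERS_SPREAD: "land",
    Gesture.WAVE_IN: "yaw_left",
    Gesture.WAVE_OUT: "yaw_right",
}


def command_for(gesture, speech=None):
    """Resolve one input frame to a flight command.

    A recognized speech command takes priority over the gesture
    channel, so a spoken "stop" always forces the vehicle to hold
    its position; unknown gestures also fall back to holding.
    """
    if speech == "stop":
        return "hold_position"
    return GESTURE_COMMANDS.get(gesture, "hold_position")
```

In a real system the resolved command would then be translated into velocity setpoints for the flight controller; the point of the sketch is only the multimodal priority scheme (speech overrides gesture, safe fallback otherwise).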

References

[1] Bleyer, T., Hold, U., Rademacher, U., & Windel, A. (2009). Belastungen des Hand-Arm-Systems als Grundlage einer ergonomischen Produktbewertung – Fallbeispiel Schaufeln. 1. Auflage. Dortmund: Bundesanstalt für Arbeitsschutz und Arbeitsmedizin.

[2] Cauchard, J. R., Zhai, K. Y., Spadafora, M., & Landay, J. A. (2016). Emotion Encoding in Human-Drone Interaction. In 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 263–270). Piscataway, NJ, USA: IEEE Press.

[3] Cramar, L., Hegenberg, J., & Schmidt, L. (2012). Ansatz zur experimentellen Ermittlung von Gesten zur Steuerung eines mobilen Roboters. In VDI/VDE-Gesellschaft Mess- und Automatisierungstechnik, Useware 2012: Mensch-Maschine-Interaktion (Kaiserslautern 2012) (Vols. VDI-Berichte 2179, pp. 173–183). Düsseldorf: VDI-Verlag.

[4] Debernardis, S., Fiorentino, M., Gattullo, M., Monno, G., & Uva, A. E. (2014). Text readability in head-worn displays: color and style optimization in video versus optical see-through devices. IEEE Transactions on Visualization and Computer Graphics, 20(1), pp. 125–139. doi:10.1109/TVCG.2013.86

[5] DIN EN ISO 9241-13. (2011). Ergonomie der Mensch-System-Interaktion – Teil 13: Benutzerführung.

[6] Dong, M., Cao, L., Zhang, D.-M., & Guo, R. (2016). UAV flight controlling based on Kinect for Windows v2. In International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI) (pp. 735–739). IEEE.

[7] Elmezain, M., Al-Hamadi, A., Appenrodt, J., & Michaelis, B. (2008). A Hidden Markov Model-based continuous gesture recognition system for hand motion trajectory. In 19th International Conference on Pattern Recognition (pp. 1–4). Piscataway, NJ: IEEE.

[8] Gabbard, J. L., Swan, J. E., & Hix, D. (2006). The Effects of Text Drawing Styles, Background Textures, and Natural Lighting on Text Legibility in Outdoor Augmented Reality. Presence: Teleoperators and Virtual Environments, 15(1), pp. 16–32. doi:10.1162/pres.2006.15.1.16

[9] Hegenberg, J., Cramar, L., & Schmidt, L. (2012). Task- and User-Centered Design of a Human-Robot System for Gas Leak Detection: From Requirements Analysis to Prototypical Realization. In I. Petrovic, & P. Korondi, 10th International IFAC Symposium on Robot Control (Dubrovnik 2012) (pp. 793–798). Dubrovnik: IFAC.

[10] Herrmann, R., & Schmidt, L. (2017). Gestaltung und Evaluation einer natürlichen Flugrobotersteuerung. In M. Burghardt, R. Wimmer, C. Wolff, & C. Womser-Hacker, Mensch und Computer 2017 – Tagungsband (Regensburg 2017) (pp. 147–158). Bonn: Gesellschaft für Informatik e. V.

[11] Herrmann, R., & Schmidt, L. (2017). Natürliche Benutzungsschnittstelle zur Steuerung eines Flugroboters. In M. Burghardt, R. Wimmer, C. Wolff, & C. Womser-Hacker, Mensch und Computer 2017 – Workshopband (Regensburg 2017) (pp. 637–640). Bonn: Gesellschaft für Informatik e. V.

[12] Herrmann, R., Hegenberg, J., & Schmidt, L. (2016). Evaluation des Leitstands eines Boden-Luft-Servicerobotiksystems für eine Produktionsumgebung. In VDI Wissensforum GmbH, Useware 2016 (pp. 187–200). Düsseldorf: VDI Verlag GmbH.

[13] Herrmann, R., Hegenberg, J., Ziegner, D., & Schmidt, L. (2016). Empirische Evaluation von Steuerungsarten für Flugroboter. In Gesellschaft für Arbeitswissenschaft e. V., Arbeit in komplexen Systemen – Digital, vernetzt, human?! 62. Kongress der Gesellschaft für Arbeitswissenschaft (Aachen 2016) (pp. 1–6 (A.4.9)). Dortmund: GfA-Press.

[14] Higuchi, K., & Rekimoto, J. (2013). Flying head: a head motion synchronization mechanism for unmanned aerial vehicle control. In W. E. Mackay, CHI ’13 Extended Abstracts on Human Factors in Computing Systems (pp. 2029–2038). New York, NY: ACM.

[15] Jones, G., Berthouze, N., Bielski, R., & Julier, S. (2010). Towards a situated, multimodal interface for multiple UAV control. In 2010 IEEE International Conference on Robotics and Automation (pp. 1739–1744). Piscataway, NJ: IEEE.

[16] Lee, J. C. (2010). In search of a natural gesture. XRDS: Crossroads, The ACM Magazine for Students, 16(4), pp. 9–13. doi:10.1145/1764848.1764853

[17] Livingston, M. A. (2013). Issues in Human Factors Evaluations of Augmented Reality Systems. In W. Huang, L. Alem, & M. A. Livingston, Human Factors in Augmented Reality Environments (pp. 3–9). New York: Springer.

[18] Mäntylä, V.-M. (2001). Discrete hidden Markov models with application to isolated user-dependent hand gesture recognition. pp. 2–104.

[19] McMillan, G. R. (1998). The technology and applications of gesture-based control. In T. R. Anderson, G. McMillian, J. Borah, & G. M. Rood, Alternative Control Technologies, Human Factors Issues (pp. 1–11). Canada Communication Group.

[20] Monajjemi, M., Mohaimenianpour, S., & Vaughan, R. (2016). UAV, come to me: End-to-end, multi-scale situated HRI with an uninstrumented human and a distant UAV. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 4410–4417). IEEE.

[21] Nielsen, M., Störring, M., Moeslund, T., & Granum, E. (2004). A Procedure for Developing Intuitive and Ergonomic Gesture Interfaces for HCI. In A. Camurri, & G. Volpe, Gesture-Based Communication in Human-Computer Interaction (Vol. 2915, pp. 409–420). Springer Berlin Heidelberg.

[22] Norman, D. A. (1990). Why interfaces don’t work. In B. Laurel, & S. J. Mountford, The Art of Human-Computer Interface Design (pp. 209–219). Addison-Wesley.

[23] Norman, D. A. (2010). Natural User Interfaces Are Not Natural. interactions, 17(3), pp. 6–10. doi:10.1145/1744161.1744163

[24] Oehme, O., Wiedenmaier, S., Schmidt, L., & Luczak, H. (2001). Empirical Studies on an Augmented Reality User Interface for a Head Based Virtual Retinal Display. In M. J. Smith, & G. Salvendy, Systems, Social and Internationalization Design Aspects of Human-Computer Interaction: Proceedings of the HCI International 2001 (pp. 1026–1030). Mahwah: Erlbaum.

[25] Peshkova, E., Hitz, M., & Ahlström, D. (2016). Exploring User-Defined Gestures and Voice Commands to Control an Unmanned Aerial Vehicle. In R. Poppe, J.-J. Meyer, R. Veltkamp, & M. Dastani, Intelligent Technologies for Interactive Entertainment (pp. 47–62). Cham: Springer International Publishing.

[26] Peshkova, E., Hitz, M., & Kaufmann, B. (2017). Natural Interaction Techniques for an Unmanned Aerial Vehicle System. IEEE Pervasive Computing, 16(1), pp. 34–42. doi:10.1109/MPRV.2017.3

[27] Pfeil, K., Koh, S. L., & LaViola, J. (2013). Exploring 3D gesture metaphors for interaction with unmanned aerial vehicles. In J. Kim, J. Nichols, & P. Szekely, Proceedings of the 2013 International Conference on Intelligent User Interfaces (pp. 257–266). New York, NY: ACM.

[28] Prinzel, L. J., & Risser, M. (2004). Head-Up Displays and Attention Capture. Springfield: National Technical Information Service.

[29] Prümper, J. (1997). Der Benutzungsfragebogen ISONORM 9241/10: Ergebnisse zur Reliabilität und Validität. In R. Liskowsky, B. M. Velichkovsky, & W. Wünschmann, Software-Ergonomie ’97: Usability Engineering: Integration von Mensch-Computer-Interaktion und Software-Entwicklung (pp. 254–262). Stuttgart: B. G. Teubner.

[30] Quigley, M., Conley, K., Gerkey, B., Faust, J., Foote, T., Leibs, J., Ng, A. Y. (2009). ROS: an open-source Robot Operating System.

[31] Raskin, J. (1994). Intuitive equals Familiar. Communications of the ACM, 37(9), pp. 17–18.

[32] Schlenzig, J., Hunter, E., & Jain, R. (1994). Recursive identification of gesture inputs using hidden Markov models. In Proceedings of the Second IEEE Workshop on Applications of Computer Vision (pp. 187–194). IEEE.

[33] Schmidt, L., Herrmann, R., Hegenberg, J., & Cramar, L. (2014). Evaluation einer 3-D-Gestensteuerung für einen mobilen Serviceroboter. Zeitschrift für Arbeitswissenschaft, 68(3), pp. 129–134. doi:10.1007/BF03374438

[34] Urakami, J. (2014). Cross-cultural comparison of hand gestures of Japanese and Germans for tabletop systems. Computers in Human Behavior, 40, pp. 180–189. doi:10.1016/j.chb.2014.08.010

[35] Wachs, J. P., Kölsch, M., Stern, H., & Edan, Y. (2011). Vision-based hand-gesture applications. Communications of the ACM, 54(2), pp. 60–71. doi:10.1145/1897816.1897838

[36] Williams, K. W. (2004, Dec). A Summary of Unmanned Aircraft Accident/Incident Data: Human Factors Implications. (F. A. Dept. Transportation, Ed.) Final Report DOT/FAA/AM-04/24.

[37] Zhai, S., Kristensson, P. O., Appert, C., Anderson, T. H., & Cao, X. (2012). Foundational issues in touch-surface stroke gesture design – an integrative review. Foundations and Trends® in Human-Computer Interaction, 5(2), pp. 97–205. doi:10.1561/1100000012

Published Online: 2018-03-27
Published in Print: 2018-04-25

© 2018 Walter de Gruyter GmbH, Berlin/Boston