
at - Automatisierungstechnik

Methoden und Anwendungen der Steuerungs-, Regelungs- und Informationstechnik

[AT - Automation Technology: Methods and Applications of Control, Regulation, and Information Technology]

Editor-in-Chief: Jumar, Ulrich


IMPACT FACTOR 2018: 0.500

CiteScore 2018: 0.60

SCImago Journal Rank (SJR) 2018: 0.211
Source Normalized Impact per Paper (SNIP) 2018: 0.532

Online ISSN: 2196-677X
Volume 66, Issue 2

Driver observation and shared vehicle control: supporting the driver on the way back into the control loop

Fahrerbeobachtung und kooperative Fahrzeugführung: Wie der Fahrer auf dem Weg zurück in die Regelschleife unterstützt werden kann

Julian Ludwig
  • Corresponding author
  • Karlsruher Institut für Technologie, Fakultät für Elektrotechnik und Informationstechnik, Karlsruhe, Germany
/ Manuel Martin / Matthias Horne / Michael Flad / Michael Voit / Rainer Stiefelhagen / Sören Hohmann
Published Online: 2018-02-10 | DOI: https://doi.org/10.1515/auto-2017-0103

Abstract

In the near future, drivers of automated cars will still have to take over the driving task at short notice from time to time. Current control systems implement a hard switch that disables the automation all at once. However, studies show that a driver's ability to take over depends strongly on their last activity. We therefore propose a system that uses camera-based observation of the driver to assess the situation and to predict transition times. We combine this with a control system that uses a cooperative shared control method to support the driver in takeover situations and to let them adjust safely to the current situation. We present our first steps towards this goal and show both how the driver's behavior in the vehicle interior can be assessed and how a cooperative control transfer can be implemented. We further outline the steps necessary to implement the proposed system and give a first impression of its performance via simulation.
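The paper develops a game-theoretic shared control design; the details are beyond this abstract. As a rough, hedged illustration of the core idea only (a gradual control transfer instead of a hard switch), the sketch below blends driver and automation steering commands with a time-varying authority factor. All function names, ramp shapes, and numbers are illustrative assumptions, not taken from the paper.

```python
def blended_steering(u_driver, u_automation, alpha):
    """Blend driver and automation steering commands.

    alpha in [0, 1] is the driver's control authority:
    0.0 = full automation, 1.0 = full manual control.
    (Illustrative linear blending, not the authors' controller.)
    """
    alpha = min(max(alpha, 0.0), 1.0)
    return alpha * u_driver + (1.0 - alpha) * u_automation


def authority_ramp(t, t_start, t_takeover):
    """Linear authority ramp: the automation hands over control
    gradually between t_start and t_takeover rather than
    switching off all at once (assumed ramp shape)."""
    if t <= t_start:
        return 0.0
    if t >= t_takeover:
        return 1.0
    return (t - t_start) / (t_takeover - t_start)


# Simulated takeover: the automation holds the lane (0.2 rad)
# while the driver initially contributes no steering (0.0 rad),
# so the blended command fades from 0.2 rad towards 0.0 rad.
for t in range(7):
    a = authority_ramp(float(t), t_start=1.0, t_takeover=5.0)
    u = blended_steering(0.0, 0.2, a)
    print(f"t={t}s  authority={a:.2f}  steering={u:.3f}")
```

In this toy setting the transition time (here, the window from `t_start` to `t_takeover`) is a free parameter; in the proposed system it would be informed by the camera-based driver observation.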

Zusammenfassung

Auch in naher Zukunft werden die Fahrer hochautomatisierter Fahrzeuge gelegentlich noch die Fahraufgabe kurzfristig übernehmen müssen. Aktuelle Regelungssysteme schalten hierfür die Automation hart ab, wobei jedoch Studien belegen, dass die Übernahmefähigkeit des Fahrers stark von dessen letzter Aktivität abhängt. Aus diesem Grund schlagen wir ein Konzept vor, welches mit Hilfe einer kamerabasierten Fahrerbeobachtung die Situation beurteilt und die Übergabezeiten vorhersagt. Wir kombinieren die Fahrerbeobachtung mit einem kooperativen Regelungsansatz, welcher den Fahrer in der Übernahmesituation unterstützt und ihm erlaubt, sich an die aktuelle Situation anzupassen. In diesem Artikel präsentieren wir die ersten Schritte zur Umsetzung dieses Ziels und stellen zum einen vor, wie der Fahrer im Innenraum erfasst, zum anderen wie eine kooperative Übergabe der Fahrzeugführung realisiert werden kann. Wir stellen dar, welche weiteren Schritte für eine vollständige Implementierung notwendig sind, und zeigen abschließend erste Simulationsergebnisse.

Keywords: cooperative control; driver observation; game theory; automated driving

Schlagwörter: Kooperative Regelung; Fahrerbeobachtung; Spieltheorie; Hochautomatisiertes Fahren


About the article

Julian Ludwig

Julian Ludwig studied electrical engineering and information technology at the Karlsruhe Institute of Technology (KIT). He received his bachelor's degree in 2011 and his master's degree in 2013. Since February 2014, he has been a member of the scientific staff of the IRS. His research focuses on modeling and optimizing transitions of the driving task between driver and assistance system.

Manuel Martin

Manuel Martin studied computer science at the Karlsruhe Institute of Technology (KIT). He received his diploma in 2013. In his diploma thesis at the Fraunhofer IOSB, he worked on head pose estimation based on depth cameras. Since March 2014, he has been a member of the scientific staff of the Fraunhofer IOSB in the Perceptual User Interfaces group. His research focuses on body pose estimation and activity recognition of drivers in (automated) vehicles.

Matthias Horne

Matthias Horne studied computer science at the Karlsruhe Institute of Technology and received the master’s degree in 2016. He currently works at the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB) in Karlsruhe on image analysis algorithms with a focus on the interior of cars.

Michael Flad

Michael Flad received the Diploma degree from Ravensburg University of Cooperative Education, Ravensburg, Germany, in 2008, and the M.Sc. degree in electrical engineering and the Ph.D. degree in control engineering from the Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany, in 2011 and 2016, respectively. Since 2016, he has been the leader of the Cooperative Systems research group at the Institute of Control Engineering, KIT. His research interests include cooperative control structures between human and machine and the application of these concepts to driver assistance systems.

Michael Voit

Dr. Michael Voit studied computer science at the Karlsruhe Institute of Technology (KIT). After receiving his diploma in 2005, he began his research at KIT on estimating people's visual focus of attention for perceptual user interfaces, using camera-based perception of head orientations and context observations. In 2008, he co-initiated and joined the newly founded Perceptual User Interfaces group at Fraunhofer IOSB to research and develop smart and attentive workplace environments. Since finishing his Ph.D. in 2011, he has led the group and has established the developed methodologies in numerous domains that benefit from proactive assistance systems, such as (semi-)autonomous driving, manual manufacturing, and medical surgery.

Rainer Stiefelhagen

Rainer Stiefelhagen received his Diplom (Dipl.-Inform) and Doctoral degree (Dr.-Ing.) from the Universität Karlsruhe (TH) in 1996 and 2002, respectively. He is currently a full professor for “Information technology systems for visually impaired students” at the Karlsruhe Institute of Technology (KIT), where he directs the Computer Vision for Human-Computer Interaction Lab at the Institute for Anthropomatics and Robotics as well as KIT’s Study Center for Visually Impaired Students. His research interests include computer vision methods for visual perception of humans and their activities, in order to facilitate perceptive multimodal interfaces, humanoid robots, smart environments, multimedia analysis and assistive technology for persons with visual impairments.

Sören Hohmann

Sören Hohmann studied electrical engineering at the Technische Universität Braunschweig, the University of Karlsruhe, and the École nationale supérieure d'électricité et de mécanique in Nancy. He received his diploma degree (1997) and Ph.D. degree (2002) from the University of Karlsruhe. Afterwards, until 2010, he worked in industry for BMW in Munich, where his last position was head of the pre-development and series development of active safety systems. Today he heads the Institute of Control Systems at the Karlsruhe Institute of Technology, Germany, and is a member of the board of directors of the FZI Research Center for Information Technology, Karlsruhe. His research interests are cooperative control, alternative energies, and system guarantees by design.


Received: 2017-10-19

Accepted: 2018-01-08

Published Online: 2018-02-10

Published in Print: 2018-02-23


Funding Source: Bundesministerium für Bildung und Forschung

Award identifier / Grant number: 16SV7675K

This work is being supported by the Federal Ministry of Education and Research of Germany (Bundesministerium für Bildung und Forschung) in the project “personalized adaptive cooperative systems for automated vehicles” (PAKoS), Grant number: 16SV7675K.


Citation Information: at - Automatisierungstechnik, Volume 66, Issue 2, Pages 146–159, ISSN (Online) 2196-677X, ISSN (Print) 0178-2312, DOI: https://doi.org/10.1515/auto-2017-0103.

© 2018 Walter de Gruyter GmbH, Berlin/Boston.

Citing Articles

[1]
Mingjun Li, Xiaolin Song, Haotian Cao, Jianqiang Wang, Yanjun Huang, Chuan Hu, and Hong Wang
Mechanical Systems and Signal Processing, 2019, Volume 124, Page 199
