Automated vehicles rely on a precise intrinsic and extrinsic calibration of all sensors. An exact calibration leads to accurate localization and object detection results. Especially for sensor data fusion, the transformations between the different sensor frames must be well known. Moreover, modular and redundant platforms require a large number of sensors to cover their full surroundings, which makes the calibration process complex and challenging. In this article, we describe the procedure to calibrate the full sensor setup of a modular autonomous driving platform, consisting of camera, lidar, and radar sensors, in four consecutive steps. First, the intrinsic and extrinsic camera parameters are determined. Afterwards, the transformations from lidar to camera and from lidar to radar are estimated. Finally, the extrinsic calibration between all lidars and the vehicle frame is performed. In our evaluation, we show that these steps lead to an accurate calibration of the complete vehicle.
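The extrinsic calibration described in the abstract ultimately yields rigid-body transformations between sensor frames, which are used, e.g., to map lidar points into a camera image. The following is a minimal sketch of that mapping, assuming a simple pinhole camera model; the intrinsic matrix `K`, the rotation `R`, and the translation `t` are illustrative placeholders, not calibration results from this article.

```python
import numpy as np

# Hypothetical camera intrinsics (focal lengths and principal point in pixels).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical lidar-to-camera extrinsics: rotation R and translation t,
# assembled into a 4x4 homogeneous transformation matrix T.
R = np.eye(3)
t = np.array([0.1, -0.05, 0.2])
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = t

def project_lidar_point(p_lidar, T, K):
    """Map a 3-D point from the lidar frame to pixel coordinates of a calibrated camera."""
    p_h = np.append(p_lidar, 1.0)   # homogeneous coordinates
    p_cam = (T @ p_h)[:3]           # lidar frame -> camera frame
    uvw = K @ p_cam                 # pinhole projection
    return uvw[:2] / uvw[2]         # normalize by depth

uv = project_lidar_point(np.array([1.0, 0.5, 10.0]), T, K)
```

Chaining such transformations (lidar to camera, lidar to radar, lidar to vehicle frame) is what makes the consistency of the individual calibration steps so important: errors in any link propagate into every fused result.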
Funding source: Bundesministerium für Bildung und Forschung
Award Identifier / Grant number: FKZ 16EMO0287
Award Identifier / Grant number: FKZ 16EMO0290
Funding statement: This research was accomplished within the project UNICARagil (FKZ 16EMO0287 / FKZ 16EMO0290). We acknowledge the financial support for the project by the Federal Ministry of Education and Research of Germany (BMBF).
About the authors
Christian Kinzig received his Master’s degree in mechanical engineering from the Karlsruhe Institute of Technology, Germany in 2018, where he is currently pursuing a Ph. D. degree at the Institute of Measurement and Control Systems. His research focuses on surround-view generation from camera data for automated driving.
Markus Horn received his Bachelor’s and Master’s degrees in communications and computer engineering from Ulm University, Germany in 2015 and 2018, respectively. He is currently a researcher at the Institute of Measurement, Control, and Microtechnology at Ulm University, working towards his Ph. D. degree in the field of sensor calibration for automated vehicles.
Martin Lauer received the Diploma degree in computer science from the Karlsruhe Institute of Technology, Germany and the Ph. D. degree in computer science from Osnabrück University, Germany in 2004. He was a Postdoctoral Researcher with Osnabrück University in the areas of machine learning and autonomous robots. Since 2008, he has been heading a research group at the Karlsruhe Institute of Technology. His main research interests are in the areas of machine vision, autonomous vehicles, and machine learning.
Michael Buchholz received his Diploma degree in electrical engineering and information technology as well as his Ph. D. from the Faculty of Electrical Engineering and Information Technology at the Karlsruhe Institute of Technology, Germany. Since 2009, he has been serving as a research group leader and lecturer at the Institute of Measurement, Control and Microtechnology at Ulm University, Germany. His research interests comprise connected automated driving, electric mobility, modelling and control of mechatronic systems, and system identification.
Christoph Stiller received a Diploma degree in electrical engineering and a Ph. D. degree from RWTH Aachen University, Germany, in 1988 and 1994, respectively. He held a Postdoctoral position with INRS, Montreal, QC, Canada. In 1995, he joined the corporate research of Robert Bosch GmbH in Germany. Since 2001, he has been a Full Professor at the Karlsruhe Institute of Technology, Germany, and since 2009, Director at the FZI Research Center for Information Technology in Karlsruhe, Germany. He is the spokesperson of the focus project ‘Cooperative Interacting Automobiles’ of the German Research Foundation (DFG). His research interests include perception and planning for automated vehicles.
Klaus Dietmayer was born in Celle, Germany in 1962. He received his Diploma degree in electrical engineering from the Technical University of Braunschweig, Germany, in 1989 and the Dr.-Ing. degree (equivalent to Ph. D.) from the University of the Federal Armed Forces in Hamburg, Germany in 1994. In 1994, he joined the Philips Semiconductors Systems Laboratory in Hamburg, Germany as a research engineer. From 1996, he was a manager in the field of networks and sensors for automotive applications. In 2000, he was appointed to a professorship at Ulm University in the field of measurement and control. Currently, he is a Full Professor and Director of the Institute of Measurement, Control and Microtechnology in the School of Engineering and Computer Science at Ulm University. His research interests include information fusion, multi-object tracking, environment perception for advanced automotive driver assistance, and e-mobility. Klaus Dietmayer is a member of the IEEE and of the German engineering societies VDI/VDE.
© 2022 Walter de Gruyter GmbH, Berlin/Boston