
tm - Technisches Messen

Plattform für Methoden, Systeme und Anwendungen der Messtechnik
[TM - Technical Measurement: A Platform for Methods, Systems, and Applications of Measurement Technology]

Editor-in-Chief: Puente León, Fernando / Zagar, Bernhard


IMPACT FACTOR 2018: 0.594

CiteScore 2018: 0.54

SCImago Journal Rank (SJR) 2018: 0.261
Source Normalized Impact per Paper (SNIP) 2018: 0.563

Online ISSN: 2196-7113
Volume 86, Issue 7-8


Subtleties of extrinsic calibration of cameras with non-overlapping fields of view

Optimierungsdetails bei der extrinsischen Kalibrierung von Multi-Kamera-Systemen ohne überlappende Sichtfelder
[Optimization details in the extrinsic calibration of multi-camera systems without overlapping fields of view]

Zaijuan Li / Volker Willert
Published Online: 2019-06-18 | DOI: https://doi.org/10.1515/teme-2019-0030

Abstract

The calibration of the relative pose between rigidly connected cameras with non-overlapping fields of view (FOV) is a prerequisite for many applications. In this paper, the subtleties of the experimental realization of calibration optimization methods such as those in (Z. Liu et al., Measurement Science and Technology, 2011; Z. Li and V. Willert, Intelligent Transportation Systems (ITSC), 2018) are presented. Two strategies that can be integrated into such optimization processes to find better local minima are evaluated. The first strategy is a careful acquisition of pose-pair measurements for solving the calibration problem, which improves the accuracy of the initial value for the subsequent non-linear refinement. The second strategy is the introduction of a quality measure for the image data used for the calibration, based on the projected size of the known planar calibration pattern in the image. We show that adding a weighting to the optimization objective, chosen as a function of this quality measure, improves calibration accuracy and increases robustness against noise. Both strategies are integrated into different setups, and their benefit is demonstrated in both simulation and real-world experiments.
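The quality-weighting idea can be made concrete with a small sketch. The following Python snippet is not the authors' implementation; it only illustrates, under assumed inputs, how a per-view weight derived from the projected area of the planar calibration pattern could enter a non-linear least-squares refinement. The names corners, board_xyz, project and params0 are placeholders for the detected corner pixels, the pattern geometry, the projection model and the initial extrinsic estimate.

    # Minimal sketch (assumed inputs, not the authors' code): weight the
    # per-view reprojection residuals by the projected area of the planar
    # calibration pattern, so views in which the pattern appears large
    # dominate the refinement.
    import numpy as np
    from scipy.optimize import least_squares

    def polygon_area(outline):
        # Shoelace formula for an ordered 2-D outline of the pattern corners.
        x, y = outline[:, 0], outline[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    def view_weights(outlines):
        # Quality measure: larger projected pattern area -> larger weight.
        areas = np.array([polygon_area(o) for o in outlines])
        return areas / areas.max()

    def weighted_residuals(params, corners, board_xyz, weights, project):
        # 'project' is an assumed, user-supplied camera model that maps the
        # board points into view i for the current extrinsic estimate.
        res = []
        for i, (obs, w) in enumerate(zip(corners, weights)):
            pred = project(params, board_xyz, i)
            res.append(np.sqrt(w) * (pred - obs).ravel())  # sqrt(w): w scales the squared error
        return np.concatenate(res)

    # Refinement around an initial estimate params0 (e.g. from the linear step):
    # sol = least_squares(weighted_residuals, params0,
    #                     args=(corners, board_xyz, weights, project))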

Zusammenfassung

The calibration of the relative pose between rigidly connected cameras without overlapping fields of view is a necessary prerequisite for many computer vision applications. This article discusses the technical details that must be taken into account when experimentally realizing the calibration methods of (Z. Liu et al., Measurement Science and Technology, 2011; Z. Li and V. Willert, Intelligent Transportation Systems (ITSC), 2018) in order to obtain accurate calibration results. Two strategies are presented that enable the optimization process to find better local minima of the non-convex cost functions used for calibration. The first strategy concerns the acquisition and selection of suitable image-pair measurements, from which better initial values for solving the non-convex optimization problem can be generated. The second strategy introduces a quality measure based on the size of the reprojected area of the calibration target in the images used for calibration. This measure can be used as an additional weighting in the cost function and yields more accurate calibration results that are more robust against errors in the image-coordinate measurements. Both strategies are evaluated for different camera configurations, both in simulation and on real measurement data.
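For the first strategy, the initial value for the non-linear refinement typically comes from chaining pattern-to-camera poses through the rigidly coupled calibration targets. The following Python sketch is an illustration under assumed conventions, not the procedure from the paper: per selected view, a candidate camera-to-camera transform is chained from the two PnP target poses and a fixed target-to-target transform, and the candidates are then crudely averaged to seed the refinement. All variable names (T_c1_tA, T_c2_tB, T_tA_tB) are hypothetical.

    # Minimal sketch (assumed conventions, not the paper's exact procedure):
    # per selected view, chain the two target poses into a candidate
    # camera-1 -> camera-2 transform, then average the candidates to obtain
    # an initial value for the non-linear refinement.
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def chain_candidates(T_c1_tA, T_c2_tB, T_tA_tB):
        # T_c1_tA[i]: 4x4 pose of target A in camera 1 for view i (from PnP),
        # T_c2_tB[i]: 4x4 pose of target B in camera 2,
        # T_tA_tB:    fixed 4x4 transform between the rigidly coupled targets.
        return [TA @ T_tA_tB @ np.linalg.inv(TB)
                for TA, TB in zip(T_c1_tA, T_c2_tB)]

    def average_pose(T_list):
        # Crude average: mean translation plus a normalised quaternion mean.
        t = np.mean([T[:3, 3] for T in T_list], axis=0)
        quats = np.array([R.from_matrix(T[:3, :3]).as_quat() for T in T_list])
        quats[quats[:, 3] < 0] *= -1.0   # resolve the quaternion sign ambiguity
        q = quats.mean(axis=0)
        q /= np.linalg.norm(q)
        T = np.eye(4)
        T[:3, :3] = R.from_quat(q).as_matrix()
        T[:3, 3] = t
        return T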

Keywords: Extrinsic calibration; multi-camera system; quality measure

Schlagwörter: Extrinsische Kamerakalibrierung; Multi-Kamera-System; Qualitätsmaß

References

1. Z. Liu, G. Zhang, Z. Wei, and J. Sun, "A global calibration method for multiple vision sensors based on multiple targets," Measurement Science and Technology, vol. 22, no. 12, p. 125102, 2011.

2. Z. Li and V. Willert, "Eye-to-eye calibration for cameras with disjoint fields of view," in IEEE International Conference on Intelligent Transportation Systems (ITSC), 2018.

3. M. Kaess and F. Dellaert, "Probabilistic structure matching for visual SLAM with a multi-camera rig," Computer Vision and Image Understanding, vol. 114, no. 2, pp. 286–296, 2010.

4. E. Altuğ, J. P. Ostrowski, and C. J. Taylor, "Control of a quadrotor helicopter using dual camera visual feedback," The International Journal of Robotics Research, vol. 24, no. 5, pp. 329–341, 2005.

5. G. H. Lee, F. Fraundorfer, and M. Pollefeys, "Structureless pose-graph loop-closure with a multi-camera system on a self-driving car," in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2013, pp. 564–571.

6. Y. Suzuki, M. Koyamaishi, T. Yendo, T. Fujii, and M. Tanimoto, "Parking assistance using multi-camera infrastructure," in Proceedings of the IEEE Intelligent Vehicles Symposium, 2005, pp. 106–111.

7. B. Petit, J.-D. Lesage, C. Menier, J. Allard, J.-S. Franco, B. Raffin, E. Boyer, and F. Faure, "Multicamera real-time 3D modeling for telepresence and remote collaboration," International Journal of Digital Multimedia Broadcasting, vol. 2010, 2010.

8. S. Nair, G. Panin, M. Wojtczyk, C. Lenz, T. Friedelhuber, and A. Knoll, "A multi-camera person tracking system for robotic applications in virtual reality TV studio," in Proceedings of the 17th IEEE/RSJ International Conference on Intelligent Robots and Systems, 2008.

9. T. Strauß, J. Ziegler, and J. Beck, "Calibrating multiple cameras with non-overlapping views using coded checkerboard targets," in 2014 IEEE 17th International Conference on Intelligent Transportation Systems (ITSC), 2014, pp. 2623–2628.

10. R. Xia, M. Hu, J. Zhao, S. Chen, Y. Chen, and S. Fu, "Global calibration of non-overlapping cameras: state of the art," Optik-International Journal for Light and Electron Optics, vol. 158, pp. 951–961, 2018.

11. J. Wang, L. Wu, M. Q.-H. Meng, and H. Ren, "Towards simultaneous coordinate calibrations for cooperative multiple robots," in 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), 2014, pp. 410–415.

12. P. Wunsch, S. Winkler, and G. Hirzinger, "Real-time pose estimation of 3D objects from camera images using neural networks," in Proceedings of the 1997 IEEE International Conference on Robotics and Automation, vol. 4, 1997, pp. 3232–3237.

About the article

Zaijuan Li

Zaijuan Li received the M.Sc. degree in Electromechanical Engineering with a major in Robotics from Harbin University of Science and Technology, Harbin, China, in 2012. She is currently working toward the Dr.-Ing. degree in the area of computer vision with the Control Methods and Robotics Laboratory, TU Darmstadt, Darmstadt, Germany. Her main research interests are in the fields of multi-camera calibration, cooperative mobile vision systems, and multi-robot localization.

Volker Willert

Volker Willert received the Dipl.-Ing. degree in electrical engineering and information technology and the Dr.-Ing. degree in control theory and robotics, with a focus on dynamical computer vision, from TU Darmstadt, Darmstadt, Germany, in 2002 and 2006, respectively. From 2005 to 2009, he was a Senior Scientist at Honda Research Institute Europe GmbH. Since July 2009, he has been with the Chair of the Control Methods and Robotics Laboratory, TU Darmstadt, and heads the research group Machine Vision and Autonomous Systems. His main research interests are in the fields of machine intelligence, computer vision, distributed controls, and machine learning for mobile robotics, multiagent systems, and driver assistance systems.


Received: 2019-03-14

Accepted: 2019-05-06

Published Online: 2019-06-18

Published in Print: 2019-07-26


Citation Information: tm - Technisches Messen, Volume 86, Issue 7-8, Pages 433–442, ISSN (Online) 2196-7113, ISSN (Print) 0171-8096, DOI: https://doi.org/10.1515/teme-2019-0030.


© 2019 Walter de Gruyter GmbH, Berlin/Boston.
