
Opto-Electronics Review

Editor-in-Chief: Leszek Jaroszewicz

Open Access
Volume 21, Issue 1


Wide-angle vision for road views

F. Huang
  • Computer Science and Information Engineering, National Ilan University, 1, Sec. 1, Shen-lung Road, Yi-Lan, 260, Taiwan, R.O.C.
K. Fehrs / G. Hartmann / R. Klette
Published Online: 2013-01-05 | DOI: https://doi.org/10.2478/s11772-013-0079-5


The field-of-view of a wide-angle image exceeds (say) 90 degrees, so such an image contains more information than a standard image. A wide field-of-view is advantageous over standard input for understanding the geometry of 3D scenes and for estimating the poses of panoramic sensors within such scenes. Wide-angle imaging sensors and methodologies are therefore commonly used in road-safety, street-surveillance, virtual street-touring, and street 3D-modelling applications. This paper reviews related wide-angle vision technologies, focusing on mathematical issues rather than on hardware.
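As a brief illustration of why fields of view approaching or exceeding 90 degrees call for non-pinhole models (a theme of the fisheye references below): under the pinhole model, a ray at angle θ from the optical axis lands at image radius r = f·tan θ, which diverges as θ approaches 90°, whereas the classical equidistant fisheye model maps it to r = f·θ, which stays bounded for any θ. The following Python sketch is not from the paper; the function names are illustrative, and only the two standard projection formulas are assumed:

```python
import math

def pinhole_radius(f, theta):
    """Image radius of a ray at angle theta (radians) off the optical
    axis under the pinhole (perspective) model: r = f * tan(theta)."""
    return f * math.tan(theta)

def equidistant_radius(f, theta):
    """Image radius under the equidistant fisheye model: r = f * theta."""
    return f * theta

if __name__ == "__main__":
    f = 1.0  # focal length in arbitrary units
    for deg in (30, 60, 85, 89):
        theta = math.radians(deg)
        # The pinhole radius blows up near 90 deg; the fisheye radius does not.
        print(f"{deg:3d} deg:  pinhole r = {pinhole_radius(f, theta):9.3f}"
              f"   equidistant r = {equidistant_radius(f, theta):6.3f}")
```

Running this shows the pinhole image radius growing without bound as the ray angle nears 90°, while the equidistant radius grows only linearly — which is why true wide-angle (let alone 180°) views cannot be represented on a single perspective image plane.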

Keywords: panoramic views; fisheye lenses; wide-angle vision; stereo vision; road views; driver-assistance systems

  • [1] F. Huang, R. Klette, and K. Scheibe, Panoramic Imaging: Sensor-Line Cameras and Laser Range-Finders, Wiley, Chichester, 2008. http://dx.doi.org/10.1002/9780470998267

  • [2] K. Daniilidis and R. Klette, Imaging Beyond the Pinhole Camera, Springer, New York, 2007.

  • [3] S. Nayar, “Catadioptric omnidirectional camera”, Proc. Conf. Comput. Vision Pattern Recogn., pp. 482–488, San Juan, Puerto Rico, 1997.

  • [4] M. Brown and D.G. Lowe, “Automatic panoramic image stitching using invariant features”, Int. J. Comput. Vision 74, 59–73 (2007).

  • [5] S. Peleg, “Panoramic mosaics by manifold projection”, Proc. Conf. Comput. Vision Pattern Recogn., pp. 338–343, San Juan, Puerto Rico, 1997.

  • [6] R. Szeliski, “Image alignment and stitching: A tutorial”, Technical Report MSR-TR-2004-92, Microsoft Research, 2004.

  • [7] R. Szeliski and H.-Y. Shum, “Creating full view panoramic image mosaics and texture-mapped models”, Proc. SIGGRAPH, ACM Press, pp. 251–258, Los Angeles, 1997.

  • [8] Y.-C. Liu, K.-Y. Lin, and Y.-S. Chen, “Bird’s-eye view vision system for vehicle surrounding monitoring”, Proc. Robot Vision, LNCS 4931, pp. 207–218, Springer, Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-78157-8_16

  • [9] T. Ehlgen, T. Pajdla, and D. Ammon, “Eliminating blind spots for assisted driving”, IEEE Trans. Intell. Transp. Syst. 9(4), 657–665 (2008). http://dx.doi.org/10.1109/TITS.2008.2006815

  • [10] S. Gehrig, C. Rabe, and L. Krueger, “6D vision goes fisheye for intersection assistance”, Proc. Canadian Conf. Comput. Robot Vision, pp. 34–41, Windsor, 2008.

  • [11] M. Pollefeys, D. Nister, J.-M. Frahm, A. Akbarzadeh, P. Mordohai, B. Clipp, C. Engels, D. Gallup, S.-J. Kim, P. Merrell, C. Salmi, S. Sinha, B. Talton, L. Wang, Q. Yang, H. Stewenius, R. Yang, G. Welch, and H. Towles, “Detailed real-time urban 3D reconstruction from video”, Int. J. Comput. Vision 78, 143–167 (2008). http://dx.doi.org/10.1007/s11263-007-0086-4

  • [12] G. Hartmann and R. Klette, “Cylinder sweep: Fisheye images into a bird’s-eye view”, Technical Report MI-tech-TR 69, The University of Auckland, New Zealand, 2011.

  • [13] S.-B. Kang, R. Szeliski, and M. Uyttendaele, “Seamless stitching using multi-perspective plane sweep”, Technical Report MSR-TR-2001-48, Microsoft Research, 2001.

  • [14] F. Huang and R. Klette, “Stereo panorama acquisition and automatic image disparity adjustment for stereoscopic visualization”, Multimed. Tools Appl. 47, 353–377 (2010). http://dx.doi.org/10.1007/s11042-009-0328-2

  • [15] C. Frueh, S. Jain, and A. Zakhor, “Data processing algorithms for generating textured 3D building facade meshes from laser scans and camera images”, Int. J. Comput. Vision 61(2), 159–184 (2005). http://dx.doi.org/10.1023/B:VISI.0000043756.03810.dd

  • [16] M. Fleck, “Perspective projection: The wrong imaging model”, Technical Report, Dept. of Computer Science, University of Iowa, 1995.

  • [17] J. Kumler and M. Bauer, “Fisheye lens designs and their relative performance”, Proc. SPIE 4093, pp. 360–369 (2000). http://dx.doi.org/10.1117/12.405226

  • [18] K. Miyamoto, “Fish-eye lens”, J. Opt. Soc. Amer. 54, 1060–1061 (1964). http://dx.doi.org/10.1364/JOSA.54.001060

  • [19] H. Bakstein and T. Pajdla, “Panoramic mosaicing with 180° field of view lens”, Proc. IEEE Workshop Omnidirectional Vision, pp. 60–67, Copenhagen, 2002.

  • [20] J. Kannala and S.S. Brandt, “A generic camera model and calibration method for conventional, wide-angle, and fisheye lenses”, IEEE Trans. Pattern Anal. Machine Intell. 28, 1335–1340 (2006). http://dx.doi.org/10.1109/TPAMI.2006.153

  • [21] D. Scaramuzza, A. Martinelli, and R. Siegwart, “A toolbox for easily calibrating omnidirectional cameras”, Proc. IEEE/RSJ Int. Conf. Intell. Robots Systems, pp. 5695–5701, Beijing, 2006.

  • [22] H. Ishiguro, M. Yamamoto, and S. Tsuji, “Omni-directional stereo”, IEEE Trans. Pattern Anal. Machine Intell. 14, 257–262 (1992). http://dx.doi.org/10.1109/34.121792

  • [23] Y. Li, H.-Y. Shum, C.-K. Tang, and R. Szeliski, “Stereo reconstruction from multiperspective panoramas”, IEEE Trans. Pattern Anal. Machine Intell. 26, 45–62 (2004). http://dx.doi.org/10.1109/TPAMI.2004.1261078

  • [24] D. Murray, “Recovering range using virtual multicamera stereo”, Comput. Vision Image Understanding 61, 285–291 (1995). http://dx.doi.org/10.1006/cviu.1995.1021

  • [25] S. Peleg and M. Ben-Ezra, “Stereo panorama with a single camera”, Proc. Conf. Comput. Vision Pattern Recogn., pp. 395–401, Fort Collins, 1999.

  • [26] F. Huang, A. Torii, and R. Klette, “Geometries of panoramic images and 3D vision”, Machine Graphics & Vision 9, 463–477 (2010).

  • [27] J.-Y. Bouguet, “Camera calibration toolbox for MATLAB”, http://www.vision.caltech.edu/bouguetj/calib_doc/, 2010.

  • [28] T.-H. Ho, C.C. Davis, and S.D. Milner, “Using geometric constraints for fisheye camera calibration”, Proc. IEEE Workshop Omnidirectional Vision, pp. 17–21, Beijing, 2005.

  • [29] J.-Y. Bouguet, “Visual methods for three-dimensional modelling”, PhD thesis, California Institute of Technology, 1999.

  • [30] S. Li, “Binocular spherical stereo”, IEEE Trans. Intell. Transp. Syst. 9, 589–600 (2008). http://dx.doi.org/10.1109/TITS.2008.2006736

  • [31] S.E. Chen, “QuickTime VR — An image-based approach to virtual environment navigation”, Proc. SIGGRAPH, pp. 29–37, Los Angeles, 1995.

  • [32] S.B. Kang and P. Desikan, “Virtual navigation of complex scenes using clusters of cylindrical panoramic images”, Proc. Graphics Interface, pp. 223–232, Vancouver, 1998.

  • [33] S.B. Kang and R. Szeliski, “3-D scene data recovery using omnidirectional multibaseline stereo”, Int. J. Comput. Vision 25, 167–183 (1997). http://dx.doi.org/10.1023/A:1007971901577

  • [34] H. Li, R.I. Hartley, and J.H. Kim, “A linear approach to motion estimation using generalized camera models”, Proc. IEEE Comput. Society Conf. Comput. Vision Pattern Recogn., pp. 1–8, Anchorage, 2008.

About the article

Published in Print: 2013-03-01

Citation Information: Opto-Electronics Review, Volume 21, Issue 1, Pages 1–22, ISSN (Online) 1896-3757, DOI: https://doi.org/10.2478/s11772-013-0079-5.

© 2013 SEP, Warsaw. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License (CC BY-NC-ND 3.0).
