Annual of Navigation

The Journal of Polish Navigational Forum

1 Issue per year

Open Access
Low-Cost Navigation and Guidance Systems for Unmanned Aerial Vehicles — Part 1: Vision-Based and Integrated Sensors

Roberto Sabatini / Celia Bartel / Anish Kaharkar / Tesheen Shaid / Leopoldo Rodriguez / David Zammit-Mangion / Huamin Jia
  • Cranfield University — Department of Aerospace Engineering, UK
Published Online: 2013-07-27 | DOI: https://doi.org/10.2478/v10367-012-0019-3


In this paper we present a new low-cost navigation system designed for small-size Unmanned Aerial Vehicles (UAVs), based on Vision-Based Navigation (VBN) and other avionics sensors. The main objective of our research was to design a compact, light and relatively inexpensive system capable of providing the Required Navigation Performance (RNP) in all phases of flight of a small UAV, with a special focus on precision approach and landing, where VBN techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN were compared and the Appearance-Based Approach (ABA) was selected for implementation. Feature extraction and optical flow techniques were employed to estimate flight parameters such as roll angle, pitch angle, deviation from the runway and body rates. Additionally, we addressed the possible synergies between VBN, Global Navigation Satellite System (GNSS) and MEMS-IMU (Micro-Electromechanical System Inertial Measurement Unit) sensors, as well as aiding from Aircraft Dynamics Models (ADMs). In particular, by employing these sensors/models, we aimed to compensate for the shortcomings of VBN and MEMS-IMU sensors in high-dynamics attitude determination tasks. An Extended Kalman Filter (EKF) was developed to fuse the information provided by the different sensors and to provide estimates of position, velocity and attitude of the UAV platform in real time. Two different integrated navigation system architectures were implemented. The first used VBN at 20 Hz and GPS at 1 Hz to augment the MEMS-IMU running at 100 Hz. The second architecture also included the ADM (computations performed at 100 Hz) to provide augmentation of the attitude channel.
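The multi-rate structure described above (a high-rate inertial prediction corrected by slower VBN and GPS fixes) can be illustrated with a deliberately simplified sketch. This is not the authors' EKF: it is a one-dimensional linear Kalman filter in which a noisy "IMU" acceleration drives a 100 Hz prediction step, while "VBN" (20 Hz) and "GPS" (1 Hz) position fixes trigger corrections at their own rates. All noise levels and the scalar state are illustrative assumptions.

```python
# Toy multi-rate fusion sketch (NOT the paper's implementation): a 1-D
# position/velocity Kalman filter. An IMU acceleration input drives the
# 100 Hz prediction; GPS (1 Hz) and VBN (20 Hz) position fixes trigger
# correction steps. Noise values are assumed for illustration only.
import numpy as np

DT = 0.01  # 100 Hz prediction rate

# State x = [position, velocity]; F, B follow constant-acceleration kinematics.
F = np.array([[1.0, DT], [0.0, 1.0]])
B = np.array([0.5 * DT**2, DT])
H = np.array([[1.0, 0.0]])          # both GPS and VBN observe position
Q = np.diag([1e-6, 1e-4])           # process noise covariance (assumed)
R_GPS, R_VBN = 4.0, 0.25            # measurement variances (assumed)

x = np.zeros(2)
P = np.eye(2)

def predict(accel):
    """100 Hz IMU-driven prediction step."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def correct(z, r):
    """Position-fix correction (GPS or VBN), applied at that sensor's rate."""
    global x, P
    S = H @ P @ H.T + r             # innovation covariance (1x1)
    K = P @ H.T / S                 # Kalman gain (2x1)
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

# Simulate 1 s of flight with true acceleration of 1 m/s^2.
rng = np.random.default_rng(0)
for k in range(100):
    t = k * DT
    true_pos = 0.5 * t**2
    predict(1.0 + rng.normal(0.0, 0.05))           # noisy IMU accel, 100 Hz
    if k % 5 == 0:                                 # 20 Hz VBN fix
        correct(true_pos + rng.normal(0.0, 0.5), R_VBN)
    if k % 100 == 0:                               # 1 Hz GPS fix
        correct(true_pos + rng.normal(0.0, 2.0), R_GPS)
```

The key design point is that each sensor is consumed at its native rate: the prediction loop never waits for a fix, and a correction is simply applied whenever a new measurement arrives.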
Simulation of these two architectures covered a significant portion of the AEROSONDE UAV operational flight envelope and included a variety of representative manoeuvres (e.g., straight climb, level turn, turning descent and climb, straight descent). Simulation of the first integrated navigation system architecture (VBN/IMU/GPS) showed that the integrated system can achieve position, velocity and attitude accuracies compatible with CAT-II precision approach requirements. Simulation of the second architecture (VBN/IMU/GPS/ADM) also showed promising results, since the attitude accuracy achieved with ADM/VBN/IMU was higher than with VBN/IMU only. However, due to the rapid divergence of the ADM virtual sensor, the ADM data module required frequent re-initialisation, at intervals strongly dependent on the UAV flight dynamics and the specific manoeuvring transitions performed.
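The re-initialisation logic described above can be sketched in a few lines. This is an illustrative assumption, not the paper's algorithm: an ADM "virtual sensor" propagates attitude open-loop, drifts over time, and is reset from the fused EKF solution whenever its deviation exceeds a threshold. The drift rate, threshold and reference values are invented for illustration.

```python
# Illustrative ADM re-initialisation sketch (values assumed, not from the
# paper): the open-loop ADM pitch estimate drifts away from the fused EKF
# reference and is re-initialised whenever the deviation exceeds a bound.
DT = 0.01                  # 100 Hz ADM propagation
DRIFT_RATE = 0.5           # deg/s of artificial ADM drift (assumed)
RESET_THRESHOLD = 1.0      # deg: re-initialise beyond this deviation

adm_pitch = 0.0            # ADM-propagated pitch estimate (deg)
fused_pitch = 0.0          # reference pitch from the fused solution (deg)
resets = 0

for step in range(1000):   # 10 s of simulated flight
    adm_pitch += DRIFT_RATE * DT          # open-loop propagation drifts
    if abs(adm_pitch - fused_pitch) > RESET_THRESHOLD:
        adm_pitch = fused_pitch           # re-initialise from fused estimate
        resets += 1
```

Under these assumed numbers the module resets every couple of seconds; the abstract's observation is exactly that this reset interval is not fixed but varies with flight dynamics and manoeuvring transitions.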

Keywords: Vision-Based Navigation; Integrated Navigation System; MEMS Inertial Measurement Unit; Unmanned Aerial Vehicle; Low-cost Navigation Sensors


About the article

Published in Print: 2012-12-01

Citation Information: Annual of Navigation, ISSN (Online) 1640-8632, DOI: https://doi.org/10.2478/v10367-012-0019-3. Export Citation

Citing Articles

Prof Renuganth Varatharajoo, Roberto Sabatini, Francesco Cappello, Subramanian Ramasamy, Alessandro Gardi, and Reece Clothier
Aircraft Engineering and Aerospace Technology, 2015, Volume 87, Number 6, Page 540
