Article title

Low-cost navigation and guidance systems for Unmanned Aerial Vehicles. Part 1: Vision-based and integrated sensors

Languages of publication
EN
Abstracts
EN
In this paper we present a new low-cost navigation system designed for small-size Unmanned Aerial Vehicles (UAVs), based on Vision-Based Navigation (VBN) and other avionics sensors. The main objective of our research was to design a compact, light and relatively inexpensive system capable of providing the Required Navigation Performance (RNP) in all phases of flight of a small UAV, with a special focus on precision approach and landing, where VBN techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN were compared and the Appearance-Based Approach (ABA) was selected for implementation. Feature extraction and optical flow techniques were employed to estimate flight parameters such as roll angle, pitch angle, deviation from the runway and body rates. Additionally, we addressed the possible synergies between VBN, Global Navigation Satellite System (GNSS) and MEMS-IMU (Micro-Electromechanical System Inertial Measurement Unit) sensors, as well as the aiding provided by Aircraft Dynamics Models (ADMs). In particular, by employing these sensors/models, we aimed to compensate for the shortcomings of VBN and MEMS-IMU sensors in high-dynamics attitude determination tasks. An Extended Kalman Filter (EKF) was developed to fuse the information provided by the different sensors and to provide real-time estimates of position, velocity and attitude of the UAV platform. Two different integrated navigation system architectures were implemented. The first used VBN at 20 Hz and GPS at 1 Hz to augment the MEMS-IMU running at 100 Hz. The second also included the ADM (computations performed at 100 Hz) to augment the attitude channel. These two modes were simulated over a significant portion of the AEROSONDE UAV operational flight envelope, performing a variety of representative manoeuvres (e.g., straight climb, level turning, turning descent and climb, straight descent, etc.). Simulation of the first integrated navigation system architecture (VBN/IMU/GPS) showed that the integrated system can reach position, velocity and attitude accuracies compatible with CAT-II precision approach requirements. Simulation of the second architecture (VBN/IMU/GPS/ADM) also showed promising results, since the attitude accuracy achieved with ADM/VBN/IMU was higher than with VBN/IMU only. However, due to rapid divergence of the ADM virtual sensor, frequent re-initialisation of the ADM data module was required, the frequency of which depended strongly on the UAV flight dynamics and the specific manoeuvring transitions performed.
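The multi-rate fusion scheme described above (MEMS-IMU propagation at 100 Hz, VBN updates at 20 Hz, GPS updates at 1 Hz, combined in an EKF) can be illustrated with a minimal sketch. The code below is not the authors' implementation: the nine-state layout (position, velocity, Euler attitude), the simplified kinematic propagation, the noise covariances and the zero-valued placeholder measurements are all illustrative assumptions, intended only to show how the slower VBN and GPS observations are folded into the IMU-rate prediction loop.

import numpy as np

class MultiRateEKF:
    def __init__(self):
        # State: position (3), velocity (3), Euler attitude (3); values assumed.
        self.x = np.zeros(9)
        self.P = np.eye(9) * 0.1
        self.Q = np.eye(9) * 1e-4          # process noise (illustrative value)

    def predict(self, accel_nav, gyro_rates, dt):
        # IMU-driven propagation at 100 Hz (simplified kinematics).
        F = np.eye(9)
        F[0:3, 3:6] = np.eye(3) * dt       # position integrates velocity
        self.x = F @ self.x
        self.x[3:6] += accel_nav * dt      # velocity integrates nav-frame acceleration
        self.x[6:9] += gyro_rates * dt     # attitude integrates body rates (small-angle)
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, H, R):
        # Standard Kalman measurement update for a linear observation model.
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(9) - K @ H) @ self.P

ekf = MultiRateEKF()
H_gps = np.hstack([np.eye(6), np.zeros((6, 3))])   # GPS observes position and velocity
H_vbn = np.hstack([np.zeros((3, 6)), np.eye(3)])   # VBN observes roll, pitch, yaw

for k in range(1000):                      # 10 s of flight at the 100 Hz IMU rate
    ekf.predict(accel_nav=np.zeros(3), gyro_rates=np.zeros(3), dt=0.01)  # placeholder IMU data
    if k % 5 == 0:                         # 20 Hz vision-based attitude fix
        ekf.update(z=np.zeros(3), H=H_vbn, R=np.eye(3) * 1e-3)
    if k % 100 == 0:                       # 1 Hz GPS position/velocity fix
        ekf.update(z=np.zeros(6), H=H_gps, R=np.eye(6) * 1e-2)

In the second architecture described in the abstract, the ADM would contribute an additional attitude observation at the IMU rate, with periodic re-initialisation to bound the divergence of this virtual sensor.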
Pages
71–98
Physical description
Bibliography: 13 items, figures, tables
Authors
author
  • Cranfield University, Department of Aerospace Engineering, UK
author
  • Cranfield University, Department of Aerospace Engineering, UK
author
  • Cranfield University, Department of Aerospace Engineering, UK
author
  • Cranfield University, Department of Aerospace Engineering, UK
author
  • Cranfield University, Department of Aerospace Engineering, UK
author
  • Cranfield University, Department of Aerospace Engineering, UK
author
  • Cranfield University, Department of Aerospace Engineering, UK
Bibliography
  • [1] Blanc G., Mezouar Y., Martinet P., Indoor navigation of a wheeled mobile robot along visual routes, Proceedings of the International Conference on Robotics & Automation, 2005, pp. 3354-3359.
  • [2] CAA Safety Regulation Group Paper 2003/09, GPS Integrity and Potential Impact on Aviation Safety, 2003.
  • [3] Chen Z., Birchfield S. T., Qualitative Vision-Based path following, IEEE Trans. on Robotics, June 2009, Vol. 25, issue 3, pp. 749-754.
  • [4] Courbon J., Mezouar Y., Guenard N., Martinet P., Vision-Based navigation of unmanned aerial vehicles, Control Engineering Practice, July 2010, Vol. 18, issue 7, pp. 789-799.
  • [5] Courbon J., Mezouar Y., Guenard N., Martinet P., Visual navigation of a quadrotor aerial vehicle, Proceedings of the 2009 IEEE/RSJ Conference on Intelligent Robots and Systems, Oct. 2009, pp. 5315-5320.
  • [6] Cui P., Yue F., Stereo Vision-Based autonomous navigation for lunar rovers, Aircraft Engineering and Aerospace Technology: An International Journal, 2007, Vol. 79, No. 4, pp. 398-405.
  • [7] Desouza G. N., Kak A. C., Vision for mobile robot navigation: a survey, IEEE Trans. Pattern Analysis and Machine Intelligence, Feb. 2002, Vol. 24, issue 2, pp. 237-267.
  • [8] Ding W., Wang J., Precise Velocity Estimation with a Stand-Alone GPS receiver, University of New South Wales, Journal of the Institute of Navigation, USA, 2011.
  • [9] Dusha D., Mejias L., Walker R., Fixed-wing attitude estimation using temporal tracking of the horizon and optical flow, Journal of Field Robotics, 2011, Vol. 28, No. 3, pp. 355-372.
  • [10] Godha S., Performance Evaluation of Low Cost MEMS-Based IMU Integrated with GPS for Land Vehicle Navigation Application, UCGE Report, 2006, No. 20239, University of Calgary, Department of Geomatics Engineering, Alberta, Canada.
  • [11] ICAO, Annex 10 to the Convention on International Civil Aviation, Aeronautical Telecommunications, Volume 1: Radio Navigation Aids, 6th ed., July 2006.
  • [12] Matsumoto Y., Sakai K., Inaba M., Inoue H., View-based approach to robot navigation, Proceedings of the 2000 IEEE/RSJ Conference on Intelligent Robots and Systems, Nov. 2000, Vol. 3, pp. 1702-1708.
  • [13] Olivares-Mendez M. A., Mondragon I. F., Campoy P., Martinez C., Fuzzy controller for UAV-landing task using 3D position visual estimation, Proceedings of IEEE International Conference on Fuzzy Systems, 2010.
Document type
YADDA identifier
bwmeta1.element.baztech-f6a314f5-eb45-4b5c-93d3-c6476bcae396