A Real-time Absolute Position Estimation Architecture for Autonomous Aerial Robots using Artificial Neural Networks / Hussain, Moazzam. - (2014).

A Real-time Absolute Position Estimation Architecture for Autonomous Aerial Robots using Artificial Neural Networks

HUSSAIN, MOAZZAM
2014

Abstract

The civil applications of Unmanned Aerial Vehicle (UAV) technology are constantly on the rise, and safety rules for the operation of UAVs in populated areas are being drafted. UAV technology is an active area of academic research due to the challenges related to aerodynamics, tight power and payload budgets, multi-sensor information fusion, reactive real-time path planning, perception, and communication bandwidth requirements. Autonomous navigation is a complex problem because of the algorithmic complexity of the required methods and their real-time implementation. Challenges such as long-term GPS errors, outages, and jamming, together with the exponential error growth of inertial sensors, increase the complexity of autonomous navigation to the extent that a high level of redundancy is mandatory in the design of navigation systems. Typical UAV systems use multi-sensor (GPS + INS + Vision) data fusion coupled with responsive sensors, innovative navigation algorithms, computationally capable onboard computers, and reactive electromechanical systems to meet the navigational needs of safe operation in urban environments. Machine learning is a very promising technology with broad applicability to many real-life problems, ranging from hand-held and wearable computers to intelligent cars and homes, and it can be used effectively in the autonomous navigation of UAVs. This work presents a novel absolute position estimation solution that leverages a Radial Basis Function (RBF) classifier for robust aerial image registration. The proposed solution covers the entire spectrum of the problem, involving algorithm design, hardware architecture, and real-time hardware implementation. The system relies on a single passive imaging source for the acquisition of aerial images. The sensed image is geometrically transformed to bring it into a common viewpoint with the reference satellite image. The orthorectified aerial image is then learned by the RBF network, and a full search is performed in the Region of Interest (ROI) of the reference satellite image. The real-time implementation of the computationally intensive algorithm is accomplished by designing a customized wide data path in a Field Programmable Gate Array (FPGA). The proposed architecture offers a reliable, drift-free position estimation solution by combining information from the inertial sensors with geo-registration of the aerial images over a geodetically aligned satellite reference image. We compare the robustness of our proposed matching algorithm with standard normalized area correlation techniques and present the limitations and False Acceptance Rates (FAR) of the two algorithms. This analysis has been performed on a set of real aerial and satellite imagery acquired under different lighting and weather conditions. This is followed by a discussion of the real-time FPGA-based architecture and a power analysis. We conclude by presenting future directions of the work.

Keywords: Inertial Measurement Units, Vision based Navigation, Real-time implementation, FPGA, Neural Network
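
The full text of the thesis is not attached to this record, so the matching pipeline can only be illustrated, not reproduced. Below is a minimal sketch, assuming greyscale image arrays, of the standard zero-mean normalized area correlation full search that the abstract uses as a baseline: an orthorectified aerial patch is slid over the Region of Interest of the geodetically aligned satellite reference image, and the offset with the highest correlation peak is returned. The function names (zncc, match_patch_ncc) and the NumPy-based formulation are illustrative assumptions, not the thesis code; the thesis implements its search as a customized wide data path on an FPGA and compares it against an RBF-network matcher.

import numpy as np

def zncc(patch, window):
    # Zero-mean normalized cross-correlation of two equally sized greyscale arrays.
    p = patch.astype(np.float64) - patch.mean()
    w = window.astype(np.float64) - window.mean()
    denom = np.sqrt((p * p).sum() * (w * w).sum())
    return 0.0 if denom == 0.0 else float((p * w).sum() / denom)

def match_patch_ncc(patch, roi):
    # Full search: evaluate every candidate offset of the aerial patch inside the
    # ROI of the satellite reference image and keep the best-scoring one.
    ph, pw = patch.shape
    rh, rw = roi.shape
    best_score, best_offset = -1.0, (0, 0)
    for y in range(rh - ph + 1):
        for x in range(rw - pw + 1):
            score = zncc(patch, roi[y:y + ph, x:x + pw])
            if score > best_score:
                best_score, best_offset = score, (y, x)
    return best_offset, best_score

# Hypothetical usage: a 64x64 orthorectified aerial patch against a 256x256 satellite ROI.
rng = np.random.default_rng(0)
roi = rng.random((256, 256))
patch = roi[100:164, 80:144].copy()
offset, score = match_patch_ncc(patch, roi)
print(offset, round(score, 3))  # expected: (100, 80) with a score close to 1.0

A weak or ambiguous correlation peak can be rejected to control false matches, which is the kind of False Acceptance Rate trade-off the abstract evaluates for both the correlation baseline and the proposed RBF-based matcher.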

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2542487