
Inertial and Vision based Navigation and Perception for small UAVs / Din, Ahmad. - STAMPA. - (2013).

Inertial and Vision based Navigation and Perception for small UAVs

DIN, AHMAD
2013

Abstract

The focus of this thesis is the development of a navigation and perception system for small UAVs using low-cost inertial and vision sensors, for situations where external positioning information such as GPS is degraded or totally unavailable, as in isolated environments such as valleys, tunnels, mines, or indoor areas. Two important areas related to robot navigation and perception are considered: (1) a vision- and inertial-based navigation system using low-cost cameras and inertial sensors in GPS-denied situations, and (2) simultaneous interpretation of the environment. The robot's poses are estimated by visual odometry (VO), which computes the ego-motion by detecting, tracking, and matching features across a sequence of images and/or point clouds. Due to uncertainties and ambiguities in the features and 3D points, outliers add noise to the estimated ego-motion. Because visual odometry estimates only the relative position of the robot, computing the current position with respect to the starting point requires concatenating the ego-motion estimates of all previous frames, so drift accumulates. Inertial sensor measurements are therefore fused with VO to improve accuracy and limit drift. A 3D spatial map is generated from these poses, which are estimated by VO and optimized by a back-end library.

Perception is essential for a robot to interact with a complex environment and operate in it safely. The aim is to extract semantic information from the 3D map in order to build an object or semantic map. Semantic mapping is the process of labeling or tagging entities in the environment, such as objects, coarser classes, and/or regions of the map, in a way that is meaningful to humans and other robots. The spatial map is divided semantically, and each partial map is segmented. Features are extracted from each segment, and a classifier assigns labels to the objects and classes in the segment.
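The drift behaviour described above can be sketched with a minimal example. This is not code from the thesis: it assumes a planar SE(2) motion model for brevity (the thesis works in 3D), and the function names and the per-frame rotation bias are illustrative. Composing frame-to-frame VO estimates, each carrying a small error, makes the absolute pose error grow with trajectory length:

```python
import numpy as np

def se2_to_matrix(dx, dy, dtheta):
    """Homogeneous 2D transform for one frame-to-frame ego-motion estimate."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    return np.array([[c, -s, dx],
                     [s,  c, dy],
                     [0.0, 0.0, 1.0]])

def concatenate_poses(relative_motions):
    """Compose relative VO estimates into absolute poses w.r.t. the start."""
    pose = np.eye(3)
    trajectory = [pose]
    for dx, dy, dtheta in relative_motions:
        pose = pose @ se2_to_matrix(dx, dy, dtheta)
        trajectory.append(pose)
    return trajectory

# Each frame the UAV moves 1 m forward; a small rotation bias of
# 0.01 rad per frame stands in for the noise a VO front-end leaves
# in every relative estimate (an assumed, illustrative value).
biased = [(1.0, 0.0, 0.01) for _ in range(100)]
ideal = [(1.0, 0.0, 0.0) for _ in range(100)]

end_biased = concatenate_poses(biased)[-1][:2, 2]
end_ideal = concatenate_poses(ideal)[-1][:2, 2]
drift = np.linalg.norm(end_biased - end_ideal)
```

After only 100 frames the tiny per-frame bias has bent the estimated path tens of metres away from the true straight-line trajectory, which is why the thesis fuses inertial measurements and applies back-end optimization to bound the error.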
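The per-segment labeling step can likewise be sketched in miniature. None of this is from the thesis: the three-value descriptor (mean height, planarity, point density), the class names, the training values, and the nearest-centroid classifier are all assumptions standing in for whatever features and classifier the thesis actually uses:

```python
import numpy as np

# Hypothetical per-segment descriptors: (mean height, planarity, density).
train_features = np.array([
    [0.05, 0.95, 0.8],   # floor
    [0.02, 0.90, 0.7],   # floor
    [1.40, 0.85, 0.6],   # wall
    [1.50, 0.80, 0.5],   # wall
    [0.75, 0.20, 0.9],   # object
    [0.70, 0.30, 0.8],   # object
])
train_labels = np.array([0, 0, 1, 1, 2, 2])
class_names = ["floor", "wall", "object"]

def classify_segment(descriptor, features, labels):
    """Assign a map segment the label of the nearest class centroid."""
    centroids = np.array([features[labels == c].mean(axis=0)
                          for c in np.unique(labels)])
    return int(np.argmin(np.linalg.norm(centroids - descriptor, axis=1)))

# Descriptor extracted from a new partial-map segment (illustrative values).
segment = np.array([1.45, 0.82, 0.55])
label = class_names[classify_segment(segment, train_features, train_labels)]
```

The shape is the same as in the abstract: segment the partial map, extract a feature vector per segment, and let a trained classifier assign each segment a human-meaningful label.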
Files for this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2506286
Warning: the displayed data have not been validated by the university.