Aerial Systems: Perception and Autonomy; Localization
Abstract:
[en] In this paper we propose a particle filter localization approach for mini aerial robots, based on stereo visual odometry (VO) and semantic information from indoor environments. The prediction stage of the particle filter uses the 3D pose of the aerial robot estimated by the stereo VO algorithm. This predicted 3D pose is then updated using inertial as well as semantic measurements. The algorithm processes semantic measurements in two phases: first, a pre-trained deep learning (DL) based object detector provides real-time object detections in the RGB spectrum; second, from the corresponding 3D point clouds of the detected objects, we segment their dominant horizontal plane and estimate their relative position, also augmenting a prior map with new detections. The augmented map is then used to obtain a drift-free pose estimate of the aerial robot. We validate our approach in several real flight experiments, comparing it against ground truth and a state-of-the-art visual SLAM approach.
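The pipeline summarized above follows the standard predict/update/resample cycle of a particle filter: VO increments drive the prediction, and the relative positions of mapped objects reweight the particles. A minimal sketch of that cycle is given below; the landmark layout, noise levels, and particle count are illustrative assumptions, not the paper's actual parameters, and the semantic detection step is abstracted into a noisy relative-position measurement of a known map object.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, vo_delta, noise_std):
    """Propagate each particle by the stereo-VO pose increment plus Gaussian noise."""
    return particles + vo_delta + rng.normal(0.0, noise_std, particles.shape)

def update(particles, weights, landmark, measured_rel, meas_std):
    """Reweight particles by the likelihood of a measured relative object position."""
    expected_rel = landmark - particles               # relative position each hypothesis expects
    err = np.linalg.norm(expected_rel - measured_rel, axis=1)
    weights = weights * np.exp(-0.5 * (err / meas_std) ** 2)
    weights += 1e-300                                 # guard against all-zero weights
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling: duplicate high-weight particles, drop low-weight ones."""
    n = len(weights)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx], np.full(n, 1.0 / n)

# Toy run: the true pose advances with the VO increments; observing one
# prior-map landmark corrects the accumulated uncertainty.
n = 500
particles = rng.normal(0.0, 0.2, (n, 3))              # 3D position hypotheses
weights = np.full(n, 1.0 / n)

true_pose = np.zeros(3)
landmark = np.array([2.0, 1.0, 0.0])                  # assumed prior-map object position
for _ in range(20):
    vo_delta = np.array([0.1, 0.05, 0.0])             # per-step VO pose increment
    true_pose = true_pose + vo_delta
    particles = predict(particles, vo_delta, noise_std=0.05)
    measured_rel = landmark - true_pose + rng.normal(0.0, 0.02, 3)
    weights = update(particles, weights, landmark, measured_rel, meas_std=0.1)
    particles, weights = resample(particles, weights)

estimate = particles.mean(axis=0)
print(np.round(estimate, 2))
```

Because every update is anchored to fixed map objects rather than chained onto previous estimates, the error does not accumulate over time, which is the sense in which the resulting pose estimate is drift-free.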
Disciplines:
Electrical & electronics engineering
Author, co-author:
BAVLE, Hriday ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Automation
Manthe, Stephan
De La Puente, Paloma
Rodriguez-Ramos, Alejandro
Sampedro, Carlos
Campoy, Pascual
External co-authors:
yes
Document language:
English
Title:
Stereo Visual Odometry and Semantics based Localization of Aerial Robots in Indoor Environments
Publication date:
2018
Event name:
IEEE International Conference on Intelligent Robots and Systems