Reference : LSPnet: A 2D Localization-oriented Spacecraft Pose Estimation Neural Network
Scientific congresses, symposiums and conference proceedings : Paper published in a journal
Engineering, computing & technology : Computer science
Security, Reliability and Trust
http://hdl.handle.net/10993/47169
LSPnet: A 2D Localization-oriented Spacecraft Pose Estimation Neural Network
English
Garcia Sanchez, Albert [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2]
Mohamed Ali, Mohamed Adel [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2]
Gaudilliere, Vincent [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2]
Ghorbel, Enjie [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2]
Al Ismaeil, Kassem [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2]
Perez, Marcos [LMO > CTO]
Aouada, Djamila [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2]
Jun-2021
Proceedings of Conference on Computer Vision and Pattern Recognition Workshops
Institute of Electrical and Electronics Engineers
2048-2056
Yes
International
2160-7508
2160-7516
Piscataway
NJ
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
from 19-06-2021 to 25-06-2021
[en] Computer Vision ; AI for Space ; Pose Estimation
[en] The capability to estimate the pose of uncooperative objects in space is a key asset for enabling safe close-proximity operations such as space rendezvous, in-orbit servicing and active debris removal. Usual approaches to pose estimation rely either on classical computer vision solutions or on Deep Learning (DL) techniques. This work explores a novel DL-based methodology, using Convolutional Neural Networks (CNNs), for estimating the pose of uncooperative spacecraft. Contrary to other approaches, the proposed CNN directly regresses poses without requiring any prior 3D information. Moreover, bounding boxes of the spacecraft in the image are predicted in a simple yet efficient manner. Experiments show that this work competes with the state of the art in uncooperative spacecraft pose estimation, including methods that require 3D information as well as methods that predict bounding boxes through sophisticated CNNs.
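The abstract describes direct pose regression: the CNN outputs the pose and a 2D bounding box straight from image features, with no 3D model of the target. A minimal NumPy sketch of such a regression head is shown below; all names, dimensions, and the use of a unit quaternion for rotation are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vector produced by a CNN backbone
# (dimension chosen arbitrarily for illustration).
feat_dim = 128
features = rng.standard_normal(feat_dim)

# Two fully connected heads: one directly regresses the 6-DoF pose
# (3 translation components + a 4-D quaternion for rotation), the
# other regresses a 2D bounding box (x, y, w, h) in the image.
W_pose = rng.standard_normal((7, feat_dim)) * 0.01
W_bbox = rng.standard_normal((4, feat_dim)) * 0.01

raw_pose = W_pose @ features
translation = raw_pose[:3]
# Normalize the quaternion so the output is a valid rotation.
quaternion = raw_pose[3:] / np.linalg.norm(raw_pose[3:])

bbox = W_bbox @ features

print("translation:", translation.shape)
print("quaternion norm:", np.linalg.norm(quaternion))
print("bbox:", bbox.shape)
```

In a trained network the weights would be learned end-to-end from annotated images; the point of the sketch is only that pose and bounding box come out as plain regression targets, with no 3D-to-2D correspondence step.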
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Other
This work was funded by the Luxembourg National Research Fund (FNR), under the project reference BRIDGES2020/IS/14755859/MEETA/Aouada, and by LMO (https://www.lmo.space).

File(s) associated to this reference

Fulltext file(s):

AI4Space2021-AlbertGarcia-Orbi.pdf (Author preprint, 3.93 MB): Open access


All documents in ORBilu are protected by a user license.