Reference : Towards incremental autonomy framework for on-orbit vision-based grasping
Scientific congresses, symposiums and conference proceedings : Paper published in a book
Engineering, computing & technology : Aerospace & aeronautics engineering
http://hdl.handle.net/10993/49355
English
Barad, Kuldeep Rambhai [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Space Robotics]
Martinez Luna, Carol [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics >]
Dentler, Jan [Redwire Space Europe > Robotics]
Olivares Mendez, Miguel Angel [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics >]
29-Oct-2021
Proceedings of the International Astronautical Congress, IAC-2021
No
No
International
International Astronautical Congress
24-10-2021 to 29-10-2021
International Astronautical Federation
Dubai
United Arab Emirates
[en] OSAM ; Autonomy ; On-orbit Robotics ; Perception ; Manipulation ; Learning
[en] This work presents a software-oriented autonomy framework that enables the incremental development of high robotic autonomy. The autonomy infrastructure in space applications is often cost-driven and built for a narrow time/complexity domain. In domains like On-orbit Servicing, Assembly and Manufacturing (OSAM), this prevents scalability and generalizability, motivating a more consistent approach for the incremental development of robotic autonomy. For this purpose, the problem of vision-based grasping is described as a building block for high autonomy of dexterous space robots. Subsequently, the need for a framework is highlighted to enable bottom-up development of general autonomy with vision-based grasping as the starting point. The preliminary framework presented here comprises three components. First, an autonomy level classification provides a clear description of the autonomous behavior of the system. Second, the stack abstraction provides a general classification of the development layers. Finally, the generic execution architecture condenses the flow of translating a high-level task description into real-world sense-plan-act routines. Overall, this work lays down foundational elements towards the development of general robotic autonomy for scalability in space application domains like OSAM.
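The generic execution architecture described in the abstract translates a high-level task description into sense-plan-act routines. A minimal sketch of such a loop is shown below for illustration; all class names, the task string, and the perception placeholder are assumptions made here, not the framework's actual API:

```python
# Minimal sense-plan-act sketch for a vision-based grasping task.
# All names and the task description format are illustrative assumptions,
# not the paper's implementation.

from dataclasses import dataclass


@dataclass
class Observation:
    target_visible: bool
    target_pose: tuple  # (x, y, z) of the grasp target, if visible


def sense(step: int) -> Observation:
    # Placeholder perception: the target becomes visible after a few steps.
    return Observation(target_visible=step >= 2, target_pose=(0.5, 0.0, 0.3))


def plan(task: str, obs: Observation) -> str:
    # Translate the high-level task description into a primitive action.
    if not obs.target_visible:
        return "search"
    return "grasp" if task == "grasp_target" else "hold"


def act(action: str) -> None:
    # Stand-in for commanding the manipulator.
    print(f"executing: {action}")


def execute(task: str, max_steps: int = 5) -> list:
    # Repeatedly sense, plan, and act until the task terminates.
    actions = []
    for step in range(max_steps):
        obs = sense(step)
        action = plan(task, obs)
        act(action)
        actions.append(action)
        if action == "grasp":
            break
    return actions


actions = execute("grasp_target")
# actions == ["search", "search", "grasp"]
```

The point of the sketch is the separation of concerns: perception, task-level planning, and actuation are independent layers, so any one of them can be upgraded (e.g. swapping the placeholder `sense` for a learned vision module) without touching the loop itself, which matches the incremental-development theme of the paper.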
FnR ; FNR15799985 > Kuldeep Rambhai Barad > MIS-URSO > Modular Vision For Dynamic Grasping Of Unknown Resident Space Objects > 01/04/2021 > 15/01/2025 > 2021

File(s) associated to this reference

Fulltext file(s):

File: IAC-21,D1,6,7,x65061.pdf (Publisher postprint, 787.23 kB) — Open access


All documents in ORBilu are protected by a user license.