Article (Scientific journals)
GraspLDM: Generative 6-DoF Grasp Synthesis Using Latent Diffusion Models
BARAD, Kuldeep Rambhai; ORSULA, Andrej; RICHARD, Antoine et al.
2024. In IEEE Access, 12, pp. 164621-164633
Peer reviewed, verified by ORBi
 

Documents


Full text
2312.11243v2.pdf
Author postprint (7.25 MB), Creative Commons License - Attribution

All documents in ORBilu are protected by a usage license.


Details



Keywords:
diffusion models; Generative modeling; grasp synthesis; robotic grasping; Autonomous robotics; Diffusion model; Generative model; Grasp synthesis; Grasping of unknown object; Point-clouds; Real-world; Robotic grasping; Unstructured environments; Vision-based grasping; Computer Science (all); Materials Science (all); Engineering (all); Point cloud compression; Decoding; Noise reduction; 6-DOF; Grasping; Visualization; Training; Grippers; Data models; Computer Science - Robotics
Abstract:
Vision-based grasping of unknown objects in unstructured environments is a key challenge for autonomous robotic manipulation. A practical grasp synthesis system is required to generate a diverse set of 6-DoF grasps from which a task-relevant grasp can be executed. Although generative models are suitable for learning such complex data distributions, existing models have limitations in grasp quality, long training times, and a lack of flexibility for task-specific generation. In this work, we present GraspLDM, a modular generative framework for 6-DoF grasp synthesis that uses diffusion models as priors in the latent space of a VAE. GraspLDM learns a generative model of object-centric SE(3) grasp poses conditioned on point clouds. GraspLDM's architecture enables us to train task-specific models efficiently by only re-training a small denoising network in the low-dimensional latent space, as opposed to existing models that need expensive re-training. Our framework provides robust and scalable models on both full and partial point clouds. GraspLDM models trained with simulation data transfer well to the real world without any further fine-tuning. Our models provide an 80% success rate for 80 grasp attempts of diverse test objects across two real-world robotic setups.
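The abstract describes the core architecture: a VAE that encodes object-centric SE(3) grasp poses into a low-dimensional latent space conditioned on an object point cloud, with a diffusion model acting as a prior over that latent space, so that task-specific variants only re-train the small latent denoiser. The following is a minimal illustrative sketch of that structure in PyTorch. It is not the authors' implementation (the released code is at https://github.com/kuldeepbrd1/graspLDM); all module names, layer sizes, the 7-D translation-plus-quaternion grasp encoding, and the toy noise schedule are assumptions made for illustration only.

# Minimal sketch (not the authors' implementation) of a latent diffusion
# model for grasp synthesis, assuming:
#   - grasps are represented as 7-D vectors (3-D translation + 4-D quaternion)
#   - the point-cloud condition is reduced to a global feature by a small
#     PointNet-style encoder
# All module names and sizes below are illustrative, not taken from GraspLDM.
import torch
import torch.nn as nn


class PointCloudEncoder(nn.Module):
    """Shared MLP + max-pool over points -> global conditioning feature."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(3, 64), nn.ReLU(),
                                 nn.Linear(64, feat_dim))

    def forward(self, pc):                       # pc: (B, N, 3)
        return self.mlp(pc).max(dim=1).values    # (B, feat_dim)


class GraspVAE(nn.Module):
    """Encodes a grasp pose into a low-dimensional latent and decodes it back,
    both conditioned on the point-cloud feature."""
    def __init__(self, grasp_dim=7, cond_dim=128, latent_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(grasp_dim + cond_dim, 128), nn.ReLU(),
                                 nn.Linear(128, 2 * latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim + cond_dim, 128), nn.ReLU(),
                                 nn.Linear(128, grasp_dim))

    def encode(self, grasp, cond):
        mu, logvar = self.enc(torch.cat([grasp, cond], -1)).chunk(2, -1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        return z, mu, logvar

    def decode(self, z, cond):
        return self.dec(torch.cat([z, cond], -1))


class LatentDenoiser(nn.Module):
    """Small network predicting the noise added to a latent at timestep t.
    Task-specific variants would only re-train this module."""
    def __init__(self, latent_dim=16, cond_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim + cond_dim + 1, 128), nn.ReLU(),
                                 nn.Linear(128, latent_dim))

    def forward(self, z_t, t, cond):             # t: (B, 1) normalized timestep
        return self.net(torch.cat([z_t, t, cond], -1))


if __name__ == "__main__":
    B = 4
    pc = torch.randn(B, 1024, 3)                 # object point cloud
    grasp = torch.randn(B, 7)                    # SE(3) grasp as translation + quaternion
    cond = PointCloudEncoder()(pc)
    vae = GraspVAE()
    z, mu, logvar = vae.encode(grasp, cond)
    # Diffusion forward process on the latent, then one denoising prediction.
    t = torch.rand(B, 1)
    noise = torch.randn_like(z)
    z_t = (1 - t).sqrt() * z + t.sqrt() * noise  # toy noise schedule for illustration
    eps_hat = LatentDenoiser()(z_t, t, cond)
    recon = vae.decode(z, cond)
    print(recon.shape, eps_hat.shape)            # torch.Size([4, 7]) torch.Size([4, 16])

The design point this sketch tries to reflect is that the denoising network operates on a small latent vector rather than on raw grasp poses, so adapting to a new task only requires re-training that module while the grasp VAE and point-cloud encoder can stay frozen.
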
Research center:
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SpaceR – Space Robotics
Disciplines:
Computer science
Author, co-author:
BARAD, Kuldeep Rambhai ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics ; Redwire Space Europe, Luxembourg City, Luxembourg
ORSULA, Andrej ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
RICHARD, Antoine ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
DENTLER, Jan Eric ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > Automation ; Redwire Space Europe, Luxembourg City, Luxembourg
OLIVARES MENDEZ, Miguel Angel ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
MARTINEZ LUNA, Carol ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
External co-authors:
yes
Document language:
English
Title:
GraspLDM: Generative 6-DoF Grasp Synthesis Using Latent Diffusion Models
Publication date:
2024
Journal:
IEEE Access
ISSN:
2169-3536
Publisher:
Institute of Electrical and Electronics Engineers Inc.
Volume:
12
Pages:
164621-164633
Peer reviewed:
Peer reviewed, verified by ORBi
Focus area:
Computational Sciences
Sustainable Development Goal (SDG):
9. Industry, innovation and infrastructure
FNR project:
FNR15799985 - Modular Vision For Dynamic Grasping Of Unknown Resident Space Objects, 2021 (01/04/2021-15/01/2025) - Kuldeep Rambhai Barad
Research project title:
Modular Vision For Dynamic Grasping Of Unknown Resident Space Objects
Funding body:
Fonds National de la Recherche (FNR) Industrial Fellowship under Redwire Space Europe
Grant number:
15799985
Funding details:
This work was supported in part by the Fonds National de la Recherche (FNR) Industrial Fellowship under Grant 15799985, and in part by Redwire Space Europe. Code and resources are available at: https://github.com/kuldeepbrd1/graspLDM.
Available on ORBilu:
since 26 December 2024

Statistics


Number of views
127 (including 12 from Unilu)
Number of downloads
102 (including 1 from Unilu)

Scopus® citations: 3
Scopus® citations (excluding self-citations): 3
OpenCitations: 0
OpenAlex citations: 12
WoS citations: 3
