Article (Scientific journals)
GraspLDM: Generative 6-DoF Grasp Synthesis Using Latent Diffusion Models
BARAD, Kuldeep Rambhai; ORSULA, Andrej; RICHARD, Antoine et al.
2024, in IEEE Access, 12, p. 164621-164633

Files

Full Text: 2312.11243v2.pdf (Author postprint, 7.25 MB, Creative Commons License - Attribution)

Details



Keywords :
Diffusion models; Generative modeling; Grasp synthesis; Robotic grasping; Autonomous robotics; Grasping of unknown objects; Point clouds; Real-world; Unstructured environments; Vision-based grasping; Computer Science (all); Materials Science (all); Engineering (all); Point cloud compression; Decoding; Noise reduction; 6-DoF; Grasping; Visualization; Training; Grippers; Data models; Computer Science - Robotics
Abstract :
[en] Vision-based grasping of unknown objects in unstructured environments is a key challenge for autonomous robotic manipulation. A practical grasp synthesis system must generate a diverse set of 6-DoF grasps from which a task-relevant grasp can be selected and executed. Although generative models are well suited to learning such complex data distributions, existing models suffer from limited grasp quality, long training times, and a lack of flexibility for task-specific generation. In this work, we present GraspLDM, a modular generative framework for 6-DoF grasp synthesis that uses diffusion models as priors in the latent space of a VAE. GraspLDM learns a generative model of object-centric SE(3) grasp poses conditioned on point clouds. Its architecture allows task-specific models to be trained efficiently by re-training only a small denoising network in the low-dimensional latent space, whereas existing models require expensive full re-training. The framework provides robust and scalable models on both full and partial point clouds, and models trained with simulation data transfer to the real world without further fine-tuning. Across two real-world robotic setups, our models achieve an 80% success rate over 80 grasp attempts on diverse test objects.
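The abstract describes the architecture only at a high level. To make the latent-diffusion idea concrete, the following is a minimal illustrative sketch in PyTorch. It is not taken from the authors' released code (https://github.com/kuldeepbrd1/graspLDM); all class names, dimensionalities, the 9-D grasp-pose parameterization (3-D translation plus a 6-D rotation representation), and the simplified denoising loop are assumptions made for illustration. The idea sketched here: a conditional VAE compresses grasp poses into a low-dimensional latent given a point-cloud embedding, a small denoising network acts as a diffusion prior in that latent space, and sampling runs the prior before decoding back to grasp-pose parameters.

# Illustrative sketch only; not the authors' implementation.
import torch
import torch.nn as nn


class GraspVAE(nn.Module):
    """Toy conditional VAE over flattened grasp-pose parameters
    (assumed here to be 9 values: translation + 6-D rotation)."""

    def __init__(self, grasp_dim=9, cond_dim=128, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(grasp_dim + cond_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * latent_dim),   # mean and log-variance
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + cond_dim, 256), nn.ReLU(),
            nn.Linear(256, grasp_dim),
        )

    def encode(self, grasp, cond):
        mu, logvar = self.encoder(torch.cat([grasp, cond], -1)).chunk(2, -1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return z, mu, logvar

    def decode(self, z, cond):
        return self.decoder(torch.cat([z, cond], -1))


class LatentDenoiser(nn.Module):
    """Small denoising network acting as a prior in the VAE latent space;
    a task-specific variant would re-train only this module."""

    def __init__(self, latent_dim=16, cond_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + cond_dim + 1, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, z_t, t, cond):
        # t is a scalar timestep in [0, 1], broadcast over the batch.
        t = t.expand(z_t.shape[0], 1)
        return self.net(torch.cat([z_t, t, cond], -1))


@torch.no_grad()
def sample_grasps(vae, denoiser, cond, steps=50, latent_dim=16):
    """Heavily simplified denoising loop (not a proper DDPM schedule),
    followed by decoding the final latent to grasp-pose parameters."""
    z = torch.randn(cond.shape[0], latent_dim)
    for i in reversed(range(steps)):
        t = torch.tensor([[i / steps]])
        eps_hat = denoiser(z, t, cond)
        z = z - eps_hat / steps               # crude denoising update
        if i > 0:
            z = z + torch.randn_like(z) * (1.0 / steps) ** 0.5
    return vae.decode(z, cond)


if __name__ == "__main__":
    cond = torch.randn(4, 128)                # stand-in point-cloud embedding
    grasps = sample_grasps(GraspVAE(), LatentDenoiser(), cond)
    print(grasps.shape)                       # (4, 9) grasp-pose parameters

The sketch mirrors the modularity claimed in the abstract: a task-specific variant would re-train only the small latent denoiser (here a two-layer MLP) while the VAE encoder and decoder stay fixed.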
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SpaceR – Space Robotics
Disciplines :
Computer science
Author, co-author :
BARAD, Kuldeep Rambhai  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics ; Redwire Space Europe, Luxembourg City, Luxembourg
ORSULA, Andrej  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
RICHARD, Antoine ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
DENTLER, Jan Eric ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > Automation ; Redwire Space Europe, Luxembourg City, Luxembourg
OLIVARES MENDEZ, Miguel Angel ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
MARTINEZ LUNA, Carol  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Space Robotics
External co-authors :
yes
Language :
English
Title :
GraspLDM: Generative 6-DoF Grasp Synthesis Using Latent Diffusion Models
Publication date :
2024
Journal title :
IEEE Access
ISSN :
2169-3536
Publisher :
Institute of Electrical and Electronics Engineers Inc.
Volume :
12
Pages :
164621 - 164633
Peer reviewed :
Peer Reviewed verified by ORBi
Focus Area :
Computational Sciences
Development Goals :
9. Industry, innovation and infrastructure
FnR Project :
FNR15799985 - Modular Vision For Dynamic Grasping Of Unknown Resident Space Objects, 2021 (01/04/2021-15/01/2025) - Kuldeep Rambhai Barad
Name of the research project :
Modular Vision For Dynamic Grasping Of Unknown Resident Space Objects
Funders :
Fonds National de la Recherche (FNR) Industrial Fellowship
Redwire Space Europe
Funding number :
15799985
Funding text :
This work was supported in part by the Fonds National de la Recherche (FNR) Industrial Fellowship under Grant 15799985, and in part by Redwire Space Europe. Code and resources are available at: https://github.com/kuldeepbrd1/graspLDM.
Available on ORBilu :
since 26 December 2024

