Doctoral thesis (Dissertations and theses)
DESIGN INTENT AWARE CAD REVERSE ENGINEERING: DEEP NEURAL APPROACHES FOR RECOVERING FEATURE-BASED SEQUENCES FROM 3D SCANS
DUPONT, Elona Marcelle Eugénie
2025
 

Files


Full Text
Thesis_EDupont.pdf
Author postprint (31.29 MB)



Details



Keywords :
Computer Vision; CAD; 3D; Reverse Engineering
Abstract :
[en] Computer-Aided Design (CAD) modeling plays a fundamental role in modern industrial manufacturing processes, enabling precise design and modification of products across diverse sectors. However, when existing physical objects lack associated CAD models because of lost documentation or legacy part replacement requirements, engineers must engage in the time-consuming and expertise-intensive process of CAD reverse engineering. This process involves translating 3D scanned data into parametric CAD models that not only capture geometric form but also preserve design intent. Despite significant advances in 3D scanning technologies, automated CAD reverse engineering faces several persistent challenges: scanning artifacts that corrupt geometric fidelity, the lack of unified CAD representation standards, the inherent ambiguity in recovering design intent, and the limited availability of high-quality datasets with construction history annotations. This thesis presents a systematic progression of approaches to address these challenges, beginning with the recovery of elements of construction history from existing Boundary Representations through joint learning of operation types and steps. Moving closer to the 3D reverse engineering scenario, we then advance to predicting CAD history sequences directly from point clouds using hierarchical transformer architectures that eliminate the need for intermediate representations. To overcome the inherent ambiguity in reverse engineering, we develop geometry-guided search strategies that explore multiple design alternatives, mimicking the decision-making process of expert CAD designers. Finally, we introduce a paradigm shift by leveraging Large Language Models to generate executable Python code directly from point clouds, simultaneously addressing representation limitations and training data constraints through procedural generation techniques. Throughout this progression, we contribute novel datasets that enable more realistic evaluation and establish evaluation methodologies specifically designed for the CAD reverse engineering domain. The resulting frameworks significantly improve reconstruction quality while enhancing practical usability, advancing the field toward fully automated yet flexible CAD reverse engineering that can seamlessly integrate into industrial design workflows.
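To illustrate the code-as-CAD representation mentioned in the abstract: the sketch below is not taken from the thesis; it only suggests, under the assumption that the generated Python targets an open-source parametric CAD library such as CadQuery, what an executable, feature-based reconstruction script for a simple scanned part might look like. The library choice, dimensions, and modelling steps are hypothetical.

    # Hypothetical reconstruction script of the kind an LLM-based pipeline
    # might emit from a point cloud; CadQuery is assumed as the target library.
    import cadquery as cq

    # Step 1: base extrusion recovered from the dominant planar region.
    part = cq.Workplane("XY").box(40.0, 30.0, 10.0)

    # Step 2: through-hole recovered from a cylindrical region on the top face.
    part = part.faces(">Z").workplane().hole(6.0)

    # Step 3: fillet on the vertical edges, keeping the feature editable.
    part = part.edges("|Z").fillet(2.0)

    # Export as a standard B-Rep exchange format for downstream CAD tools.
    cq.exporters.export(part, "reconstructed_part.step")

Expressing the reconstruction as a script of this kind keeps every modelling step explicit and editable, which is what makes a recovered model usable for downstream modification rather than a frozen geometric snapshot.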
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > CVI² - Computer Vision Imaging & Machine Intelligence
Disciplines :
Computer science
Author, co-author :
DUPONT, Elona Marcelle Eugénie; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
Language :
English
Title :
DESIGN INTENT AWARE CAD REVERSE ENGINEERING: DEEP NEURAL APPROACHES FOR RECOVERING FEATURE-BASED SEQUENCES FROM 3D SCANS
Defense date :
30 June 2025
Institution :
Unilu - University of Luxembourg [FSTM], Luxembourg, Luxembourg
Degree :
Docteur en Informatique (DIP_DOC_0006_B)
President :
FRIDGEN, Gilbert; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > FINATRAX
Jury member :
AOUADA, Djamila; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
Agapito, Lourdes; University College London
Birdal, Tolga; Imperial College London
VANDEWALLE, Patrick; KU Leuven - Katholieke Universiteit Leuven
Focus Area :
Computational Sciences
Funding text :
The work of this thesis is supported by the National Research Fund, Luxembourg, under the BRIDGES2021/IS/16849599/FREE-3D and IF/17052459/CASCADES projects, and by Artec 3D.
Available on ORBilu :
since 11 July 2025

