Article (Scientific journals)
Hybrid Attention for Robust RGB-T Pedestrian Detection in Real-World Conditions
RATHINAM, Arunkumar; PAULY, Leo; SHABAYEK, Abd El Rahman et al.
2025. In IEEE Robotics and Automation Letters, 10 (1), p. 1-8

Files


Full Text
LRA3504296 (1).pdf
Publisher postprint (1.8 MB) Creative Commons License - Attribution

Details



Keywords :
Computer Vision for Transportation; Deep Learning for Visual Perception; Human Detection and Tracking; Multi-Modal Perception for HRI; Sensor Fusion; Artificial Intelligence
Abstract :
[en] Multispectral pedestrian detection has gained significant attention in recent years, particularly in autonomous driving applications. To address the challenges posed by adverse illumination conditions, combining thermal and visible images has demonstrated clear advantages. However, existing fusion methods rely on the critical assumption that the RGB-Thermal (RGB-T) image pairs are fully overlapping. This assumption often does not hold in real-world applications, where only partial overlap between images may occur due to the sensor configuration. Moreover, sensor failure can cause a loss of information in one modality. In this paper, we propose a novel module called the Hybrid Attention (HA) mechanism as our main contribution to mitigate the performance degradation caused by partial overlap and sensor failure, i.e., when at least part of the scene is acquired by only one sensor. We propose an improved RGB-T fusion algorithm that is robust against partial overlap and sensor failure encountered during inference in real-world applications. We also leverage a mobile-friendly backbone to cope with the resource constraints of embedded systems. We conducted experiments simulating various partial overlap and sensor failure scenarios to evaluate the performance of the proposed method. The results demonstrate that our approach outperforms state-of-the-art methods, showcasing its superiority in handling real-world challenges.
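Illustration (not from the paper): the abstract describes attention-based RGB-T feature fusion evaluated under simulated partial overlap and sensor failure. The sketch below is a minimal, hypothetical example of that kind of setup, assuming a PyTorch pipeline. It uses a generic channel-attention fusion block, which is not the authors' Hybrid Attention module, together with a helper that masks or zeroes the thermal input to mimic partial overlap and sensor failure; all class, function, and parameter names are assumptions made for illustration only.

# Illustrative sketch only: a generic cross-modal attention fusion block plus a
# degradation simulator. This is NOT the authors' Hybrid Attention mechanism;
# names, shapes, and the masking scheme are assumptions for illustration.
import torch
import torch.nn as nn


class CrossModalAttentionFusion(nn.Module):
    """Fuses RGB and thermal feature maps with learned per-channel weights."""

    def __init__(self, channels: int):
        super().__init__()
        # Predict one weight per channel and per modality from the concatenated features.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, rgb_feat: torch.Tensor, thermal_feat: torch.Tensor) -> torch.Tensor:
        weights = self.gate(torch.cat([rgb_feat, thermal_feat], dim=1))
        w_rgb, w_thermal = weights.chunk(2, dim=1)
        # Weighted sum: if one modality is degraded, the other can dominate the fusion.
        return w_rgb * rgb_feat + w_thermal * thermal_feat


def simulate_degradation(rgb: torch.Tensor, thermal: torch.Tensor,
                         overlap_ratio: float = 0.5, fail_thermal: bool = False):
    """Zero the thermal input (sensor failure) or its right-hand part (partial overlap)."""
    thermal = thermal.clone()
    if fail_thermal:
        thermal.zero_()
    else:
        width = thermal.shape[-1]
        thermal[..., int(width * overlap_ratio):] = 0.0  # region covered by RGB only
    return rgb, thermal


if __name__ == "__main__":
    fusion = CrossModalAttentionFusion(channels=64)
    rgb_feat = torch.randn(1, 64, 80, 80)
    thermal_feat = torch.randn(1, 64, 80, 80)
    rgb_feat, thermal_feat = simulate_degradation(rgb_feat, thermal_feat, overlap_ratio=0.5)
    fused = fusion(rgb_feat, thermal_feat)
    print(fused.shape)  # torch.Size([1, 64, 80, 80])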
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > CVI² - Computer Vision Imaging & Machine Intelligence
Disciplines :
Computer science
Author, co-author :
RATHINAM, Arunkumar  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
PAULY, Leo  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > CVI2 > Team Djamila AOUADA
SHABAYEK, Abd El Rahman  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
RHARBAOUI, Wassim  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > CVI2 > Team Djamila AOUADA ; University of Poitiers, XLIM Institute, Limoges, France
KACEM, Anis  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
GAUDILLIERE, Vincent  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > CVI2 > Team Djamila AOUADA ; Université de Lorraine, CNRS, Inria, LORIA, Nancy, France
AOUADA, Djamila  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
External co-authors :
no
Language :
English
Title :
Hybrid Attention for Robust RGB-T Pedestrian Detection in Real-World Conditions
Publication date :
January 2025
Journal title :
IEEE Robotics and Automation Letters
eISSN :
2377-3766
Publisher :
Institute of Electrical and Electronics Engineers Inc.
Volume :
10
Issue :
1
Pages :
1-8
Peer reviewed :
Peer Reviewed verified by ORBi
Focus Area :
Security, Reliability and Trust
FnR Project :
FNR14755859 - Multi-modal Fusion Of Electro-optical Sensors For Spacecraft Pose Estimation Towards Autonomous In-orbit Operations, 2020 (01/01/2021-31/12/2023) - Djamila Aouada
Name of the research project :
R-AGR-3874 - BRIDGES/20/14755859 MEET-A - LMO Contrib - AOUADA Djamila
Funders :
Luxembourg National Research Fund
Funding number :
BRIDGES2020/IS/14755859/MEET-A/Aouada
Funding text :
This work was supported by the Luxembourg National Research Fund (FNR) under the project reference BRIDGES2020/IS/14755859/MEET-A/Aouada.
Available on ORBilu :
since 04 December 2024

