Working paper (E-prints, Working papers and Research blog)
Transformer Meets Gated Residual Networks To Enhance Photoplethysmogram Artifact Detection Informed by Mutual Information Neural Estimation
LE, Thanh-Dung
2025
 

Files


Full Text
IEEE_TNNLS_GRNTransformer_MINE_Minor_Cleaned.pdf
Author postprint (4.98 MB)

All documents in ORBilu are protected by a user license.

Details



Keywords :
eess.SP (Electrical Engineering and Systems Science — Signal Processing)
Abstract :
[en] This study examines the effectiveness of various learning methods for improving Transformer models, focusing on the Gated Residual Network Transformer (GRN-Transformer) in the context of pediatric intensive care units (PICUs) with limited data availability. Our findings indicate that Transformers trained via supervised learning are less effective than MLP, CNN, and LSTM networks in such environments. However, leveraging unsupervised and self-supervised learning on unannotated data, followed by fine-tuning on annotated data, notably enhances Transformer performance, although not to the level of the GRN-Transformer. Central to our research is the analysis of different activation functions for the Gated Linear Unit (GLU), a crucial element of the GRN structure. We also employ Mutual Information Neural Estimation (MINE) to evaluate the GRN's contribution. Additionally, the study examines the effects of integrating the GRN within the Transformer's attention mechanism versus using it as a separate intermediary layer. Our results show that the GLU with sigmoid activation stands out, achieving 0.98 accuracy, 0.91 precision, 0.96 recall, and a 0.94 F1 score. The MINE analysis supports the hypothesis that the GRN increases the mutual information between the hidden representations and the output. Moreover, using the GRN as an intermediate filter layer proves more beneficial than incorporating it within the attention mechanism. In summary, this research clarifies how the GRN bolsters the GRN-Transformer's performance, surpassing other learning techniques. These findings offer a promising avenue for adopting sophisticated models such as Transformers in data-constrained environments, such as PPG artifact detection in PICU settings.
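To make the architecture concrete, below is a minimal NumPy sketch of a Gated Residual Network block with a sigmoid-gated GLU, following the standard GRN formulation (dense layer, ELU, dense layer, sigmoid GLU, residual connection, layer normalization). All layer sizes, weight names, and the single-width hidden dimension are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def layer_norm(x, eps=1e-6):
    # Normalize each feature vector to zero mean and unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    sd = x.std(axis=-1, keepdims=True)
    return (x - mu) / (sd + eps)

def glu(x, W_gate, b_gate, W_val, b_val):
    # Gated Linear Unit with a sigmoid gate: sigma(x Wg + bg) * (x Wv + bv).
    return sigmoid(x @ W_gate + b_gate) * (x @ W_val + b_val)

def grn(x, p):
    # eta2 = ELU(x W2 + b2); eta1 = eta2 W1 + b1;
    # output = LayerNorm(x + GLU(eta1))  -- residual plus gated branch.
    eta2 = elu(x @ p["W2"] + p["b2"])
    eta1 = eta2 @ p["W1"] + p["b1"]
    return layer_norm(x + glu(eta1, p["Wg"], p["bg"], p["Wv"], p["bv"]))

# Toy parameters and input (hypothetical dimensions for illustration).
rng = np.random.default_rng(0)
d = 8
params = {k: rng.standard_normal((d, d)) * 0.1 for k in ("W2", "W1", "Wg", "Wv")}
params.update({k: np.zeros(d) for k in ("b2", "b1", "bg", "bv")})
x = rng.standard_normal((4, d))
y = grn(x, params)
print(y.shape)  # (4, 8)
```

Because the sigmoid gate can close toward zero, the block can suppress the nonlinear branch and fall back to (normalized) identity, which is one intuition for why a GRN filter layer helps small-data regimes.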
Disciplines :
Computer science
Author, co-author :
LE, Thanh-Dung  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SigCom
Language :
English
Title :
Transformer Meets Gated Residual Networks To Enhance Photoplethysmogram Artifact Detection Informed by Mutual Information Neural Estimation
Publication date :
2025
Commentary :
Under revision at IEEE Transactions on Neural Networks and Learning Systems
Available on ORBilu :
since 03 September 2024

Statistics


Number of views
113 (14 by Unilu)
Number of downloads
25 (0 by Unilu)
