Article (Scientific journals)
The Impact of LoRA Adapters on LLMs for Clinical Text Classification Under Computational and Data Constraints
LE, Thanh-Dung; Ti Nguyen, Ti; Nguyen Ha, Vu et al.
2025, in IEEE Access, 13, p. 109365-109377
Peer Reviewed verified by ORBi
 

Files


Full Text
3.pdf
Author postprint (2.27 MB)
Details



Keywords :
adapters; cardiac failure; clinical NLP; large language models (LLM); low-rank adaptation (LoRA); fine-tuning; text classification; natural language processing; Transformers; adaptation models; acute respiratory distress syndrome; graphics processing units; biological system modeling; accuracy; tuning; training; Computer Science (all); Materials Science (all); Engineering (all); Computer Science - Computation and Language; eess.SP
Abstract :
[en] Fine-tuning Large Language Models (LLMs) for clinical Natural Language Processing (NLP) poses significant challenges due to the domain gap, limited data, and stringent hardware constraints. In this study, we evaluate four adapter techniques, all equivalent to Low-Rank Adaptation (LoRA): Adapter, Lightweight, TinyAttention, and Gated Residual Network (GRN), for clinical note classification under real-world, resource-constrained conditions. All experiments were conducted on a single NVIDIA Quadro P620 GPU (2 GB VRAM, 512 CUDA cores, 1.386 TFLOPS FP32), limiting batch sizes to ≤ 8 sequences and the maximum sequence length to 256 tokens. Our clinical corpus comprises only 580,000 tokens, several orders of magnitude smaller than standard LLM pre-training datasets. We fine-tuned three biomedical pre-trained LLMs (CamemBERT-bio, AliBERT, and DrBERT) and two lightweight Transformer models trained from scratch. Results show that 1) adapter structures provide no consistent gains when fine-tuning biomedical LLMs under these constraints, and 2) simpler Transformers, with minimal parameter counts and training times under six hours, outperform adapter-augmented LLMs, which required over 1000 GPU-hours. Among adapters, GRN achieved the best metrics (accuracy, precision, recall, and F1 all at 0.88). These findings demonstrate that, in low-resource clinical settings with limited data and compute, lightweight Transformers trained from scratch offer a more practical and efficient solution than large LLMs, while GRN remains a viable adapter choice when minimal adaptation is needed.
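The low-rank adaptation mechanism the paper builds on can be sketched in a few lines. This is a generic, framework-free illustration of LoRA, not the paper's implementation; all matrices, dimensions, and the scaling choice below are illustrative assumptions:

```python
# Minimal LoRA forward pass (pure Python, no framework).
# The frozen base layer computes y = W x; LoRA adds a trainable
# low-rank update (alpha / r) * B A x, where A is (r x d_in) and
# B is (d_out x r) with rank r << min(d_in, d_out), so only
# r * (d_in + d_out) parameters are trained instead of d_in * d_out.

def matvec(M, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, alpha=1.0):
    r = len(A)                         # rank of the adaptation
    base = matvec(W, x)                # frozen pre-trained path
    delta = matvec(B, matvec(A, x))    # trainable low-rank path
    scale = alpha / r
    return [b + scale * d for b, d in zip(base, delta)]

# Toy example: d_in = 3, d_out = 2, rank r = 1.
W = [[1.0, 0.0, 2.0],
     [0.0, 1.0, 0.0]]
A = [[1.0, 1.0, 1.0]]          # (1 x 3)
B_zero = [[0.0], [0.0]]        # standard init: B = 0, so the update is a no-op
B_trained = [[0.5], [0.25]]    # hypothetical values after fine-tuning

x = [1.0, 2.0, 3.0]
print(lora_forward(W, A, B_zero, x))     # identical to the frozen model
print(lora_forward(W, A, B_trained, x))  # frozen output plus low-rank update
```

Initializing B to zero is the standard LoRA trick that makes the adapted model start out exactly equal to the frozen base model, which is why adapters can be bolted on without degrading the pre-trained behavior.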
Disciplines :
Computer science
Author, co-author :
LE, Thanh-Dung  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SigCom ; Biomedical Information Processing Laboratory, École de Technologie Supérieure, Montreal, Canada
Ti Nguyen, Ti ;  University of Luxembourg, Interdisciplinary Centre for Security, Reliability, and Trust (SnT), Esch-sur-Alzette, Luxembourg
Nguyen Ha, Vu ;  University of Luxembourg, Interdisciplinary Centre for Security, Reliability, and Trust (SnT), Esch-sur-Alzette, Luxembourg
CHATZINOTAS, Symeon  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SigCom
Jouvet, Philippe ;  University of Montreal, CHU Sainte-Justine Research Center, CHU Sainte-Justine Hospital, Montreal, Canada
Noumeir, Rita ;  Biomedical Information Processing Laboratory, École de Technologie Supérieure, Montreal, Canada
External co-authors :
yes
Language :
English
Title :
The Impact of LoRA Adapters on LLMs for Clinical Text Classification Under Computational and Data Constraints
Publication date :
2025
Journal title :
IEEE Access
ISSN :
2169-3536
Publisher :
Institute of Electrical and Electronics Engineers Inc.
Volume :
13
Pages :
109365 - 109377
Peer reviewed :
Peer Reviewed verified by ORBi
Funders :
Natural Sciences and Engineering Research Council
Institut de Valorisation des données de l’Université de Montréal
Fonds de la recherche en santé du Québec
Funding text :
This work was supported in part by the Natural Sciences and Engineering Research Council (NSERC), in part by the Institut de Valorisation des données de l'Université de Montréal (IVADO), and in part by the Fonds de la recherche en santé du Québec (FRQS). The clinical data were provided by the Research Center at CHU Sainte-Justine Hospital. Data and reproducible codes are available upon reasonable request to Prof. Philippe Jouvet, M.D., Ph.D. (e-mail: philippe.jouvet.med@ssss.gouv.qc.ca).
Commentary :
Accepted for publication in IEEE Access
Available on ORBilu :
since 07 November 2025

