Article (Scientific journals)
DRE: density-based data selection with entropy for adversarial-robust deep learning models
GUO, Yuejun; HU, Qiang; CORDY, Maxime et al.
2023 - In Neural Computing and Applications, 35 (5), p. 4009 - 4026
Peer Reviewed verified by ORBi
 

Files


Full Text
s00521-022-07812-2.pdf
Author postprint (893.79 kB) Creative Commons License - Attribution



Details



Keywords :
Active learning; Adversarial robustness; Deep learning testing; Image classification; Data selection; Density-based; Learning models; Performance; Software developer; State of the art; Software; Artificial Intelligence
Abstract :
[en] Active learning helps software developers reduce the labeling cost when building high-quality machine learning models. A core component of active learning is the acquisition function that determines which data should be selected for annotation. State-of-the-art (SOTA) acquisition functions focus on clean performance (e.g., accuracy) but disregard robustness, an important quality property, leading to fragile models with negligible robustness (less than 0.20%). In this paper, we first propose to integrate adversarial training into active learning (adversarial-robust active learning, ARAL) to produce robust models. Our empirical study on 11 acquisition functions and 15,105 trained deep neural networks (DNNs) shows that ARAL can produce models with robustness ranging from 2.35% to 63.85%. Our study also reveals, however, that the acquisition functions that perform well on accuracy are worse than random sampling when it comes to robustness. By examining the reasons behind this, we devise density-based robust sampling with entropy (DRE) to target both clean performance and robustness. The core idea of DRE is to maintain a balance between the selected data and the entire set based on the entropy density distribution. DRE outperforms SOTA acquisition functions in terms of robustness by up to 24.40% while remaining competitive on accuracy. Additionally, an in-depth evaluation shows that DRE is also applicable as a test selection metric for model retraining, where it outperforms all compared functions by up to 8.21% in robustness.
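The abstract describes the core idea of DRE only at a high level: selected samples should preserve the entropy density distribution of the whole unlabeled pool. The snippet below is a minimal, hypothetical Python sketch of that idea, not the authors' implementation; the function names (`dre_select`, `prediction_entropy`), the histogram binning, and the proportional per-bin quota are assumptions made for illustration.

```python
import numpy as np

def prediction_entropy(probs, eps=1e-12):
    """Shannon entropy of each row of softmax probabilities."""
    return -np.sum(probs * np.log(probs + eps), axis=1)

def dre_select(probs, budget, n_bins=10, seed=None):
    """Illustrative sketch: choose `budget` samples so that the entropy
    histogram of the selection mirrors that of the whole unlabeled pool."""
    rng = np.random.default_rng(seed)
    ent = prediction_entropy(probs)

    # Bin the pool by prediction entropy.
    edges = np.histogram_bin_edges(ent, bins=n_bins)
    bin_ids = np.clip(np.digitize(ent, edges[1:-1]), 0, n_bins - 1)

    selected = []
    for b in range(n_bins):
        members = np.flatnonzero(bin_ids == b)
        if members.size == 0:
            continue
        # Allocate the labeling budget proportionally to each bin's share
        # of the pool, so the selection follows the pool's entropy density.
        quota = min(int(round(budget * members.size / len(ent))), members.size)
        if quota > 0:
            selected.extend(rng.choice(members, size=quota, replace=False).tolist())

    # Rounding can leave the budget slightly under-filled; top up at random.
    selected = list(dict.fromkeys(selected))[:budget]
    if len(selected) < budget:
        remaining = np.setdiff1d(np.arange(len(ent)), selected)
        extra = rng.choice(remaining, size=budget - len(selected), replace=False)
        selected.extend(extra.tolist())
    return np.asarray(selected)
```

In this sketch, `probs` would be the model's softmax outputs on the unlabeled pool (e.g., obtained from a hypothetical `model.predict(pool)`); the proportional per-bin quota is one simple way to approximate distribution matching between the selected subset and the pool.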
Disciplines :
Computer science
Author, co-author :
GUO, Yuejun  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > SerVal > Team Yves LE TRAON
HU, Qiang ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
CORDY, Maxime  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
Papadakis, Michail;  SnT, University of Luxembourg, Esch-sur-Alzette, Luxembourg
Le Traon, Yves;  SnT, University of Luxembourg, Esch-sur-Alzette, Luxembourg
External co-authors :
no
Language :
English
Title :
DRE: density-based data selection with entropy for adversarial-robust deep learning models
Publication date :
February 2023
Journal title :
Neural Computing and Applications
ISSN :
0941-0643
eISSN :
1433-3058
Publisher :
Springer Science and Business Media Deutschland GmbH
Volume :
35
Issue :
5
Pages :
4009 - 4026
Peer reviewed :
Peer Reviewed verified by ORBi
Funders :
Fonds National de la Recherche Luxembourg
Funding text :
This work is supported by the Luxembourg National Research Funds (FNR) through CORE project C18/IS/12669767/STELLAR/LeTraon.
Available on ORBilu :
since 11 January 2024

Statistics


Number of views
31 (0 by Unilu)
Number of downloads
18 (0 by Unilu)

Scopus citations® :
3
Scopus citations® without self-citations :
1
OpenCitations :
1
OpenAlex citations :
3
WoS citations :
2
