Paper published in a book (Scientific congresses, symposiums and conference proceedings)
Unveiling the Power of Sparse Neural Networks for Feature Selection
Atashgahi, Zahra; Liu, Tennison; Pechenizkiy, Mykola et al.
2024. In: Endriss, Ulle (Ed.), ECAI 2024 - 27th European Conference on Artificial Intelligence, Including 13th Conference on Prestigious Applications of Intelligent Systems, PAIS 2024, Proceedings
Peer reviewed
 

Files

Full Text: 2408.04583v1.pdf, author postprint (817.24 kB)

Details



Keywords :
Comparative performance; Computational overheads; Dense network; Efficient feature selections; Features selection; Network training; Power; Sparse neural networks; Systematic analysis; Training algorithms; Dynamic Sparse Training; Machine Learning
Abstract :
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection. Leveraging dynamic sparse training (DST) algorithms within SNNs has demonstrated promising feature selection capabilities while drastically reducing computational overhead. Despite these advances, several critical aspects of feature selection remain insufficiently explored: the choice of DST algorithm for network training, the choice of metric for ranking features/neurons, and how these methods compare to dense networks across diverse datasets. This paper addresses these gaps by presenting a comprehensive, systematic analysis of feature selection with sparse neural networks. Moreover, we introduce a novel metric that accounts for sparse-network characteristics and quantifies feature importance within the context of SNNs. Our findings show that feature selection with SNNs trained with DST algorithms can achieve, on average, more than 50% memory and 55% FLOPs reduction compared to dense networks, while outperforming them in the quality of the selected features.
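The kind of feature ranking the abstract refers to can be illustrated with a minimal sketch (this is an assumed, generic connectivity-based score, not the paper's proposed metric): rank each input feature by the total magnitude of its surviving connections in a sparse first layer.

```python
import numpy as np

def rank_features_by_neuron_strength(weights, mask):
    """Rank input features of a sparse layer by connection strength.

    weights: (n_inputs, n_hidden) weight matrix
    mask:    (n_inputs, n_hidden) binary sparsity mask (1 = connection kept)

    Importance score per input neuron: sum of absolute weights of its
    surviving connections. Returns feature indices, most important first.
    """
    strength = np.abs(weights * mask).sum(axis=1)
    return np.argsort(strength)[::-1]

# Toy example: 3 input features, 2 hidden units.
w = np.array([[1.0, -2.0],
              [0.5,  0.5],
              [3.0,  0.0]])
m = np.array([[1.0, 1.0],   # feature 0 keeps both connections
              [1.0, 0.0],   # feature 1 keeps one
              [0.0, 1.0]])  # feature 2 keeps one zero-weight connection
print(rank_features_by_neuron_strength(w, m))  # [0 1 2]
```

Selecting the top-k entries of this ranking yields a feature subset; the paper's actual metric and the DST training loop that produces the mask are described in the full text.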
Disciplines :
Computer science
Author, co-author :
Atashgahi, Zahra;  Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Netherlands
Liu, Tennison;  Department of Applied Mathematics and Theoretical Physics, University of Cambridge, United Kingdom
Pechenizkiy, Mykola;  Department of Mathematics and Computer Science, Eindhoven University of Technology, Netherlands
Veldhuis, Raymond;  Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Netherlands
MOCANU, Decebal Constantin  ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
van der Schaar, Mihaela;  Department of Applied Mathematics and Theoretical Physics, University of Cambridge, United Kingdom
External co-authors :
yes
Language :
English
Title :
Unveiling the Power of Sparse Neural Networks for Feature Selection
Publication date :
16 October 2024
Event name :
ECAI 2024: 27th European Conference on Artificial Intelligence
Event place :
Santiago de Compostela, Spain
Event date :
19 October 2024 to 24 October 2024
Audience :
International
Main work title :
ECAI 2024 - 27th European Conference on Artificial Intelligence, Including 13th Conference on Prestigious Applications of Intelligent Systems, PAIS 2024, Proceedings
Editor :
Endriss, Ulle
Publisher :
IOS Press BV
ISBN/EAN :
978-1-64368-548-9
Peer reviewed :
Peer reviewed
Focus Area :
Computational Sciences
Development Goals :
9. Industry, innovation and infrastructure
Available on ORBilu :
since 01 February 2026

Statistics



Scopus citations® : 1 (1 without self-citations)
OpenCitations : 0
OpenAlex citations : 4
WoS citations : 0
