Poster (Scientific congresses, symposiums and conference proceedings)
Can Less Yield More? Insights into Truly Sparse Training
Xiao, Qiao; WU, Boqian; Yin, Lu et al.
2023, ICLR 2023 Workshop on Sparsity in Neural Networks
Peer reviewed
 

Files


Full Text
LYM_poster_SNN_boqian_wu.png
Author postprint (1.23 MB)
Details



Keywords :
Truly dynamic sparse training; Efficient learning
Abstract :
[en] Truly dynamic sparse training (T-DST), which unlocks the great potential of dynamic sparse training to achieve comparable or even higher accuracy at a lower resource cost (less yields more), has become a renewed research topic towards green AI. However, certain aspects of T-DST, such as its sensitivity to the dataset, network architecture, and sparsity strategy, are still not well understood. In this paper, we first implement truly sparse training for the Rigged Lottery (RigL) algorithm and then evaluate its “less yields more” hypothesis by demonstrating that “95% fewer parameters and FLOPs yield up to 33% test accuracy improvement” on CIFAR100. Furthermore, we provide broader insights into how the dataset size, the activation function, and the weight distribution affect the performance of a neural network trained with T-DST. Based on this empirical study, we summarize a guideline for exploiting “less yields more” in T-DST, hoping to catalyze research progress on the topic. Our code will be available online.
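For context, the abstract refers to the drop-and-grow topology updates of RigL-style dynamic sparse training. The sketch below is a rough illustration only, not the authors' released code: it uses a dense binary mask for readability, whereas a truly sparse (T-DST) implementation would store only the non-zero weights. Function and variable names are illustrative. In one update, the smallest-magnitude active weights are dropped and an equal number of previously inactive connections with the largest gradient magnitude are regrown (initialized to zero), so the layer's sparsity stays constant.

import numpy as np

def drop_and_grow(weights, mask, grad, drop_fraction=0.3):
    """One RigL-style topology update on a single layer (mask-based sketch)."""
    n_drop = int(drop_fraction * mask.sum())

    # Grow candidates: connections that are inactive before the drop step.
    inactive_idx = np.flatnonzero(mask == 0)

    # Drop: among active connections, deactivate the smallest-|w| ones.
    active_idx = np.flatnonzero(mask)
    drop_idx = active_idx[np.argsort(np.abs(weights.flat[active_idx]))[:n_drop]]
    mask.flat[drop_idx] = 0

    # Grow: reactivate the inactive connections with the largest |grad|,
    # initializing them to zero as in RigL, so overall sparsity is unchanged.
    grow_idx = inactive_idx[np.argsort(-np.abs(grad.flat[inactive_idx]))[:n_drop]]
    mask.flat[grow_idx] = 1
    weights.flat[grow_idx] = 0.0

    weights *= mask  # zero out dropped entries
    return weights, mask

# Toy usage: a 4x4 layer at roughly 50% sparsity; grad stands in for dL/dW.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
m = (rng.random((4, 4)) < 0.5).astype(w.dtype)
g = rng.normal(size=(4, 4))
w, m = drop_and_grow(w * m, m, g)
print(int(m.sum()), "connections remain active")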
Disciplines :
Computer science
Author, co-author :
Xiao, Qiao 
WU, Boqian  ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS) ; University of Twente
Yin, Lu;  Eindhoven University of Technology
Keulen, Maurice van;  University of Twente
Pechenizkiy, Mykola;  Eindhoven University of Technology
 These authors have contributed equally to this work.
External co-authors :
no
Language :
English
Title :
Can Less Yield More? Insights into Truly Sparse Training
Publication date :
01 May 2023
Event name :
ICLR 2023 Workshop on Sparsity in Neural Networks
Event place :
Kigali, Rwanda
Event date :
5 May 2023
Audience :
International
Peer reviewed :
Peer reviewed
Focus Area :
Computational Sciences
Development Goals :
9. Industry, innovation and infrastructure
Available on ORBilu :
since 01 February 2026
