Poster (Scientific congresses, symposiums and conference proceedings)
Information Theoretic Pruning of Coupled Channels in Deep Neural Networks
ROSTAMI ABENDANSARI, Peyman; SINHA, Nilotpal; CHENNI, Nidhal Eddine et al.
2025 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
Peer reviewed
 

Files


Full Text
Rostami_Information_Theoretic_Pruning_of_Coupled_Channels_in_Deep_Neural_Networks_WACV_2025_paper.pdf
Author preprint (1.72 MB) Creative Commons License - Public Domain Dedication
Full Text Parts
Rostami_Information_Theoretic_Pruning_of_Coupled_Channels_in_Deep_Neural_Networks_WACV_2025_supplementary.pdf
Author postprint (1.09 MB) Creative Commons License - Public Domain Dedication


Details



Keywords :
channel pruning; coupled channels; information bottleneck; network compression; neural network compression; neural networks; stochastic nature; structured sparsity; variational pruning; VIB; Artificial Intelligence; Computer Science Applications; Computer Vision and Pattern Recognition
Abstract :
[en] Variational channel pruning approaches have obtained impressive results thanks to their stochastic nature, their well-established foundation in information theory, and the practically appealing structured sparsity pattern they offer. Despite their success in pruning Plain Networks (PlainNets), their application remains limited in networks with structurally coupled channels, such as ResNets. In such scenarios, structurally coupled channels must not only be pruned together; it must also be ensured that the whole coupled group is irrelevant before pruning is applied. This problem is under-investigated, as most existing methods are designed without taking these couplings into account. In this paper, we propose a novel approach based on Information Theoretic Pruning of structurally Coupled Channels (IT-PCC) in neural networks. IT-PCC learns the probabilistic distribution of the importance of each coupled channel set and prunes the sets that carry the least relevant information for the task at hand. Experimental results for image classification on the CIFAR10, CIFAR100, and ImageNet datasets show that the proposed method outperforms the state of the art, with the gains being most significant at high compression rates.
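For orientation, the following is a minimal, hypothetical PyTorch sketch of the kind of variational, information-bottleneck-style gating the abstract describes: one stochastic gate is shared by an entire group of structurally coupled channels, a KL-style penalty estimates how much task-relevant information the group carries, and groups whose gates collapse are pruned together. The class and parameter names (CoupledChannelGate, kl_weight, threshold) are illustrative assumptions, not the authors' released implementation, and the exact objective used in the paper may differ.

# Hypothetical sketch of coupled-group variational gating (not the authors' code).
import torch
import torch.nn as nn


class CoupledChannelGate(nn.Module):
    """One stochastic gate per group of structurally coupled channels.

    Every layer whose channels are coupled (e.g. through the residual
    additions of a ResNet stage) is multiplied by the same gate, so a
    group is only pruned when the whole group is uninformative.
    """

    def __init__(self, num_groups: int, kl_weight: float = 1e-2):
        super().__init__()
        self.mu = nn.Parameter(torch.ones(num_groups))                 # gate means
        self.log_var = nn.Parameter(torch.full((num_groups,), -6.0))   # gate log-variances
        self.kl_weight = kl_weight

    def sample(self) -> torch.Tensor:
        # Reparameterized gate; deterministic mean at evaluation time.
        if self.training:
            eps = torch.randn_like(self.mu)
            return self.mu + eps * torch.exp(0.5 * self.log_var)
        return self.mu

    def kl_term(self) -> torch.Tensor:
        # VIB-style relevance penalty, log(1 + mu^2 / sigma^2) per group;
        # gates carrying little task-relevant information are pushed to zero.
        ratio = self.mu.pow(2) / (torch.exp(self.log_var) + 1e-8)
        return self.kl_weight * torch.log1p(ratio).sum()

    def keep_mask(self, threshold: float = 1e-2) -> torch.Tensor:
        # Coupled groups whose gate mean stays below the threshold are
        # treated as irrelevant and removed together from all coupled layers.
        return self.mu.pow(2) > threshold


# Usage sketch: 64 coupled channel groups shared across a residual stage.
gate = CoupledChannelGate(num_groups=64)
features = torch.randn(8, 64, 32, 32)                # stand-in coupled feature maps
gated = features * gate.sample().view(1, -1, 1, 1)
loss = gated.pow(2).mean() + gate.kl_term()          # placeholder task loss + penalty
loss.backward()
print(f"{int(gate.keep_mask().sum())} of 64 coupled groups kept")

The point mirrored from the abstract is that the gate, and therefore the pruning decision, is defined per coupled group rather than per individual layer, so a group is removed only when it is irrelevant everywhere it is coupled.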
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT), University of Luxembourg
Disciplines :
Computer science
Author, co-author :
ROSTAMI ABENDANSARI, Peyman ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
SINHA, Nilotpal  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > CVI2 > Team Djamila AOUADA
CHENNI, Nidhal Eddine ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
KACEM, Anis  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
SHABAYEK, Abd El Rahman  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
SHNEIDER, Carl  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SPASYS
AOUADA, Djamila  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CVI2
External co-authors :
no
Language :
English
Title :
Information Theoretic Pruning of Coupled Channels in Deep Neural Networks
Publication date :
2025
Event name :
2025 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
Event place :
Tucson, USA
Event date :
28-02-2025 to 04-03-2025
Peer reviewed :
Peer reviewed
FnR Project :
C21/IS/15965298/ELITE
Name of the research project :
U-AGR-7116 - C21/IS/15965298/ELITE - AOUADA Djamila
Funders :
FNR - Luxembourg National Research Fund
Funding text :
This work is supported by the Luxembourg National Research Fund (FNR), under the project reference C21/IS/15965298/ELITE.
Available on ORBilu :
since 06 January 2026

Statistics


Number of views : 22 (0 by Unilu)
Number of downloads : 18 (1 by Unilu)
Scopus citations® : 0
Scopus citations® without self-citations : 0
OpenCitations : 0
OpenAlex citations : 1
