Article (Scientific journals)
Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science.
MOCANU, Decebal Constantin; Mocanu, Elena; Stone, Peter et al.
2018, in Nature Communications, 9 (1), p. 2383
 

Files

Full Text: 1707.04780.pdf, author postprint (1.48 MB)

Details



Keywords :
Machine Learning; Deep Learning; Sparse Neural Networks; Dynamic Sparse Training
Abstract :
[en] Through the success of deep learning in various domains, artificial neural networks are currently among the most used artificial intelligence methods. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that (contrary to general practice) artificial neural networks, too, should not have fully-connected layers. Here we propose sparse evolutionary training of artificial neural networks, an algorithm which evolves an initial sparse topology (Erdős-Rényi random graph) of two consecutive layers of neurons into a scale-free topology during learning. Our method replaces artificial neural networks' fully-connected layers with sparse ones before training, quadratically reducing the number of parameters, with no decrease in accuracy. We demonstrate our claims on restricted Boltzmann machines, multi-layer perceptrons, and convolutional neural networks for unsupervised and supervised learning on 15 datasets. Our approach has the potential to enable artificial neural networks to scale up beyond what is currently possible.
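The evolutionary rewiring described in the abstract can be sketched as follows. This is a minimal NumPy illustration of the general idea (Erdős-Rényi sparse initialization, then prune the smallest-magnitude connections and regrow the same number at random positions), not the authors' implementation; the ε and ζ values, layer sizes, and function names are illustrative placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def erdos_renyi_mask(n_in, n_out, epsilon=20):
    """Boolean connectivity mask with Erdos-Renyi-style density.

    Connection probability scales as epsilon * (n_in + n_out) / (n_in * n_out),
    so larger layers are proportionally sparser (epsilon is illustrative here).
    """
    p = min(1.0, epsilon * (n_in + n_out) / (n_in * n_out))
    return rng.random((n_in, n_out)) < p

def evolve_mask(weights, mask, zeta=0.3):
    """One rewiring step between training epochs.

    Prune the fraction `zeta` of active connections whose weights are
    closest to zero, then regrow the same number of connections at
    randomly chosen inactive positions, keeping the sparsity level fixed.
    """
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)
    if n_prune == 0:
        return mask
    # Indices of the n_prune active weights with the smallest magnitude.
    magnitudes = np.abs(weights.ravel()[active])
    pruned = active[np.argsort(magnitudes)[:n_prune]]
    new_mask = mask.copy().ravel()
    new_mask[pruned] = False
    # Regrow the same number of connections at random inactive positions.
    inactive = np.flatnonzero(~new_mask)
    regrown = rng.choice(inactive, size=n_prune, replace=False)
    new_mask[regrown] = True
    return new_mask.reshape(mask.shape)

# Example: a sparse 784 x 300 layer, one rewiring step.
mask = erdos_renyi_mask(784, 300)
weights = rng.standard_normal((784, 300)) * mask
new_mask = evolve_mask(weights, mask, zeta=0.3)
assert new_mask.sum() == mask.sum()  # sparsity level is preserved
```

In an actual training loop, a step like `evolve_mask` would run after each epoch, with the surviving weights kept and the regrown connections re-initialized, so the topology gradually concentrates connections where they matter.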
Disciplines :
Computer science
Author, co-author :
MOCANU, Decebal Constantin ;  University of Luxembourg ; Department of Mathematics and Computer Science and Department of Electrical Engineering, Eindhoven University of Technology, De Rondom 70, 5612 AP, Eindhoven, The Netherlands. d.c.mocanu@tue.nl
Mocanu, Elena ;  Department of Electrical Engineering, Eindhoven University of Technology, De Rondom 70, 5612 AP, Eindhoven, The Netherlands ; Department of Mechanical Engineering, Eindhoven University of Technology, De Rondom 70, 5612 AP, Eindhoven, The Netherlands
Stone, Peter;  Department of Computer Science, The University of Texas at Austin, 2317 Speedway, Stop D9500, Austin, TX, 78712-1757, USA
Nguyen, Phuong H;  Department of Electrical Engineering, Eindhoven University of Technology, De Rondom 70, 5612 AP, Eindhoven, The Netherlands
Gibescu, Madeleine;  Department of Electrical Engineering, Eindhoven University of Technology, De Rondom 70, 5612 AP, Eindhoven, The Netherlands
Liotta, Antonio ;  Data Science Centre, University of Derby, Lonsdale House, Quaker Way, Derby, DE1 3HD, UK
External co-authors :
yes
Language :
English
Title :
Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science.
Publication date :
19 June 2018
Journal title :
Nature Communications
eISSN :
2041-1723
Publisher :
Nature Publishing Group, Basingstoke, Hampshire, England
Volume :
9
Issue :
1
Pages :
2383
Peer reviewed :
Peer Reviewed verified by ORBi
Focus Area :
Computational Sciences
Available on ORBilu :
since 18 October 2023

Statistics

Number of views: 26 (7 by Unilu); number of downloads: 17 (0 by Unilu)
Scopus citations: 438 (397 without self-citations)
OpenCitations: 124
OpenAlex citations: 268
WoS citations: 344
