Paper published in a book (Scientific congresses, symposiums and conference proceedings)
Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections
Oyedotun, Oyebade; Shabayek, Abd El Rahman; Aouada, Djamila et al.
2017. In: 24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Peer reviewed
 

Files

Full Text: typeinst_V19_review_V02.pdf (Author preprint, 650.2 kB)


Details



Keywords :
Deep neural networks; residual learning; optimization
Abstract :
[en] Many works have posited the benefit of depth in deep networks. However, one of the problems encountered in the training of very deep networks is feature reuse; that is, features are 'diluted' as they are forward propagated through the model. Hence, later network layers receive less informative signals about the input data, consequently making training less effective. In this work, we address the problem of feature reuse by taking inspiration from an earlier work which employed residual learning for alleviating the problem of feature reuse. We propose a modification of residual learning for training very deep networks to realize improved generalization performance; for this, we allow stochastic shortcut connections of identity mappings from the input to hidden layers. We perform extensive experiments using the USPS and MNIST datasets. On the USPS dataset, we achieve an error rate of 2.69% without employing any form of data augmentation (or manipulation). On the MNIST dataset, we reach a comparable state-of-the-art error rate of 0.52%. Particularly, these results are achieved without employing any explicit regularization technique.
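The core idea of the abstract — identity shortcuts from the network input to hidden layers, kept or dropped stochastically during training — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function name, the per-layer Bernoulli gate, and the assumption that every hidden layer shares the input's width (so the identity mapping needs no projection) are illustrative simplifications:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, weights, p=0.5, train=True, rng=None):
    """Toy forward pass with stochastic input shortcut connections.

    During training, each hidden layer's pre-activation receives an
    identity-mapped copy of the network input x with probability p
    (an independent Bernoulli gate per layer). At test time the
    expected shortcut p * x is added instead, mirroring the usual
    train/test treatment of stochastic gates.
    """
    rng = rng or np.random.default_rng(0)
    h = x
    for W in weights:
        pre = h @ W
        if train:
            if rng.random() < p:      # stochastic gate for this layer
                pre = pre + x         # identity shortcut from the input
        else:
            pre = pre + p * x         # deterministic expectation at test time
        h = relu(pre)
    return h

# Tiny usage example: 4-dimensional input, three hidden layers of the same width.
d = 4
init_rng = np.random.default_rng(1)
weights = [init_rng.standard_normal((d, d)) * 0.1 for _ in range(3)]
x = np.ones((1, d))
out = forward(x, weights, p=0.5, train=False)
```

Because the shortcut carries the raw input rather than the previous layer's output, later layers still see an undiluted signal about the data, which is the feature-reuse motivation stated in the abstract.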
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SIGCOM
Disciplines :
Computer science
Author, co-author :
Oyedotun, Oyebade ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Shabayek, Abd El Rahman ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Aouada, Djamila  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Ottersten, Björn ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
External co-authors :
yes
Language :
English
Title :
Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections
Publication date :
31 July 2017
Event name :
24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Event place :
Guangzhou, China
Event date :
November 14–18, 2017
Audience :
International
Main work title :
24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Peer reviewed :
Peer reviewed
Focus Area :
Security, Reliability and Trust
FnR Project :
FNR11295431 - Automatic Feature Selection For Visual Recognition, 2016 (01/02/2017-31/01/2021) - Oyebade Oyedotun
Funders :
This work was funded by the National Research Fund (FNR), Luxembourg, under the project reference R-AGR-0424-05-D/Bjorn Ottersten
Available on ORBilu :
since 05 September 2017