Paper published in a book (Scientific colloquia, congresses, conferences and proceedings)
Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections
OYEDOTUN, Oyebade; SHABAYEK, Abd El Rahman; AOUADA, Djamila et al.
2017. In: 24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Peer reviewed
 

Documents


Full text
typeinst_V19_review_V02.pdf
Author preprint (650.2 kB)

All documents in ORBilu are protected by a usage license.




Details



Keywords:
Deep neural networks; residual learning; optimization
Abstract:
Many works have posited the benefit of depth in deep networks. However, one of the problems encountered in the training of very deep networks is feature reuse; that is, features are 'diluted' as they are forward propagated through the model. Hence, later network layers receive less informative signals about the input data, consequently making training less effective. In this work, we address the problem of feature reuse by taking inspiration from an earlier work which employed residual learning for alleviating this problem. We propose a modification of residual learning for training very deep networks to realize improved generalization performance; for this, we allow stochastic shortcut connections of identity mappings from the input to hidden layers. We perform extensive experiments using the USPS and MNIST datasets. On the USPS dataset, we achieve an error rate of 2.69% without employing any form of data augmentation (or manipulation). On the MNIST dataset, we reach a comparable state-of-the-art error rate of 0.52%. Notably, these results are achieved without employing any explicit regularization technique.
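The stochastic input shortcuts described in the abstract can be sketched in plain NumPy. This is an illustrative toy, not the authors' implementation: layer widths are assumed equal to the input dimension so the identity mapping needs no projection, the `p_shortcut` probability is a hypothetical parameter, and at test time the expected shortcut contribution is used (by analogy with dropout-style averaging).

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, weights, p_shortcut=0.5, train=True):
    """Forward pass of a deep MLP where each hidden layer may
    receive a stochastic identity shortcut from the input x.

    Sketch assumption: every layer has the same width as x, so
    the identity mapping needs no learned projection."""
    h = x
    for W in weights:
        pre = W @ h
        if train:
            # With probability p_shortcut, add the raw input
            # (identity shortcut) to this layer's pre-activation.
            if rng.random() < p_shortcut:
                pre = pre + x
        else:
            # At test time, use the expected shortcut contribution
            # instead of sampling (dropout-style averaging).
            pre = pre + p_shortcut * x
        h = np.maximum(pre, 0.0)  # ReLU
    return h

d = 8
x = rng.standard_normal(d)
# Ten hidden layers with small random weights.
weights = [rng.standard_normal((d, d)) * 0.1 for _ in range(10)]
out = forward(x, weights, train=False)
print(out.shape)
```

The intuition, per the abstract, is that these direct input shortcuts keep later layers supplied with informative signals about the input, counteracting feature dilution in very deep stacks.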
Research center:
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SIGCOM
Disciplines:
Computer science
Author, co-author:
OYEDOTUN, Oyebade ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
SHABAYEK, Abd El Rahman ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
AOUADA, Djamila ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
OTTERSTEN, Björn ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
External co-authors:
yes
Document language:
English
Title:
Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections
Publication date:
31 July 2017
Event name:
24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Event location:
Guangzhou, China
Event date:
November 14–18, 2017
Event scope:
International
Title of the main work:
24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Peer reviewed:
Peer reviewed
Focus Area:
Security, Reliability and Trust
FNR project:
FNR11295431 - Automatic Feature Selection For Visual Recognition, 2016 (01/02/2017-31/01/2021) - Oyebade Oyedotun
Funding body:
This work was funded by the National Research Fund (FNR), Luxembourg, under the project reference R-AGR-0424-05-D/Bjorn Ottersten
Available on ORBilu:
since 5 September 2017

Statistics

Number of views
306 (of which 40 from Unilu)
Number of downloads
511 (of which 43 from Unilu)

Scopus® citations
16
Scopus® citations excluding self-citations
8
