Reference : Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections
Scientific congresses, symposiums and conference proceedings : Paper published in a book
Engineering, computing & technology : Computer science
Security, Reliability and Trust
http://hdl.handle.net/10993/32080
Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections
English
Oyedotun, Oyebade
Shabayek, Abd El Rahman
Aouada, Djamila
Ottersten, Björn
31-Jul-2017
24th International Conference on Neural Information Processing, Guangzhou, China, November 14–18, 2017
Yes
No
International
November 14–18, 2017
Guangzhou
China
[en] Deep neural networks ; residual learning ; optimization
[en] Many works have posited the benefit of depth in deep networks. However, one of the problems encountered in training very deep networks is feature reuse: features are 'diluted' as they are forward-propagated through the model, so later network layers receive less informative signals about the input data, making training less effective. In this work, we address the problem of feature reuse, taking inspiration from an earlier work that employed residual learning to alleviate it. We propose a modification of residual learning for training very deep networks that realizes improved generalization performance; to this end, we allow stochastic shortcut connections of identity mappings from the input to hidden layers. We perform extensive experiments using the USPS and MNIST datasets. On the USPS dataset, we achieve an error rate of 2.69% without employing any form of data augmentation (or manipulation). On the MNIST dataset, we reach a comparable state-of-the-art error rate of 0.52%. Notably, these results are achieved without employing any explicit regularization technique.
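The mechanism described in the abstract can be sketched roughly as follows. This is an illustrative reading, not the authors' implementation: the keep probability `p_keep`, the uniform layer width, the ReLU activation, and the dropout-style test-time scaling of the shortcut are all assumptions introduced for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def forward(x, weights, p_keep=0.5, train=True):
    """Forward pass with stochastic input shortcut connections (sketch).

    During training, each hidden layer receives, with probability p_keep
    (a hypothetical hyper-parameter), an identity shortcut of the network
    input x added to its pre-activation. At test time the shortcut is
    scaled by p_keep instead, mirroring dropout-style expectation scaling
    (an assumption, not taken from the paper).
    """
    h = x
    for W in weights:
        z = W @ h
        if train:
            gate = rng.binomial(1, p_keep)  # stochastic on/off shortcut
            z = z + gate * x                # identity mapping from the input
        else:
            z = z + p_keep * x              # expected shortcut at test time
        h = relu(z)
    return h

# Toy usage: a deep stack of small square layers (hypothetical sizes).
d = 8
weights = [rng.standard_normal((d, d)) * 0.1 for _ in range(5)]
x = rng.standard_normal(d)
out = forward(x, weights)
```

Because the shortcut carries the raw input directly to each hidden layer, later layers still see an undiluted signal about the input even when the network is very deep, which is the feature-reuse problem the abstract targets.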
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SIGCOM
This work was funded by the National Research Fund (FNR), Luxembourg, under the project reference R-AGR-0424-05-D/Bjorn Ottersten
Researchers ; Professionals ; Students
FnR ; FNR11295431 > Oyebade Oyedotun > AVR > Automatic Feature Selection For Visual Recognition > 01/02/2017 > 31/01/2021 > 2016