Poster (Scientific congresses, symposiums and conference proceedings)
Why do Deep Neural Networks with Skip Connections and Concatenated Hidden Representations Work?
Oyedotun, Oyebade; Aouada, Djamila
2020, The 27th International Conference on Neural Information Processing (ICONIP2020)
 

Files

Full Text: OyedotunAouada_ICONIP2020_full paper.pdf — Author postprint (2.13 MB)



Details



Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SIGCOM
Disciplines :
Computer science
Author, co-author :
Oyedotun, Oyebade; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SnT) > CVI2
Aouada, Djamila; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SnT) > CVI2
External co-authors :
no
Language :
English
Title :
Why do Deep Neural Networks with Skip Connections and Concatenated Hidden Representations Work?
Publication date :
18 November 2020
Event name :
The 27th International Conference on Neural Information Processing (ICONIP2020)
Event date :
18 November 2020 to 22 November 2020
FnR Project :
FNR11295431 - Automatic Feature Selection For Visual Recognition, 2016 (01/02/2017-31/01/2021) - Oyebade Oyedotun
Funders :
FNR - Fonds National de la Recherche [LU]
Available on ORBilu :
since 15 October 2020

Statistics


Number of views
126 (18 by Unilu)
Number of downloads
336 (14 by Unilu)

