Doctoral thesis (Dissertations and theses)
Online Learning Using Distributed Neural Networks
DALLE LUCCA TOSI, Mauro
2024
 

Files


Full Text
2024_Online_Learning_Using_Distributed_Neural_Networks.pdf
Author postprint (7.12 MB)
In reference to IEEE copyrighted material which is used with permission in this thesis, the IEEE does not endorse any of University of Luxembourg’s products or services. Internal or personal use of this material is permitted. If interested in reprinting/republishing IEEE copyrighted material for advertising or promotional purposes or for creating new collective works for resale or redistribution, please go to http://www.ieee.org/publications_standards/publications/rights/rights_link.html to learn how to obtain a License from RightsLink. If applicable, University Microfilms and/or ProQuest Library, or the Archives of Canada may supply single copies of the dissertation.


All documents in ORBilu are protected by a user license.

Details



Keywords :
Online Learning; ASGD; Concept Drift; Neural Networks
Abstract :
Online Learning (OL), a sub-field of Machine Learning (ML), addresses time-sensitive problems by learning iteratively from data streams. Its central challenge is concept drift: the data distribution evolves over time, so algorithms must adapt dynamically to keep predictions accurate. Traditional OL algorithms are efficient and less resource-intensive than conventional ML methods, but they often fall short on non-linear, high-dimensional problems. This gap has led to the integration of Artificial Neural Networks (ANNs) into OL, albeit with limitations: existing solutions support real-time inference but require offline training, which causes periods of low-quality predictions during concept drifts. This doctoral thesis explores the potential of training ANN models online in an OL setting, a multidisciplinary endeavor encompassing Big Data, Machine Learning, Optimization, Distributed Computing, and System Development.

The thesis introduces TensAIR, the first OL system capable of training ANN models online. TensAIR exploits the iterative nature of Stochastic Gradient Descent (SGD) to train models directly from data streams. Its asynchronous and decentralized architecture processes 6 to 116 times more data samples per second than baselines modified for online training.

The thesis further addresses the unnecessary retraining of ANN models that have already converged and are unaffected by concept drift. Commonly used concept-drift detectors are often tailored to binary data and exhibit high false-positive rates, which makes them less suitable for ANN models. To close this gap, we introduce OPTWIN, a detector that evaluates shifts in both the mean and the standard deviation of the prediction errors. OPTWIN matches the recall of existing detectors while reducing false positives, which translates into a 21% decrease in the time required to retrain models compared to traditional methods. This gain in retraining efficiency is a major step toward maintaining the accuracy and reliability of ANN models in dynamic OL environments.

Beyond the empirical evidence for the effectiveness of TensAIR, the thesis makes a theoretical contribution by proving the convergence rate of the newly proposed decentralized and asynchronous Stochastic Gradient Descent (DASGD) algorithm. The proof establishes that DASGD achieves a convergence rate on par with traditional synchronous methods, particularly when average communication delays are low. DASGD thus matches the efficiency of synchronous approaches while avoiding the computational and communication bottlenecks associated with synchronous processing, and it has the potential to accelerate the training of ANN models in OL settings.

In conclusion, this research demonstrates the viability of training ANN models in an OL context. It offers practical solutions in TensAIR and OPTWIN and contributes theoretically with the DASGD algorithm. This work is expected to inspire further research and to enable novel applications and use cases that leverage ANN models trained in Online Learning environments.
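
To make the online-training idea in the abstract concrete, here is a minimal sketch of per-batch SGD on a stream. It assumes TensorFlow/Keras is available; the toy model, the synthetic stream, and all names are illustrative and do not reflect TensAIR's actual API.

```python
# Minimal sketch of online ANN training in the spirit of TensAIR:
# one SGD step per mini-batch as it arrives from an unbounded stream,
# instead of offline training on a fixed dataset.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01), loss="mse")

def stream_of_batches(n_batches=1000, batch_size=64):
    """Stand-in for an unbounded data stream (e.g. a message queue)."""
    for _ in range(n_batches):
        x = np.random.randn(batch_size, 10).astype("float32")
        y = x.sum(axis=1, keepdims=True)           # toy regression target
        yield x, y

for x_batch, y_batch in stream_of_batches():
    loss = model.train_on_batch(x_batch, y_batch)  # one online SGD update
```

In TensAIR, analogous updates run asynchronously across decentralized dataflow operators; this single-process loop only illustrates the per-batch SGD step itself.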
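The drift criterion described for OPTWIN, monitoring shifts in both the mean and the standard deviation of prediction errors, can be sketched with two sliding sub-windows. The fixed split point, the thresholds, and the class name below are simplifying assumptions, not OPTWIN's actual optimal-window procedure.

```python
# Illustrative drift check in the spirit of OPTWIN: flag concept drift when
# either the mean or the standard deviation of recent prediction errors
# shifts noticeably with respect to an older reference window.
from collections import deque
import math

class MeanStdDriftDetector:
    def __init__(self, window=500, z_mean=3.0, std_ratio=1.5):
        self.errors = deque(maxlen=2 * window)   # reference + recent errors
        self.window = window
        self.z_mean = z_mean                     # z-score threshold on the mean
        self.std_ratio = std_ratio               # allowed std-deviation ratio

    def add(self, error):
        """Record one prediction error; return True if drift is detected."""
        self.errors.append(error)
        if len(self.errors) < 2 * self.window:
            return False                          # not enough data yet
        ref = list(self.errors)[: self.window]    # older (reference) errors
        cur = list(self.errors)[self.window :]    # most recent errors
        m_ref = sum(ref) / len(ref)
        m_cur = sum(cur) / len(cur)
        s_ref = math.sqrt(sum((e - m_ref) ** 2 for e in ref) / (len(ref) - 1))
        s_cur = math.sqrt(sum((e - m_cur) ** 2 for e in cur) / (len(cur) - 1))
        # Mean shift: recent mean deviates by more than z_mean standard errors.
        se = (s_ref / math.sqrt(len(cur))) or 1e-12   # guard against zero std
        mean_shift = abs(m_cur - m_ref) / se > self.z_mean
        # Std shift: recent variability grows or shrinks beyond the ratio.
        std_shift = s_cur > self.std_ratio * s_ref or s_ref > self.std_ratio * s_cur
        return mean_shift or std_shift
```

A full detector such as OPTWIN additionally searches for an optimal split between the two sub-windows and applies proper statistical tests with confidence guarantees, aiming to keep recall high while lowering false positives.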
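Finally, the barrier-free training regime analyzed for DASGD can be illustrated in a single process: each worker holds its own model replica, broadcasts its gradients, and applies whatever possibly stale peer gradients have arrived, without ever synchronizing. The threads, the toy quadratic objective, and the constants below are illustrative assumptions, not the thesis implementation.

```python
# Toy decentralized asynchronous SGD (DASGD-style): workers exchange
# gradients via queues and apply them whenever they arrive, so updates
# may be based on stale information, yet the replicas still converge.
import queue
import threading
import numpy as np

DIM, LR, STEPS = 5, 0.05, 200
target = np.ones(DIM)                        # minimize f(w) = ||w - target||^2

def worker(rank, inbox, peers):
    rng = np.random.default_rng(rank)
    w = rng.standard_normal(DIM)             # this worker's model replica
    for _ in range(STEPS):
        g = 2.0 * (w - target)               # local gradient of f
        w -= LR * g                          # local SGD step
        for peer_inbox in peers:             # share the gradient, no barrier
            peer_inbox.put(g)
        while True:                          # apply any peer gradients on hand
            try:
                w -= LR * inbox.get_nowait() # possibly computed on stale models
            except queue.Empty:
                break
    print(f"worker {rank}: distance to optimum = {np.linalg.norm(w - target):.4f}")

inboxes = [queue.Queue() for _ in range(3)]
threads = [
    threading.Thread(
        target=worker,
        args=(r, inboxes[r], [q for i, q in enumerate(inboxes) if i != r]),
    )
    for r in range(3)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The convergence result in the thesis concerns exactly this regime: when average communication delays stay low, such asynchronous updates converge at a rate comparable to synchronous SGD.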
Disciplines :
Computer science
Author, co-author :
DALLE LUCCA TOSI, Mauro  ;  University of Luxembourg > Faculty of Science, Technology and Medicine > Department of Computer Science > Team Martin THEOBALD
Language :
English
Title :
Online Learning Using Distributed Neural Networks
Defense date :
28 March 2024
Institution :
Unilu - University of Luxembourg [Faculty of Science, Technology and Medicine], Esch-sur-Alzette, Luxembourg
Degree :
Docteur en Informatique (DIP_DOC_0006_B)
Promotor :
THEOBALD, Martin ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
President :
PANG, Jun  ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Jury member :
STICH, Sebastian;  CISPA Helmholtz Center for Information Security
MOCANU, Decebal Constantin  ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
DOS REIS, Julio Cesar;  State University of Campinas
Focus Area :
Computational Sciences
FnR Project :
FNR12252781 - Data-driven Computational Modelling And Applications, 2017 (01/09/2018-28/02/2025) - Andreas Zilian
Funders :
FNR - Luxembourg National Research Fund
Funding number :
12252781
Funding text :
This work is funded by the Luxembourg National Research Fund under the PRIDE program (PRIDE17/12252781).
Available on ORBilu :
since 26 April 2024

