Article (Scientific journals)
A Hybrid Architecture for Federated and Centralized Learning
Elbir, Ahmet M.; Coleri, Sinem; Papazafeiropoulos, Anastasios K. et al.
2022, in IEEE Transactions on Cognitive Communications and Networking, 8 (3), pp. 1529-1542
Peer reviewed, verified by ORBi
 

Documents


Full text
paper_2_preprint.pdf
Author postprint (6.06 MB)

All documents in ORBilu are protected by a usage license.

Details



Keywords:
centralized learning; edge efficiency; edge intelligence; federated learning; Machine learning; Centralised; Collaborative Work; Communication overheads; Computational modelling; Computational resources; Hybrid architectures; Hardware and Architecture; Computer Networks and Communications; Artificial Intelligence
Abstract:
[en] Many machine learning tasks rely on centralized learning (CL), which requires the transmission of local datasets from the clients to a parameter server (PS) and thus entails a huge communication overhead. To overcome this, federated learning (FL) has been suggested as a promising tool, wherein the clients send only the model updates to the PS instead of the whole dataset. However, FL demands powerful computational resources from the clients, and in practice not all clients have sufficient computational resources to participate in training. To address this common scenario, we propose a more efficient approach called hybrid federated and centralized learning (HFCL), wherein only the clients with sufficient resources employ FL, while the remaining ones send their datasets to the PS, which computes the model on their behalf. The model parameters are then aggregated at the PS. To improve the efficiency of dataset transmission, we propose two different techniques: i) increased computation-per-client and ii) sequential data transmission. Notably, the HFCL frameworks outperform FL by up to 20% in learning accuracy when only half of the clients perform FL, while incurring 50% less communication overhead than CL, since all the clients collaborate on the learning process with their datasets.
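
The abstract describes the HFCL split: resource-rich clients compute model updates locally (federated part), resource-limited clients upload their datasets so the PS computes updates for them (centralized part), and all updates are aggregated at the PS. The following minimal Python sketch illustrates that loop on a toy linear-regression task; the synthetic data, 50/50 client split, model, and hyperparameters are illustrative assumptions, not the paper's configuration.

import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 8
FL_CLIENTS = list(range(4))       # resource-rich clients: compute updates locally
CL_CLIENTS = list(range(4, 8))    # resource-limited clients: upload data to the PS

DIM, SAMPLES_PER_CLIENT = 10, 100
true_w = rng.normal(size=DIM)

# Each client holds a local dataset (X_k, y_k).
datasets = []
for _ in range(NUM_CLIENTS):
    X = rng.normal(size=(SAMPLES_PER_CLIENT, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=SAMPLES_PER_CLIENT)
    datasets.append((X, y))

def gradient(w, X, y):
    # Mean-squared-error gradient for a linear model.
    return 2.0 * X.T @ (X @ w - y) / len(y)

w = np.zeros(DIM)   # global model kept at the parameter server (PS)
lr = 0.05

# Centralized part: resource-limited clients transmit their datasets to the PS once.
ps_data = [datasets[k] for k in CL_CLIENTS]

for _round in range(50):
    grads = []
    # Federated part: active clients compute updates on their own data.
    for k in FL_CLIENTS:
        X, y = datasets[k]
        grads.append(gradient(w, X, y))
    # Centralized part: the PS computes updates on behalf of passive clients.
    for X, y in ps_data:
        grads.append(gradient(w, X, y))
    # Aggregation at the PS: average over all clients, then update the global model.
    w -= lr * np.mean(grads, axis=0)

print("distance to true weights:", np.linalg.norm(w - true_w))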
Disciplines:
Computer science
Author, co-author:
Elbir, Ahmet M. ;  Duzce University, Department of Electrical and Electronics Engineering, Duzce, Turkey ; University of Luxembourg, SnT, Luxembourg City, Luxembourg
Coleri, Sinem ;  Koc University, Department of Electrical and Electronics Engineering, Istanbul, Turkey
Papazafeiropoulos, Anastasios K. ;  Duzce University, Department of Electrical and Electronics Engineering, Duzce, Turkey ; University of Hertfordshire, CIS Research Group, Hatfield, United Kingdom
Kourtessis, Pandelis ;  University of Hertfordshire, CIS Research Group, Hatfield, United Kingdom
CHATZINOTAS, Symeon  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SigCom ; Duzce University, Department of Electrical and Electronics Engineering, Duzce, Turkey
External co-authors:
yes
Document language:
English
Title:
A Hybrid Architecture for Federated and Centralized Learning
Publication date:
September 2022
Journal title:
IEEE Transactions on Cognitive Communications and Networking
eISSN:
2332-7731
Publisher:
Institute of Electrical and Electronics Engineers Inc.
Volume:
8
Issue:
3
Pages:
1529-1542
Peer reviewed:
Peer reviewed, verified by ORBi
Funding body:
ERC project AGNOSTIC; CHIST-ERA
Scientific and Technological Council of Turkey
Available on ORBilu:
since 30 November 2023

Statistics


Number of views
80 (of which 0 from Unilu)
Number of downloads
102 (of which 0 from Unilu)

Scopus® citations: 43
Scopus® citations (excluding self-citations): 40
OpenCitations: 5
OpenAlex citations: 46
WoS citations: 34
