FedFog: Network-Aware Optimization of Federated Learning over Wireless Fog-Cloud Systems

Van-Dinh Nguyen, Symeon Chatzinotas, Björn Ottersten, Trung Q. Duong

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)
31 Downloads (Pure)

Abstract

Federated learning (FL) is capable of performing large distributed machine learning tasks across multiple edge users by periodically aggregating trained local parameters. To address key challenges of enabling FL over a wireless fog-cloud system (e.g., non-i.i.d. data and user heterogeneity), we first propose an efficient FL algorithm based on Federated Averaging (called FedFog) that performs local aggregation of gradient parameters at fog servers and the global training update at the cloud. Next, we employ FedFog in wireless fog-cloud systems by investigating a novel network-aware FL optimization problem that strikes a balance between the global loss and completion time. An iterative algorithm is then developed to obtain a precise measurement of the system performance, which helps design an efficient stopping criterion for outputting an appropriate number of global rounds. To mitigate the straggler effect, we propose a flexible user aggregation strategy that trains fast users first, until a certain level of accuracy is reached, before allowing slow users to join the global training updates. Extensive numerical results using several real-world FL tasks are provided to verify the theoretical convergence of FedFog. We also show that the proposed co-design of FL and communication is essential to substantially improve resource utilization while achieving comparable accuracy of the learning model.
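The two-tier aggregation described in the abstract — fog servers averaging their users' local gradients, and the cloud combining the fog-level aggregates into one global update — can be sketched roughly as below. This is an illustrative toy implementation under assumed data-size weighting (standard in Federated Averaging), not the paper's exact FedFog algorithm; all function names, weights, and the learning rate are hypothetical.

```python
import numpy as np

def fog_aggregate(user_grads, user_sizes):
    """Data-size-weighted average of local gradients at one fog server
    (assumed weighting, in the style of Federated Averaging)."""
    total = sum(user_sizes)
    return sum(g * (n / total) for g, n in zip(user_grads, user_sizes))

def cloud_update(model, fog_grads, fog_sizes, lr=0.1):
    """Global training update at the cloud from fog-level aggregates."""
    total = sum(fog_sizes)
    global_grad = sum(g * (n / total) for g, n in zip(fog_grads, fog_sizes))
    return model - lr * global_grad

# Toy round: two fog servers, a 2-parameter model.
model = np.array([1.0, -0.5])
fog1 = fog_aggregate([np.array([0.2, 0.1]), np.array([0.4, -0.1])],
                     user_sizes=[10, 30])   # fog server 1: two users
fog2 = fog_aggregate([np.array([-0.2, 0.3])],
                     user_sizes=[20])       # fog server 2: one user
model = cloud_update(model, [fog1, fog2], fog_sizes=[40, 20])
```

The paper's flexible user aggregation strategy would, in this picture, simply restrict which users contribute to `fog_aggregate` in early rounds (fast users first), admitting slow users only once a target accuracy is reached.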
Original language: English
Number of pages: 18
Journal: IEEE Transactions on Wireless Communications
Early online date: 20 Apr 2022
Publication status: Early online date - 20 Apr 2022

