Edge-based communication optimization for distributed federated learning

Tian Wang, Yan Liu, Xi Zheng, Hong Ning Dai, Weijia Jia, Mande Xie*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

39 Citations (Scopus)


Federated learning enables distributed machine learning without sharing the private and sensitive data of end devices. However, highly concurrent access to cloud servers increases the transmission delay of model updates. Moreover, some local models may be unnecessary because their gradients oppose that of the global model, incurring substantial additional communication cost. Existing work mainly focuses on reducing communication rounds or cleaning defective local data, and neither approach addresses the latency caused by high server concurrency. To this end, we study an edge-based communication optimization framework that reduces the number of end devices directly connected to the parameter server while avoiding the upload of unnecessary local updates. Specifically, we cluster devices in the same network location and deploy mobile edge nodes in different network locations to serve as hubs for communication between the cloud and end devices, thereby avoiding the latency associated with high server concurrency. Meanwhile, we propose a cosine-similarity-based method to filter out unnecessary models, avoiding unnecessary communication. Experimental results show that, compared with traditional federated learning, the proposed scheme reduces the number of local updates by 60% and increases the convergence speed of the evaluated model by 10.3%.
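The cosine-similarity filter described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold value, the representation of updates as flat vectors, and the function names are assumptions.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two flattened model-update vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def filter_updates(local_updates, global_update, threshold=0.0):
    """Keep only local updates whose direction agrees with the global update.

    An update whose gradient opposes the global model (similarity <= threshold,
    e.g. negative similarity for an opposite gradient) is treated as
    unnecessary and is not uploaded. The threshold of 0.0 is a placeholder.
    """
    return [u for u in local_updates
            if cosine_similarity(u, global_update) > threshold]
```

In this sketch, an edge node would apply `filter_updates` to the updates collected from its cluster before forwarding them to the parameter server, so opposing-gradient models never reach the cloud.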

Original language: English
Pages (from-to): 2015-2024
Number of pages: 10
Journal: IEEE Transactions on Network Science and Engineering
Issue number: 4
Early online date: 3 Jun 2021
Publication status: Published - Jul 2022


  • Clustering
  • Collaborative work
  • Communication optimization
  • Computational modeling
  • Data models
  • Data privacy
  • Federated learning
  • Mobile edge nodes
  • Model filtering
  • Optimization
  • Servers
  • Training
