DFLStar: a decentralized federated learning framework with self-knowledge distillation and participant selection

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review


Abstract

Federated learning (FL) is a distributed machine learning paradigm in which clients collaboratively train models in a privacy-preserving manner. While centralized FL (CFL) suffers from single points of failure and performance bottlenecks, decentralized FL (DFL), which relies on inter-client communication, has emerged to eliminate the need for a central entity. However, without the coordination of a central server, heterogeneous data distributions across clients make local models in DFL prone to diverging towards their local objectives, resulting in poor model accuracy. Moreover, each client in DFL must communicate with multiple neighbors, incurring a heavy communication load. To tackle these challenges, we propose a novel DFL framework called DFLStar, which improves DFL from two perspectives. First, to keep local models from diverging significantly towards local data, DFLStar incorporates self-knowledge distillation, enhancing local training by assimilating knowledge from the aggregated model. Second, clients in DFLStar identify and select only the most informative neighbors (based on last-layer model similarity) for parameter exchange, thereby minimizing communication overhead. Our experimental results on two real datasets demonstrate that DFLStar significantly reduces both communication overhead and training time compared to traditional DFL algorithms when reaching a given target accuracy. Furthermore, within a fixed training duration, DFLStar consistently achieves the highest model accuracy among all baselines.
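
As a concrete illustration of the two mechanisms described above, the following is a minimal PyTorch-style sketch of (a) a local training loss that distils knowledge from the aggregated model into the local model, and (b) neighbor selection based on last-layer similarity. The loss weight alpha, the distillation temperature, the top-k cutoff, and the use of cosine similarity (with higher similarity treated as more informative) are illustrative assumptions on our part, not DFLStar's exact formulation.

import torch.nn.functional as F

def local_self_distillation_loss(student_logits, teacher_logits, labels,
                                 alpha=0.5, temperature=2.0):
    # Cross-entropy on local labels, plus a KL term that pulls the local
    # ("student") model towards the aggregated ("teacher") model's soft
    # predictions. alpha and temperature are assumed hyperparameters.
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1 - alpha) * ce + alpha * kd

def select_neighbors(own_last_layer, neighbor_last_layers, k=2):
    # Rank neighbors by cosine similarity between flattened last-layer
    # weight tensors and keep the top-k; the ranking direction (higher
    # similarity = more informative) is an assumption in this sketch.
    own = own_last_layer.flatten()
    scores = {
        nid: F.cosine_similarity(own, w.flatten(), dim=0).item()
        for nid, w in neighbor_last_layers.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]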

Original language: English
Title of host publication: CIKM '24
Subtitle of host publication: proceedings of the 33rd ACM International Conference on Information and Knowledge Management
Place of Publication: New York
Publisher: Association for Computing Machinery
Pages: 2108-2117
Number of pages: 10
ISBN (Electronic): 9798400704369
DOIs
Publication status: Published - 2024
Event: 33rd ACM International Conference on Information and Knowledge Management, CIKM 2024 - Boise, United States
Duration: 21 Oct 2024 – 25 Oct 2024

Conference

Conference: 33rd ACM International Conference on Information and Knowledge Management, CIKM 2024
Country/Territory: United States
City: Boise
Period: 21/10/24 – 25/10/24

Bibliographical note

Copyright the Author(s) 2024. Version archived for private and non-commercial use with the permission of the author(s) and according to publisher conditions. For further rights, please contact the publisher.

Keywords

  • Decentralized Federated Learning
  • Participant Selection
  • Knowledge Distillation
