TY - JOUR
T1 - Communication cost-aware client selection in online federated learning: a Lyapunov approach
AU - Su, Dongyuan
AU - Zhou, Yipeng
AU - Cui, Laizhong
AU - Sheng, Quan Z.
PY - 2024/7
Y1 - 2024/7
N2 - The proliferation of intelligent services raises concerns about data breaches and privacy infringement. To preserve data privacy when training machine learning models, the federated learning (FL) paradigm has emerged. Most existing works assume that training data on FL clients remain static during the entire learning process. However, many real-time intelligent services call for timely processing of continuously generated data, which has fostered the advent of online federated learning (OFL). How to reconcile model utility and communication cost in OFL remains an open problem. To address this challenge, we leverage the Lyapunov optimization framework to devise a novel Low Cost Client Selection (LCCS) algorithm for OFL, which judiciously selects participating clients to maximize model utility at a low communication cost. Specifically, we design the objective as the sum of a penalty function and a Lyapunov drift function so that both gradient-based client valuation and communication cost are taken into account. By minimizing this objective, we derive the LCCS algorithm, which is lightweight enough to execute on clients. Finally, we conduct extensive experiments with traces generated from public datasets. The experimental results demonstrate that, under a fixed communication cost, LCCS achieves the highest model utility compared with state-of-the-art baselines.
KW - Online federated learning
KW - Lyapunov optimization
KW - Client selection
UR - http://www.scopus.com/inward/record.url?scp=85194032582&partnerID=8YFLogxK
U2 - 10.1016/j.comnet.2024.110517
DO - 10.1016/j.comnet.2024.110517
M3 - Article
AN - SCOPUS:85194032582
SN - 1389-1286
VL - 249
SP - 1
EP - 12
JO - Computer Networks
JF - Computer Networks
M1 - 110517
ER -