Exploring the practicality of differentially private federated learning: a local iteration tuning approach

Yipeng Zhou, Runze Wang, Jiahao Liu, Di Wu*, Shui Yu, Yonggang Wen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Although Federated Learning (FL) prevents the exposure of original data samples when collaboratively training machine learning models among decentralized clients, it has been revealed that vanilla FL is still susceptible to adversarial attacks if model parameters are leaked to malicious attackers. To enhance the protection level of FL, Differentially Private Federated Learning (DPFL) has been proposed in recent years. DPFL injects zero-mean noise, randomly generated by differentially private (DP) mechanisms, into local model parameters before they are disclosed. Nevertheless, DP noise can significantly deteriorate model utility, jeopardizing the practicality of DPFL. In this article, we are among the first to explore how to improve the model utility of DPFL by tuning the number of local iterations (LIs) on DPFL clients. Our work shows that such a local iteration tuning approach can effectively mitigate the adverse influence of DP noise on the final model utility. Formally, for the Laplace mechanism, we derive the sensitivity (a measure of the maximum change of the output given two adjacent inputs) with respect to the number of LIs conducted on DPFL clients, and the aggregated variance of Laplace noise at the server side. We further conduct convergence rate analysis to quantify the influence of the Laplace noise on the final model accuracy and determine how to optimally set the number of LIs. Finally, to verify our theoretical findings, we perform extensive experiments using three real-world datasets, namely, Lending Club, MNIST and Fashion-MNIST. The results not only corroborate our analysis, but also demonstrate that our approach significantly improves the practicality of DPFL.
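The Laplace perturbation step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the clipping bound, the per-client sensitivity, and the privacy budget `epsilon` are assumed parameters, and the `clip_update`/`laplace_perturb` helper names are hypothetical.

```python
import numpy as np

def clip_update(update, clip_norm):
    # Bound the L2 norm of a local model update so the sensitivity
    # of the released value is finite (assumed preprocessing step).
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / max(norm, 1e-12))

def laplace_perturb(update, sensitivity, epsilon, rng):
    # Inject zero-mean Laplace noise with scale sensitivity / epsilon
    # into each coordinate before the update leaves the client.
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=update.shape)
    return update + noise

# Example client-side usage with assumed parameter values.
rng = np.random.default_rng(0)
local_update = np.array([0.5, -1.2, 3.0, 0.1])
clipped = clip_update(local_update, clip_norm=1.0)
noisy = laplace_perturb(clipped, sensitivity=1.0, epsilon=2.0, rng=rng)
```

The noise scale grows with the sensitivity, which in the paper's setting depends on the number of local iterations; this is the lever the tuning approach exploits.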

Original language: English
Pages (from-to): 3280-3294
Number of pages: 15
Journal: IEEE Transactions on Dependable and Secure Computing
Volume: 21
Issue number: 4
Early online date: 19 Oct 2023
DOIs
Publication status: Published - 2024
