Gain without pain: offsetting DP-injected noises stealthily in cross-device federated learning

Wenzhuo Yang, Yipeng Zhou, Miao Hu, Di Wu*, Xi Zheng, Jessie Hui Wang, Song Guo, Chao Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


Federated learning (FL) is an emerging paradigm through which decentralized devices can collaboratively train a common model. However, a serious concern is the leakage of privacy from the gradient information exchanged between clients and the parameter server (PS) in FL. To protect gradient information, clients can adopt differential privacy (DP) to add noises and distort original gradients before they are uploaded to the PS. Nevertheless, model accuracy is significantly impaired by DP noises, making DP impracticable in real systems. In this work, we propose a novel noise information secretly sharing (NISS) algorithm that alleviates the disturbance of DP noises by sharing negated noises among clients. We theoretically prove that: 1) if clients are trustworthy, DP noises can be perfectly offset on the PS and 2) clients can easily distort negated DP noises to protect themselves in case other clients are not totally trustworthy, though at the cost of lowered model accuracy. NISS is particularly applicable for FL across multiple Internet of Things (IoT) systems, in which all IoT devices need to collaboratively train a model. To verify the effectiveness and superiority of the NISS algorithm, we conduct experiments with the MNIST and CIFAR-10 data sets. The experimental results verify our analysis and demonstrate that NISS can improve model accuracy by 19% on average and obtain better privacy protection if clients are trustworthy.
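The core idea in the abstract — clients inject DP noise locally but secretly share negated copies of that noise so the injected noises cancel in the server-side aggregate — can be illustrated with a minimal numerical sketch. This is an illustrative toy, not the paper's protocol: the pairing scheme (a simple ring), the Gaussian noise scale, and all variable names are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
num_clients, dim, sigma = 4, 5, 1.0

# Stand-ins for each client's true local gradient.
grads = [rng.normal(size=dim) for _ in range(num_clients)]

# Each client draws its own DP noise to distort its upload...
dp_noise = [rng.normal(scale=sigma, size=dim) for _ in range(num_clients)]

# ...and secretly passes the negated noise to a peer (here, a simple
# ring pairing, assumed for illustration), who folds it into its own
# upload so the injected noises offset each other at the server.
negated = [-dp_noise[(i + 1) % num_clients] for i in range(num_clients)]

uploads = [g + n + m for g, n, m in zip(grads, dp_noise, negated)]

# The parameter server only ever sees distorted individual uploads,
# yet the aggregate equals the true gradient sum: noises cancel.
agg = np.sum(uploads, axis=0)
true_sum = np.sum(grads, axis=0)
print(np.allclose(agg, true_sum))
```

Each individual upload remains distorted (protecting that client's gradient from the PS), while the sum over all clients is noise-free, which is why the paper reports no accuracy loss when clients are trustworthy.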

Original language: English
Pages (from-to): 22147-22157
Number of pages: 11
Journal: IEEE Internet of Things Journal
Issue number: 22
Early online date: 3 Aug 2021
Publication status: Published - 15 Nov 2022
