Voting-Based Approaches for Differentially Private Federated Learning

Publication Date: 10/6/2020

Event: arXiv

Reference: https://arxiv.org/abs/2010.04851

Authors: Yuqing Zhu (University of California, Santa Barbara; NEC Laboratories America, Inc.); Xiang Yu (NEC Laboratories America, Inc.); Yi-Hsuan Tsai (NEC Laboratories America, Inc.); Francesco Pittaluga (NEC Laboratories America, Inc.); Masoud Faraki (NEC Laboratories America, Inc.); Manmohan Chandraker (NEC Laboratories America, Inc.); Yu-Xiang Wang (University of California, Santa Barbara)

Abstract: Differentially Private Federated Learning (DPFL) is an emerging field with many applications. Gradient-averaging-based DPFL methods require costly communication rounds and hardly work with large-capacity models, due to the explicit dimension dependence in their added noise. In this work, inspired by knowledge transfer in non-federated private learning from Papernot et al. (2017, 2018), we design two new DPFL schemes that vote among the data labels returned by each local model, instead of averaging gradients, which avoids the dimension dependence and significantly reduces the communication cost. Theoretically, by applying secure multi-party computation, we can exponentially amplify the (data-dependent) privacy guarantees when the margin of the voting scores is large. Extensive experiments show that our approaches significantly improve the privacy-utility trade-off over the state of the art in DPFL.
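The label-voting idea the abstract contrasts with gradient averaging can be illustrated with a PATE-style noisy majority vote (the knowledge-transfer mechanism of Papernot et al. that the paper builds on). The sketch below is illustrative only and is not the paper's full scheme: the function name `noisy_label_vote` and all parameters are assumptions, and the actual methods additionally use secure multi-party computation and data-dependent privacy accounting.

```python
import numpy as np

def noisy_label_vote(local_labels, num_classes, epsilon, rng=None):
    """Aggregate one label by noisy majority vote over local models.

    Each party submits only a predicted class index (not a gradient),
    so the added Laplace noise is over `num_classes` vote counts rather
    than the model dimension -- the dimension-independence the abstract
    highlights. Illustrative sketch, not the paper's exact mechanism.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Tally the votes from each local model.
    counts = np.bincount(local_labels, minlength=num_classes).astype(float)
    # Report-noisy-max: perturb each count with Laplace(2/epsilon) noise.
    counts += rng.laplace(scale=2.0 / epsilon, size=num_classes)
    return int(np.argmax(counts))

# Example: 10 local models vote on the label of one unlabeled sample.
# A large voting margin (here 7 vs. 1) makes the noisy winner stable,
# which is also when the data-dependent privacy guarantee is strongest.
votes = np.array([1, 1, 1, 1, 1, 1, 1, 2, 0, 1])
label = noisy_label_vote(votes, num_classes=3, epsilon=5.0,
                         rng=np.random.default_rng(0))
```

When the margin between the top two vote counts is large relative to the noise scale, the noisy argmax almost never changes the outcome; this is the intuition behind the "exponential amplification" of the privacy guarantee mentioned above.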

Publication Link: https://arxiv.org/pdf/2010.04851.pdf