Distributed empirical risk minimization with differential privacy
Date
2024
Authors
Liu, Changxin
Johansson, Karl H.
Shi, Yang
Journal Title
Automatica
Abstract
This work studies the distributed empirical risk minimization (ERM) problem under a differential privacy (DP) constraint. Standard distributed algorithms typically achieve DP by perturbing all local subgradients with noise, which significantly degrades utility. To address this issue, we develop a class of private distributed dual averaging (DDA) algorithms that activate only a fraction of the nodes to perform optimization. This subsampling procedure provably amplifies the DP guarantee, thereby achieving an equivalent level of DP with less noise. We prove that the proposed algorithms incur a utility loss comparable to that of centralized private algorithms for both general and strongly convex problems. When the noise is removed, our algorithm attains the optimal O(1/t) convergence rate for non-smooth stochastic optimization. Finally, experimental results on two benchmark datasets verify the effectiveness of the proposed algorithms.
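To make the mechanism described above concrete, the following is a minimal illustrative sketch (not the authors' exact algorithm) of one idea the abstract names: in each round, only a random fraction of nodes is activated, and each active node contributes a noise-perturbed subgradient to a dual-averaging update. The parameter names (`q` for the sampling fraction, `sigma` for the Gaussian noise scale, `step` for the step size) are assumptions introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_dda_round(z, local_grads, q=0.3, sigma=0.1, step=0.1):
    """One round of a hypothetical noisy distributed dual-averaging scheme.

    z           -- accumulated dual variable (d,)
    local_grads -- per-node local subgradients, shape (n, d)
    q           -- fraction of nodes activated this round (subsampling)
    sigma       -- std. dev. of the Gaussian perturbation added for privacy
    step        -- step size of the primal recovery map
    """
    n, d = local_grads.shape
    active = rng.random(n) < q  # Poisson subsampling of nodes
    if active.any():
        # Only activated nodes release a noise-perturbed subgradient.
        noisy = local_grads[active] + rng.normal(0.0, sigma, (active.sum(), d))
        z = z + noisy.mean(axis=0)  # accumulate into the dual variable
    x = -step * z  # simple unconstrained primal recovery
    return z, x

# Toy usage: 5 nodes, 2-dimensional decision variable, 10 rounds.
grads = rng.normal(size=(5, 2))
z = np.zeros(2)
for _ in range(10):
    z, x = private_dda_round(z, grads)
```

Because inactive nodes release nothing in a round, an observer learns less about any single node's data, which is the intuition behind privacy amplification by subsampling: the same DP level can be met with a smaller `sigma` than when every node is perturbed every round.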
Keywords
distributed optimization, empirical risk minimization, differential privacy, dual averaging
Citation
Liu, C., Johansson, K. H., & Shi, Y. (2024). Distributed empirical risk minimization with differential privacy. Automatica, 162, 111514. https://doi.org/10.1016/j.automatica.2024.111514