Noise Augmentation for Privacy-preserving Empirical Risk Minimization

Friday, January 17, 2020 - 2:00pm
Wexler 304


Fang Liu
Associate Professor
Applied and Computational Mathematics and Statistics
University of Notre Dame


Differential privacy is a mathematically rigorous definition of privacy and provides a theoretical framework with strong privacy guarantees for individuals when releasing or sharing data. In this talk, I will introduce an adaptive Noise Augmentation technique for differentially Private (NAP) Empirical Risk Minimization (ERM) problems. With appropriately designed augmented data, NAP-ERM iteratively applies non-private ERM solvers to promote regularization and sparsity in model estimation while simultaneously achieving differential privacy. NAP-ERM yields strong convexity of the noise-perturbed ERM and eliminates the ad-hoc L2 term that existing differentially private ERM approaches employ to achieve strong convexity. As a result, both the excess risk bound and the sample complexity decrease in NAP-ERM, especially when the required strong convexity is guaranteed purely by the target regularization effects. In addition, NAP-ERM enables the privacy budget to be retrieved and recycled to further improve the accuracy of the ERM solutions in some settings. We illustrate, through simulated and real-life data, the advantages of NAP-ERM over existing privacy-preserving ERM learners in differentially private learning of generalized linear models.
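The talk does not specify the NAP mechanism itself, but the core idea of inducing regularization through augmented pseudo-data can be illustrated with a classical, non-private example: appending sqrt(lambda)·I rows to the design matrix (with zero responses) makes an ordinary least-squares solver return the ridge solution. The data and penalty below are hypothetical; a minimal sketch, assuming only NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data (not from the talk).
n, d = 50, 5
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

lam = 2.0  # illustrative L2 penalty strength

# Augment the data: d pseudo-rows sqrt(lam) * I with response 0.
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(d)])
y_aug = np.concatenate([y, np.zeros(d)])

# An ordinary (non-private) least-squares solver on the augmented data...
beta_aug = np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]

# ...recovers the ridge solution (X'X + lam*I)^{-1} X'y exactly.
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
assert np.allclose(beta_aug, beta_ridge)
```

NAP-ERM extends this style of construction: the augmented rows carry carefully designed noise so that the same non-private solver, applied iteratively, also delivers differential privacy and sparsity-promoting regularization.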
