Differential privacy complicates learning procedures because each access to the data carries a privacy cost. For example, standard parameter tuning with a validation set cannot be easily applied. We propose an algorithm for adapting the learning rate of differentially private stochastic gradient descent (DP-SGD) that avoids the need for a validation set. The adaptation is based on the technique of extrapolation: to estimate the error against the gradient flow underlying SGD, we compare the result of one full step with that of two half-steps. We prove the privacy of the method using the moments accountant. We show in experiments that the method is competitive with optimally tuned DP-SGD and with a differentially private version of Adam.
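The extrapolation idea described in the abstract can be illustrated with a minimal, non-private sketch: take one full gradient step and two half-steps from the same point, use their difference as a local error estimate against the gradient flow, and rescale the learning rate toward a target tolerance. The quadratic loss, the tolerance `tol`, the square-root rescaling, and the clipping bounds below are illustrative assumptions, not the paper's exact scheme (which also adds the DP-SGD noise and clipping).

```python
import numpy as np

def grad(theta):
    # Gradient of an illustrative quadratic loss f(theta) = 0.5 * ||theta||^2.
    return theta

def adaptive_step(theta, lr, tol=1e-2):
    """One SGD step with extrapolation-based learning rate adaptation (sketch)."""
    g = grad(theta)
    full = theta - lr * g                        # one full step
    half = theta - 0.5 * lr * g                  # first half-step
    two_half = half - 0.5 * lr * grad(half)      # second half-step
    err = np.linalg.norm(full - two_half)        # local error estimate
    # Rescale lr toward the tolerance; clip the change to damp oscillation.
    # The 0.5-2.0 clip and the lr cap of 1.0 are assumptions for stability.
    scale = np.clip(np.sqrt(tol / max(err, 1e-12)), 0.5, 2.0)
    return two_half, min(lr * scale, 1.0)

theta = np.ones(5)
lr = 0.5
for _ in range(50):
    theta, lr = adaptive_step(theta, lr)
```

Because the two-half-step result is a better approximation of the gradient flow, keeping it as the update while feeding the error estimate back into the step size mirrors the step-size control used in adaptive ODE solvers.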
|Publication status||Published - 8 Dec 2018|
|MoE publication type||Not Eligible|
|Event||NeurIPS 2018 Workshop: Privacy Preserving Machine Learning - Montreal, Canada|
Duration: 8 Dec 2018 → 8 Dec 2018
|Conference||NeurIPS 2018 Workshop: Privacy Preserving Machine Learning|
|Period||08/12/2018 → 08/12/2018|
Fields of Science
- 113 Computer and information sciences
Koskela, A. H., & Honkela, A. J. H. (2018). Learning rate adaptation for differentially private stochastic gradient descent. Poster session presented at NeurIPS 2018 Workshop: Privacy Preserving Machine Learning, Montreal, Canada.