Differentially private Bayesian learning on distributed data

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › Peer review

Abstract

Many applications of machine learning, for example in health care, would benefit from methods that can guarantee privacy of data subjects. Differential privacy (DP) has become established as a standard for protecting learning results. Standard DP algorithms either require a single trusted party with access to the entire data, which is a clear weakness, or add prohibitively large amounts of noise. We consider DP Bayesian learning in a distributed setting, where each party holds only a single sample or a few samples of the data. We propose a learning strategy based on a secure multi-party sum function for aggregating summaries from data holders and the Gaussian mechanism for DP. Our method builds on asymptotically optimal and practically efficient DP Bayesian inference, with rapidly diminishing extra cost.
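The strategy described in the abstract can be sketched in a minimal single-machine simulation. All names and parameters below are illustrative, not the paper's implementation: additive secret sharing stands in for the secure multi-party sum primitive, and for simplicity the Gaussian noise is added once after aggregation, whereas in the distributed setting of the paper the noise itself would be generated jointly by the parties.

```python
import numpy as np

rng = np.random.default_rng(42)

# Each party holds one data summary; clipping bounds every party's
# contribution, which gives the sum a finite sensitivity.
C = 1.0
summaries = np.clip(rng.uniform(-2, 2, size=10), -C, C)

# --- Secure multi-party sum via additive secret sharing (simulated) ---
# Each party splits its value into random shares that sum to the value,
# sending one share to each aggregator; no single aggregator ever sees
# an individual summary in the clear.
def additive_shares(value, n_shares, rng):
    masks = rng.normal(0.0, 100.0, size=n_shares - 1)
    return np.append(masks, value - masks.sum())

n_aggregators = 3
aggregator_totals = np.zeros(n_aggregators)
for v in summaries:
    aggregator_totals += additive_shares(v, n_aggregators, rng)

# Combining the aggregators' totals recovers the exact sum.
secure_sum = aggregator_totals.sum()

# --- Gaussian mechanism for (eps, delta)-DP ---
eps, delta = 1.0, 1e-5
sensitivity = 2 * C  # replacing one summary changes the sum by at most 2C
sigma = np.sqrt(2 * np.log(1.25 / delta)) * sensitivity / eps
dp_sum = secure_sum + rng.normal(0.0, sigma)

print(f"exact sum: {summaries.sum():.4f}, DP release: {dp_sum:.4f}")
```

Note the design point this illustrates: the secret sharing protects individual summaries during aggregation, while the calibrated Gaussian noise protects the released sum, so neither mechanism alone suffices in the distributed setting.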
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 30 (NIPS 2017)
Editors: I. Guyon, U.V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, R. Garnett
Number of pages: 10
Volume: 30
Publisher: NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)
Publication date: 2017
Status: Published - 2017
MoE publication type: A4 Article in conference proceedings
Event: Annual Conference on Neural Information Processing Systems - Long Beach, United States
Duration: 4 Dec 2017 - 9 Dec 2017
Conference number: 31
http://nips.cc/Conferences/2017

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)
Volume: 30
ISSN (print): 1049-5258

Fields of science

  • 112 Statistics
  • 113 Computer and information sciences
