Abstract
Variational Bayesian Monte Carlo (VBMC) is a recently introduced framework that uses Gaussian process surrogates to perform approximate Bayesian inference in models with black-box, expensive likelihoods. In this work, we extend VBMC to deal with noisy log-likelihood evaluations, such as those arising from simulation-based models. We introduce new 'global' acquisition functions, such as expected information gain (EIG) and variational interquantile range (VIQR), which are robust to noise and can be evaluated efficiently within the VBMC setting. In a novel, challenging, noisy-inference benchmark comprising a variety of models with real datasets from computational and cognitive neuroscience, VBMC+VIQR achieves state-of-the-art performance in recovering the ground-truth posteriors and model evidence. In particular, our method vastly outperforms 'local' acquisition functions and other surrogate-based inference methods while retaining a small algorithmic cost. Our benchmark corroborates VBMC as a general-purpose technique for sample-efficient black-box Bayesian inference, also in the presence of noisy models.
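The kind of noisy target the method handles can be made concrete with a small sketch: each evaluation returns a stochastic estimate of the log-likelihood together with an estimate of its standard deviation, which a Gaussian process surrogate can then account for. The toy Bernoulli simulator, the function names (`simulate`, `noisy_log_likelihood`), and the number of simulations per evaluation below are illustrative assumptions, not taken from the paper; the paper's benchmark instead obtains such estimates from simulation-based models in computational and cognitive neuroscience.

```python
import numpy as np

def simulate(theta, rng, n_trials=200):
    """Toy simulator: Bernoulli responses whose success probability
    depends on a 2-parameter vector theta (purely illustrative)."""
    p = 1.0 / (1.0 + np.exp(-(theta[0] + theta[1])))
    return (rng.random(n_trials) < p).astype(float)

def noisy_log_likelihood(theta, data, n_sims=100, seed=None):
    """Return (estimate, noise_sd): a stochastic, simulation-based
    estimate of the log-likelihood of `data` under the toy model,
    together with the standard error of that estimate. This mimics the
    interface of a noisy log-likelihood evaluation whose noise
    magnitude is reported alongside the value."""
    rng = np.random.default_rng(seed)
    log_liks = np.empty(n_sims)
    for i in range(n_sims):
        sim = simulate(theta, rng, n_trials=data.size)
        # Smoothed per-simulation estimate of the response probability,
        # avoiding log(0) for all-0 or all-1 simulations.
        p_hat = (sim.sum() + 1.0) / (sim.size + 2.0)
        log_liks[i] = np.sum(data * np.log(p_hat) + (1.0 - data) * np.log1p(-p_hat))
    estimate = log_liks.mean()
    noise_sd = log_liks.std(ddof=1) / np.sqrt(n_sims)
    return estimate, noise_sd

# Example: one noisy evaluation at a candidate parameter vector.
rng = np.random.default_rng(0)
data = simulate(np.array([0.5, -0.2]), rng)
value, sd = noisy_log_likelihood(np.array([0.3, 0.1]), data, seed=1)
print(f"log-likelihood estimate: {value:.2f} +/- {sd:.2f}")
```

A target of this form, returning both the value and its noise magnitude, is what a VBMC implementation configured for noisy evaluations would consume in place of an exact log-likelihood.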
Original language | English |
---|---|
Title of host publication | NeurIPS 2020 |
Number of pages | 12 |
Publisher | Neural Information Processing Systems Foundation |
Publication date | Dec 2020 |
Publication status | Published - Dec 2020 |
MoE publication type | A4 Article in conference proceedings |
Event | Conference on Neural Information Processing Systems (No. 34) - Vancouver, Canada; Duration: 6 Dec 2020 → 12 Dec 2020; https://nips.cc/ |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
Publisher | Neural Information Processing Systems Foundation |
Volume | 33 |
ISSN (Electronic) | 1049-5258 |
Fields of Science
- 113 Computer and information sciences