Practical Equivariances via Relational Conditional Neural Processes

Daolang Huang, Manuel Haussmann, Ulpu Remes, ST John, Grégoire Christophe Clarté, Samuel Kaski, Kevin Luck, Luigi Acerbi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › Peer review

Abstract

Conditional Neural Processes (CNPs) are a class of meta-learning models popular for combining the runtime efficiency of amortized inference with reliable uncertainty quantification. Many relevant machine learning tasks, such as spatio-temporal modeling, Bayesian optimization, and continuous control, inherently contain equivariances -- for example, to translation -- which the model can exploit for maximal performance. However, prior attempts to include equivariances in CNPs do not scale effectively beyond two input dimensions. In this work, we propose Relational Conditional Neural Processes (RCNPs), an effective approach to incorporate equivariances into any neural process model. Our proposed method extends the applicability and impact of equivariant neural processes to higher dimensions. We empirically demonstrate the competitive performance of RCNPs on a large array of tasks naturally containing equivariances.
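The core idea of making a neural process translation-equivariant is to feed the model only relative information about the inputs rather than their absolute coordinates. A minimal sketch of such a relational encoding is below; the function name `relational_encoding` and the concrete feature layout are illustrative simplifications, not the paper's exact architecture.

```python
import numpy as np


def relational_encoding(x_context, y_context, x_target):
    """Toy relational encoding: each target point is represented by its
    difference vectors to all context inputs, paired with the context
    outputs. Differences are unchanged under a global translation of the
    inputs, so any model built on these features is translation-equivariant.
    (Illustrative simplification, not the exact RCNP architecture.)
    """
    # Pairwise differences, shape (n_target, n_context, dim_x).
    diffs = x_target[:, None, :] - x_context[None, :, :]
    # Attach the corresponding context outputs to each difference vector.
    y_rep = np.broadcast_to(
        y_context[None, :, :],
        diffs.shape[:2] + (y_context.shape[-1],),
    )
    return np.concatenate([diffs, y_rep], axis=-1)


# Sanity check: shifting every input by the same offset leaves the
# relational features identical, i.e. they are translation-invariant.
rng = np.random.default_rng(0)
xc = rng.normal(size=(5, 3))   # 5 context inputs in 3 dimensions
yc = rng.normal(size=(5, 1))   # 1-dimensional context outputs
xt = rng.normal(size=(2, 3))   # 2 target inputs
shift = np.array([10.0, -4.0, 2.5])

f_orig = relational_encoding(xc, yc, xt)
f_shifted = relational_encoding(xc + shift, yc, xt + shift)
assert np.allclose(f_orig, f_shifted)
```

Because the features depend on inputs only through differences, a downstream encoder or decoder consuming them inherits translation equivariance for free, in any input dimension.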
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
Editors: A. Oh, T. Neumann, A. Globerson, K. Saenko, M. Hardt, S. Levine
Publisher: Morgan Kaufmann Publishers
Publication date: Dec. 2023
DOI
Status: Published - Dec. 2023
MoE publication type: A4 Article in conference proceedings
Event: Conference on Neural Information Processing Systems 2023 - New Orleans, United States
Duration: 10 Dec. 2023 – 16 Dec. 2023
https://neurips.cc/

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 36
ISSN (electronic): 1049-5258

Fields of science

  • 113 Computer and information sciences
