Dual parameterization of sparse variational Gaussian processes
Access rights: openAccess (publishedVersion)
A4 Article in conference proceedings
This publication is imported from Aalto University research portal.
View publication in the Research portal (opens in new window)
View/Open full text file from the Research portal (opens in new window)
Other link related to publication (opens in new window)
Language: en
Pages: 12
Series: Advances in Neural Information Processing Systems 34 (NeurIPS 2021), Advances in Neural Information Processing Systems
Abstract
Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits. In this paper, we improve their computational efficiency by using a dual parameterization where each data example is assigned dual parameters, similar to the site parameters used in expectation propagation. Our dual parameterization speeds up inference using natural gradient descent and provides a tighter evidence lower bound for hyperparameter learning. The approach has the same memory cost as current SVGP methods, but it is faster and more accurate.
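To make the idea of per-example dual (site) parameters concrete, here is a minimal numpy sketch, not the authors' implementation: it handles the conjugate Gaussian-likelihood case, where the natural-gradient fixed point of the dual parameters is available in closed form (one step with unit step size recovers the standard optimal sparse-GP posterior). All variable names and the toy data are illustrative assumptions.

```python
import numpy as np

# Toy sketch (assumed setup, not the paper's code): dual-parameterized sparse
# GP regression. Each data point i carries two dual parameters (lam1[i],
# lam2[i]), playing the role of EP-style site parameters.

rng = np.random.default_rng(0)
N, M = 50, 10                        # data points, inducing points
X = np.sort(rng.uniform(0.0, 5.0, N))
y = np.sin(X) + 0.1 * rng.standard_normal(N)
Z = np.linspace(0.0, 5.0, M)         # inducing inputs
noise = 0.1 ** 2                     # Gaussian likelihood variance

def rbf(a, b, ls=0.5, var=1.0):
    """Squared-exponential kernel matrix between 1-D input vectors."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

Kuu = rbf(Z, Z) + 1e-8 * np.eye(M)   # inducing-point covariance (jittered)
Kuf = rbf(Z, X)
A = np.linalg.solve(Kuu, Kuf)        # M x N: projects sites onto inducing space

# Dual parameters: one pair per data example, initialized to zero sites.
lam1 = np.zeros(N)
lam2 = np.zeros(N)

rho = 1.0  # natural-gradient step size; 1.0 converges in one step here
for _ in range(3):
    # Assemble q(u) = N(m, S) from the prior and the current sites.
    S = np.linalg.inv(np.linalg.inv(Kuu) + A @ np.diag(lam2) @ A.T)
    m = S @ (A @ lam1)
    mean_f = A.T @ m                 # predictive mean at the training inputs
    # For a Gaussian likelihood the dual fixed point is available exactly:
    lam1 = (1 - rho) * lam1 + rho * y / noise
    lam2 = (1 - rho) * lam2 + rho * np.ones(N) / noise
```

For non-conjugate likelihoods the last two update lines would instead use expectations of log-likelihood gradients under the marginals of q, which is where the dual parameterization pays off; memory stays at two scalars per data example, matching the abstract's claim of unchanged memory cost.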
Citation
Adam, V, Chang, P, Khan, M E & Solin, A 2021, Dual parameterization of sparse variational Gaussian processes. in Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Advances in Neural Information Processing Systems, Curran Associates Inc., Conference on Neural Information Processing Systems, Virtual, Online, 06/12/2021. <https://papers.nips.cc/paper/2021/hash/5fcc629edc0cfa360016263112fe8058-Abstract.html>