Memory-Based Dual Gaussian Processes for Sequential Learning

Access rights

openAccess
publishedVersion

A4 Article in conference proceedings

Date

2023-07

Language

en

Pages

20

Series

Proceedings of the 40th International Conference on Machine Learning, pp. 4035-4054, Proceedings of Machine Learning Research; Volume 202

Abstract

Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning difficult. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.
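
For illustration only, the sketch below shows sequential GP regression that retains a small memory of past observations, written in plain NumPy. It is not the paper's dual sparse variational GP: the class name MemoryGP, the recency-based memory rule, the RBF kernel, and all hyperparameters are assumptions made for this example, whereas the paper selects memory points actively and supports generic likelihoods.

# Minimal illustrative sketch (not the paper's method): sequential GP
# regression where only a small "memory" of past observations is retained.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between row-vector inputs A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

class MemoryGP:
    # GP regression that keeps at most `max_memory` past observations.
    def __init__(self, max_memory=50, noise=0.1):
        self.max_memory = max_memory
        self.noise = noise
        self.X = np.empty((0, 1))
        self.y = np.empty((0,))

    def update(self, x_new, y_new):
        # Append the new batch, then trim to the memory budget.
        self.X = np.vstack([self.X, x_new])
        self.y = np.concatenate([self.y, y_new])
        if len(self.y) > self.max_memory:
            # Naive rule: keep only the most recent points. The paper instead
            # builds the memory actively; this rule is a placeholder.
            self.X = self.X[-self.max_memory:]
            self.y = self.y[-self.max_memory:]

    def predict(self, X_test):
        # Exact GP regression on the current memory.
        K = rbf_kernel(self.X, self.X) + self.noise**2 * np.eye(len(self.y))
        Ks = rbf_kernel(X_test, self.X)
        alpha = np.linalg.solve(K, self.y)
        mean = Ks @ alpha
        cov = rbf_kernel(X_test, X_test) - Ks @ np.linalg.solve(K, Ks.T)
        return mean, cov

# Stream data in small batches and predict after the updates.
rng = np.random.default_rng(0)
model = MemoryGP(max_memory=30)
for _ in range(10):
    x = rng.uniform(-3, 3, size=(5, 1))
    y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(5)
    model.update(x, y)
mean, cov = model.predict(np.linspace(-3, 3, 7)[:, None])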

Citation

Chang, P E, Verma, P, John, S T, Solin, A & Emtiyaz Khan, M 2023, Memory-Based Dual Gaussian Processes for Sequential Learning. in A Krause, E Brunskill, K Cho, B Engelhardt, S Sabato & J Scarlett (eds), Proceedings of the 40th International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 202, JMLR, pp. 4035-4054, International Conference on Machine Learning, Honolulu, Hawaii, United States, 23/07/2023. <https://proceedings.mlr.press/v202/chang23a.html>