State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

Access rights

openAccess

A4 Article in conference proceedings

Date

2020-07-13

Language

en

Pages

10270-10281

Series

Proceedings of the 37th International Conference on Machine Learning, Proceedings of Machine Learning Research, Volume 119

Abstract

We formulate approximate Bayesian inference in non-conjugate temporal and spatio-temporal Gaussian process models as a simple parameter update rule applied during Kalman smoothing. This viewpoint encompasses most inference schemes, including expectation propagation (EP), the classical (extended, unscented, etc.) Kalman smoothers, and variational inference. We provide a unifying perspective on these algorithms, showing how replacing the power EP moment-matching step with linearisation recovers the classical smoothers. EP offers benefits over the traditional methods through its introduction of the so-called cavity distribution; we combine these benefits with the computational efficiency of linearisation, and provide extensive empirical analysis demonstrating the efficacy of the various algorithms under this unifying framework. We provide a fast implementation of all methods in JAX.
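
The following is a minimal, illustrative sketch (not the authors' implementation) of the kind of per-time-step update the abstract refers to: given a cavity distribution N(f; cav_mean, cav_var) and a non-conjugate likelihood p(y | f), an EP-style update moment-matches the tilted distribution via derivatives of its log normaliser. The probit likelihood, the Gauss-Hermite quadrature, and all names below are assumptions made purely for this example.

import numpy as np
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp
from jax.scipy.stats import norm

def log_lik(f, y):
    # Probit (binary classification) likelihood with y in {-1, +1}; illustrative only.
    return norm.logcdf(y * f)

def log_Z(cav_mean, cav_var, y, order=20):
    # log Z = log \int p(y | f) N(f; cav_mean, cav_var) df,
    # approximated with Gauss-Hermite quadrature (nodes/weights are fixed constants).
    x, w = np.polynomial.hermite.hermgauss(order)
    f = cav_mean + jnp.sqrt(2.0 * cav_var) * x
    return logsumexp(log_lik(f, y), b=w / jnp.sqrt(jnp.pi))

# Derivatives of log Z w.r.t. the cavity mean give the tilted (matched) moments.
dlogZ = jax.grad(log_Z, argnums=0)
d2logZ = jax.grad(dlogZ, argnums=0)

def moment_match(cav_mean, cav_var, y):
    # Standard EP identities:
    #   E[f]   = cav_mean + cav_var   * d(log Z)/d(cav_mean)
    #   Var[f] = cav_var  + cav_var^2 * d^2(log Z)/d(cav_mean)^2
    g = dlogZ(cav_mean, cav_var, y)
    h = d2logZ(cav_mean, cav_var, y)
    return cav_mean + cav_var * g, cav_var + cav_var ** 2 * h

# Example: posterior moments for one observation y = +1 with cavity N(0.3, 1.5).
post_mean, post_var = moment_match(0.3, 1.5, 1.0)

Replacing this moment-matching step with a linearisation of the likelihood model about the cavity mean is what, per the abstract, recovers the classical (extended, unscented, etc.) Kalman smoother updates.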

Citation

Wilkinson, W., Chang, P., Riis Andersen, M. & Solin, A. 2020, 'State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes', in Proceedings of the 37th International Conference on Machine Learning, Proceedings of Machine Learning Research, vol. 119, JMLR, pp. 10270-10281, International Conference on Machine Learning, Vienna, Austria, 12/07/2020. <http://proceedings.mlr.press/v119/wilkinson20a.html>