Bayesian Basis Function Approximation for Scalable Gaussian Process Priors in Deep Generative Models
Access rights
openAccess
CC BY-SA
publishedVersion
A4 Article in conference proceedings
This publication is imported from Aalto University research portal.
Unless otherwise stated, all rights belong to the author. You may download, display, and print this publication for your own personal use. Commercial use is prohibited.
Language
en
Pages
24
Series
Proceedings of the 42nd International Conference on Machine Learning, pp. 2673-2696, Proceedings of Machine Learning Research ; Volume 267
Abstract
High-dimensional time-series datasets are common in domains such as healthcare and economics. Variational autoencoder (VAE) models in which the latent variables are given a Gaussian process (GP) prior have become a prominent model class for analyzing such correlated datasets. However, their application is challenged by the inherent cubic time complexity of GPs, which requires specific approximation techniques, as well as by the general difficulty of modeling both shared and individual-specific correlations across time. Although inducing points improve the scalability of GP prior VAEs, optimizing them remains challenging, especially because discrete covariates resist gradient-based methods. In this work, we propose a scalable basis function approximation technique for GP prior VAEs that mitigates these challenges and yields linear time complexity, with a global parametrization that eliminates the need for amortized variational inference and the associated amortization gap, making it well suited for conditional generation tasks where accuracy and efficiency are crucial. Empirical evaluations on synthetic and real-world benchmark datasets demonstrate that our approach not only improves scalability and interpretability but also drastically enhances predictive performance.
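To give a sense of how a basis function approximation yields linear-time GP computations, the sketch below is a minimal, generic reduced-rank (Hilbert-space) construction, not the paper's exact parametrization: it assumes a 1-D input, an RBF kernel, and sinusoidal Laplacian eigenfunctions on a bounded interval, so a GP prior draw costs O(N × num_basis) instead of O(N³). All function names and constants here are illustrative choices.

```python
import numpy as np

def rbf_spectral_density(omega, lengthscale=1.0, variance=1.0):
    # Spectral density of the 1-D squared-exponential (RBF) kernel.
    return variance * np.sqrt(2 * np.pi) * lengthscale * np.exp(
        -0.5 * (omega * lengthscale) ** 2)

def basis_features(x, num_basis=32, L=5.0):
    # Laplacian eigenfunctions on [-L, L]: phi_j(x) = sqrt(1/L) sin(pi j (x+L) / (2L)).
    j = np.arange(1, num_basis + 1)
    sqrt_lam = np.pi * j / (2 * L)            # square roots of the eigenvalues
    phi = np.sqrt(1 / L) * np.sin(sqrt_lam * (x[:, None] + L))
    return phi, sqrt_lam

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
phi, sqrt_lam = basis_features(x)             # (200, 32) feature matrix
s = rbf_spectral_density(sqrt_lam)            # prior variances of the basis weights
w = rng.normal(size=s.shape)                  # standard-normal weights
f = phi @ (np.sqrt(s) * w)                    # one approximate GP prior sample, linear in N
```

Because the sample is a fixed feature matrix times a Gaussian weight vector, the weights can be treated as global latent variables and inferred directly, which is the kind of global parametrization the abstract contrasts with amortized inference.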
Citation
Balik, M Y, Sinelnikov, M, Ong, P & Lähdesmäki, H 2025, Bayesian Basis Function Approximation for Scalable Gaussian Process Priors in Deep Generative Models. in Proceedings of the 42nd International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 267, JMLR, pp. 2673-2696. <https://proceedings.mlr.press/v267/balik25a.html>