Robust Bayesian Inference: variable and structure selection and variational inference
School of Science | Doctoral thesis (article-based) | Defence date: 2023-11-03
Unless otherwise stated, all rights belong to the author. You may download, display and print this publication for your own personal use. Commercial use is prohibited.
Language
en
Pages
58 + app. 89
Series
Aalto University publication series DOCTORAL THESES, 158/2023
Abstract
This thesis studies Bayesian inference in the context of high-dimensional and complex models. The main focus is on robustness and reliability, approached through two important topics. First, we study projection predictive inference in the context of variable selection, where the goal is to accurately identify the minimal subset of variables that is relevant for predicting the outcome. Second, we study variational inference and how different choices of variational family, divergence measure and gradient estimator affect posterior inference in high-dimensional problems.

Traditionally, variable selection is carried out as part of the model estimation by incorporating a penalized likelihood term or a sparsifying prior. These approaches favour sparse solutions, but the ultimate variable selection depends on arbitrary criteria imposed by the user (e.g. thresholding the inclusion probability of a variable). Instead, projection predictive inference solves variable selection and estimation in two stages. First, one builds the best-performing model possible, the reference model. Then, one finds the minimal subset of variables whose predictions are closest to those of the reference model. Variable selection becomes a substantially easier problem because a very accurate prediction model is used as the reference.

Variational inference, on the other hand, is a widely used framework for approximate inference in models where exact inference is not tractable. While it has been shown to scale well to large datasets, it has limitations when dealing with high-dimensional data. One limitation is the unreliable estimation of the objective function, which undermines the robustness of the termination criteria of these algorithms.

This thesis studies and connects these topics. We extend projection predictive inference to complex models, such as generalized multilevel models and models whose observation family does not belong to the exponential family. In such cases, the underlying projection cannot be solved exactly, and one needs to rely on approximate inference methods. In complementary work, we develop novel methods to ensure more robust convergence of variational inference algorithms. We also study how variational inference extrapolates to high-dimensional problems and propose a unified framework to better understand its limitations, which directly benefits our projection predictive inference work.
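For orientation, the two objectives mentioned in the abstract can be sketched in standard notation; this is a generic formulation with notation chosen here, not necessarily that of the thesis. Projection predictive inference fits a submodel with parameters \theta_\perp by projecting the reference model M_* onto it, typically by minimizing a Kullback–Leibler divergence from the reference model's posterior predictive distribution, while variational inference fits an approximation q_\phi by maximizing the evidence lower bound (ELBO):

\[
  \theta_\perp \;=\; \arg\min_{\theta}\; \mathrm{KL}\!\left( p(\tilde{y} \mid \mathcal{D}, M_*) \,\middle\|\, q(\tilde{y} \mid \theta) \right),
  \qquad
  \mathrm{ELBO}(\phi) \;=\; \mathbb{E}_{q_\phi(\theta)}\!\left[ \log p(y, \theta) - \log q_\phi(\theta) \right].
\]

In practice the ELBO and its gradients are estimated with Monte Carlo draws from q_\phi, which is the source of the unreliable objective estimates discussed above.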
Supervising professor
Vehtari, Aki, Prof., Aalto University, Department of Computer Science, Finland
Thesis advisor
Vehtari, Aki, Prof., Aalto University, Department of Computer Science, Finland
Parts
- [Publication 1]: Alejandro Catalina, Paul Bürkner and Aki Vehtari. Projection Predictive Inference for Generalized Linear and Additive Multilevel Models. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, Volume 151, pages 4446–4461, 2022.
  Full text in Acris/Aaltodoc: http://urn.fi/URN:NBN:fi:aalto-202209285779
- [Publication 2]: Alejandro Catalina, Paul Bürkner and Aki Vehtari. Latent Space Projection Predictive Inference. Submitted, https://arxiv.org/abs/2109.04702, 2021.
- [Publication 3]: Akash Kumar Dhaka, Alejandro Catalina, Michael Riis Andersen, Mans Magnusson, Jonathan Huggins, and Aki Vehtari. Robust, Accurate Stochastic Optimisation for Variational Inference. Advances in Neural Information Processing Systems, Volume 33, pages 10961–10973, 2020.
- [Publication 4]: Akash Kumar Dhaka, Alejandro Catalina, Michael Riis Andersen, Manushi Welandawe, Jonathan Huggins, and Aki Vehtari. Challenges and Opportunities in High-Dimensional Variational Inference. Advances in Neural Information Processing Systems, Volume 34, pages 7787–7798, 2021.