Sparse Gaussian processes for stochastic differential equations
Perustieteiden korkeakoulu (School of Science)
Master's thesis
Authors
Date
2021-08-23
Department
Major/Subject
Machine Learning, Data Science and Artificial Intelligence (Macadamia)
Mcode
SCI3044
Degree programme
Master’s Programme in Computer, Communication and Information Sciences
Language
en
Pages
59
Series
Abstract
Dynamical systems in the real world are often well represented by stochastic differential equations (SDEs), which incorporate the sources of stochasticity. With recent advances in machine learning (ML), algorithms have been developed to learn SDEs from observations of dynamical systems. The thesis frames SDE learning as an inference problem and aims to maximize the marginal likelihood of the observations in a joint model of the unobserved paths and the observations, linked through an observation model. As this problem is intractable, a variational approximate inference algorithm is employed to maximize a lower bound on the log marginal likelihood instead of the original objective. In the variational framework, Gaussian processes (GPs) have been used as the approximate posterior over paths. However, the resulting algorithms require a fine discretization of the time horizon, leading to high computational complexity. The thesis explores recent advances in exploiting sparse structure in GPs and proposes an alternative parameterization of the approximate distribution over paths using a sparse Markovian Gaussian process. The proposed method is efficient in storage and computation, allowing the use of well-established optimization algorithms such as natural gradient descent. The capability of the proposed method to learn an SDE from observations is demonstrated in two experiments: the Ornstein–Uhlenbeck (OU) process and a double-well process.
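The Ornstein–Uhlenbeck process used in the first experiment is a linear SDE, dx_t = -θ x_t dt + σ dW_t, which can be simulated with the Euler–Maruyama scheme. A minimal sketch follows; the parameter values (θ, σ, step size, horizon) are illustrative choices, not values taken from the thesis:

```python
import numpy as np

def simulate_ou(theta=1.0, sigma=0.5, x0=1.0, dt=0.01, n_steps=1000, seed=0):
    """Euler–Maruyama discretization of dx_t = -theta * x_t dt + sigma * dW_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        # Drift pulls the state toward 0; the diffusion term adds Gaussian
        # noise with standard deviation sigma * sqrt(dt) per step.
        x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

path = simulate_ou()
```

Paths sampled this way (and their analogues for the double-well drift) serve as the kind of observed trajectories from which an SDE-learning method would infer the drift and diffusion.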
Supervisor
Solin, Arno
Thesis advisor
Adam, Vincent
Keywords
Gaussian processes, stochastic differential equations, variational inference, dynamical systems, natural gradient descent