Sampling Methods for Missing Value Reconstruction

School of Science | Master's thesis

Date

2012

Major/Subject

Informaatiotekniikka (Information Technology)

Mcode

T-61

Degree programme

Language

en

Pages

[8] + 60 pp. + app. 10 pp.

Series

Abstract

The main theme of this thesis is the reconstruction of missing values in sparse datasets using algorithms based on sampling methods. A probabilistic principal component analysis model is used to model the data. In contrast to standard principal component analysis, this model makes it possible to avoid overfitting and to use Bayesian inference methods by adding a noise term to the model and introducing prior distributions over the model parameters. The parameters of the model are estimated with approximate inference methods. In particular, variational Bayesian principal component analysis, in which the model parameters are estimated with a maximum likelihood estimate using the expectation maximization algorithm, is used as the baseline method in the experimental part. The other approach to approximate inference is sampling from the posterior distribution. Specifically, Metropolis-Hastings sampling, Gibbs sampling, Langevin dynamics MCMC, and Langevin dynamics MCMC with stochastic gradients are considered. Gradient-based algorithms exploit the geometry of the posterior distribution and move in the direction of higher probability density at each step. The Langevin dynamics sampling method is also generalized by using the natural gradient instead of the standard gradient in the update rules. The presented gradient-based sampling methods are tested on a simple example involving only two parameters, and all sampling methods are applied to the problem of missing value reconstruction in the MovieLens dataset. The results are promising: the proposed sampling methods outperform the variational Bayesian inference approach, suggesting that sampling methods can be applied efficiently to large-scale problems.
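
As an illustration of the gradient-based sampling described in the abstract, the following Python sketch performs unadjusted Langevin dynamics updates for the loading matrix of a toy probabilistic PCA model with missing entries. It is a minimal sketch, not the thesis implementation: the model Y ≈ WX with a binary observation mask M, the choice to hold the latent coordinates X fixed, and all parameter values are illustrative assumptions made here for brevity (the thesis works with the full model).

import numpy as np

def langevin_step(W, X, Y, M, sigma2, prior_var, eps, rng):
    """One Langevin update: W <- W + (eps/2) * grad log p(W | Y, X) + sqrt(eps) * noise."""
    residual = M * (Y - W @ X)                      # residuals on observed entries only
    grad = residual @ X.T / sigma2 - W / prior_var  # Gaussian likelihood + Gaussian prior on W
    return W + 0.5 * eps * grad + np.sqrt(eps) * rng.standard_normal(W.shape)

# Toy data (hypothetical): a rank-3 matrix with roughly 30% of the entries missing.
rng = np.random.default_rng(0)
d, n, k = 20, 100, 3
W_true = rng.standard_normal((d, k))
X = rng.standard_normal((k, n))
Y = W_true @ X + 0.1 * rng.standard_normal((d, n))
M = (rng.random((d, n)) < 0.7).astype(float)

W = rng.standard_normal((d, k))
for _ in range(5000):
    W = langevin_step(W, X, Y, M, sigma2=0.01, prior_var=1.0, eps=1e-5, rng=rng)
Y_hat = W @ X   # reconstruction; missing entries are read off where M == 0

After a burn-in period, averaging W @ X over the retained samples would give a Monte Carlo estimate of the missing entries; the stochastic gradient and natural gradient variants mentioned in the abstract modify only how the gradient term in the update is computed.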

Description

Supervisor

Oja, Erkki

Thesis advisor

Ilin, Alexander

Keywords

sampling methods, Markov chain, Monte Carlo, Gibbs sampling, Langevin dynamics, variational inference, natural gradient, stochastic gradient, sparse dataset

Other note

Citation