Uncertainty in Recurrent Neural Network with Dropout
Perustieteiden korkeakoulu (School of Science) |
Master's thesis
Unless otherwise stated, all rights belong to the author. You may download, display and print this publication for your own personal use. Commercial use is prohibited.
Date
2020-08-18
Major/Subject
Machine Learning, Data Science, and Artificial Intelligence
Mcode
SCI3044
Degree programme
Master’s Programme in Computer, Communication and Information Sciences
Language
en
Pages
68 + 7
Abstract
Recurrent neural networks are powerful tools for processing temporal data. However, assessing prediction uncertainty from recurrent models has proven challenging. This thesis evaluates the validity of uncertainty estimates obtained from recurrent models using dropout. Traditional neural network training focuses on optimising the data likelihood; to obtain model and predictive uncertainty, we instead need to optimise the model posterior. The model posterior is usually intractable, so we employ dropout-based approaches, in the form of variational Bayesian inference with Monte Carlo estimation, to approximate the learning objective. This technique is applied to the existing recurrent neural network benchmark MIMIC-III. The thesis shows that Monte Carlo dropout applied to recurrent neural networks can give performance comparable to current state-of-the-art methods, along with meaningful prediction uncertainty.
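The core idea behind the Monte Carlo dropout described in the abstract is to keep dropout active at prediction time and average over several stochastic forward passes; the spread of the sampled predictions then serves as an uncertainty estimate. Below is a minimal, hypothetical sketch of that procedure using a toy linear model in NumPy in place of a trained recurrent network (the weights `W` and input `x` are illustrative stand-ins, not the thesis's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a trained model and one input example (hypothetical).
W = rng.normal(size=(8, 1))
x = rng.normal(size=(1, 8))
p_drop = 0.5  # dropout rate, kept active at test time

def mc_forward(x, W, p_drop, rng):
    # Sample a fresh dropout mask, rescale (inverted dropout), predict.
    mask = rng.random(x.shape) >= p_drop
    return (x * mask / (1.0 - p_drop)) @ W

# T stochastic forward passes approximate samples from the predictive
# distribution: their mean is the prediction, their spread the uncertainty.
T = 1000
samples = np.array([mc_forward(x, W, p_drop, rng).item() for _ in range(T)])
pred_mean = samples.mean()
pred_std = samples.std()
```

In a real recurrent model the same loop would call the network's forward pass with its dropout layers left in training mode, which is what makes the method cheap to retrofit onto an existing architecture.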
Supervisor
Marttinen, Pekka
Thesis advisors
Cui, Tianyu
Ajanki, Antti
Keywords
uncertainty, deep learning, recurrent neural network, probabilistic, variational Bayesian