Uncertainty in recurrent neural network with dropout

Perustieteiden korkeakoulu | Master's thesis

Mcode

SCI3044

Language

en

Pages

68 + 7

Abstract

Recurrent neural networks are a powerful tool for processing temporal data. However, assessing prediction uncertainty from recurrent models has proven challenging. This thesis evaluates the validity of uncertainty estimates obtained from recurrent models using dropout. Traditional neural network training focuses on optimising the data likelihood; to obtain model and predictive uncertainty, we must instead optimise the model posterior. Because the model posterior is usually intractable, we employ dropout-based approaches, in the form of variational Bayesian Monte Carlo estimation, to approximate the learning objective. This technique is applied to the existing recurrent neural network benchmark MIMIC-III. The thesis shows that Monte Carlo dropout applied to recurrent neural networks can give performance comparable to current state-of-the-art methods, together with meaningful uncertainty estimates for its predictions.
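The idea summarised in the abstract can be sketched in a few lines: keep dropout active at prediction time, run several stochastic forward passes through the recurrent model, and read the spread of the outputs as predictive uncertainty. The sketch below is a minimal, hypothetical illustration (toy weights, a single recurrent cell, and a variational-style mask shared across timesteps), not the thesis implementation.

```python
import numpy as np

H = 3  # toy hidden size
rng0 = np.random.default_rng(42)
W_x = rng0.normal(scale=0.5, size=(H, 1))   # input weights (illustrative)
W_h = rng0.normal(scale=0.5, size=(H, H))   # recurrent weights (illustrative)
w_out = rng0.normal(scale=0.5, size=(H,))   # readout weights (illustrative)

def forward(x_seq, rng, p_drop=0.3):
    # Variational-style RNN dropout: sample ONE mask per forward pass and
    # reuse it at every timestep; the mask is resampled across MC samples.
    mask = (rng.random(H) > p_drop) / (1.0 - p_drop)
    h = np.zeros(H)
    for x_t in x_seq:
        h = np.tanh(W_x @ np.array([x_t]) + W_h @ (h * mask))
    return w_out @ h  # scalar prediction

def mc_dropout_predict(x_seq, n_samples=200, seed=0):
    # Monte Carlo dropout: average over stochastic passes; the standard
    # deviation serves as a simple measure of predictive uncertainty.
    rng = np.random.default_rng(seed)
    preds = np.array([forward(x_seq, rng) for _ in range(n_samples)])
    return preds.mean(), preds.std()

mean, std = mc_dropout_predict([0.2, 0.4, 0.6])
```

Here `mean` is the MC-dropout predictive mean and `std` its uncertainty; with dropout disabled every pass would be identical and `std` would collapse to zero.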

Supervisor

Marttinen, Pekka

Thesis advisor

Cui, Tianyu
Ajanki, Antti
