Manifold Mixup: Better Representations by Interpolating Hidden States

Access rights

openAccess

A4 Article in conference proceedings

Date

2019

Language

en

Series

Proceedings of the 36th International Conference on Machine Learning, Proceedings of Machine Learning Research, Volume 97

Abstract

Deep neural networks excel at learning the training data, but often provide incorrect and confident predictions when evaluated on slightly different test examples, including distribution shifts, outliers, and adversarial examples. To address these issues, we propose Manifold Mixup, a simple regularizer that encourages neural networks to predict less confidently on interpolations of hidden representations. Manifold Mixup leverages semantic interpolations as an additional training signal, obtaining neural networks with smoother decision boundaries at multiple levels of representation. As a result, neural networks trained with Manifold Mixup learn class representations with fewer directions of variance. We prove theoretically why this flattening happens under ideal conditions, validate it empirically in practical situations, and connect it to previous work on information theory and generalization. Despite incurring no significant computational overhead and being implementable in a few lines of code, Manifold Mixup improves strong baselines in supervised learning, robustness to single-step adversarial attacks, and test log-likelihood.
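
The abstract notes that Manifold Mixup can be implemented in a few lines of code. The following is a minimal PyTorch sketch of the idea described above, not the authors' reference implementation: the toy MLP, the set of eligible layers, and the value alpha=2.0 are illustrative assumptions. The core step is to pick a random eligible layer, interpolate the hidden states of two shuffled minibatch views with lambda drawn from Beta(alpha, alpha), and train on labels mixed with the same lambda.

import torch
import torch.nn as nn
import torch.nn.functional as F

def mixup(h, y, alpha=2.0):
    # Interpolate hidden states (and their soft labels) with a shuffled
    # copy of the batch, using lambda ~ Beta(alpha, alpha).
    lam = torch.distributions.Beta(alpha, alpha).sample()
    idx = torch.randperm(h.size(0))
    return lam * h + (1 - lam) * h[idx], lam * y + (1 - lam) * y[idx]

class MixupMLP(nn.Module):
    # Toy classifier split into blocks so mixing can happen at any depth.
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU()),
        ])
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x, y=None, mix_layer=None, alpha=2.0):
        h = x
        for k, block in enumerate(self.blocks):
            if mix_layer == k:  # mix at the input (k = 0) or a hidden layer
                h, y = mixup(h, y, alpha)
            h = block(h)
        if mix_layer == len(self.blocks):  # mix just before the classifier head
            h, y = mixup(h, y, alpha)
        return self.head(h), y

# One training step: pick an eligible layer at random, mix there,
# and train on the correspondingly mixed soft targets.
model = MixupMLP()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(32, 784)
y = F.one_hot(torch.randint(0, 10, (32,)), num_classes=10).float()

k = torch.randint(0, len(model.blocks) + 1, (1,)).item()
logits, y_mixed = model(x, y, mix_layer=k)
loss = -(y_mixed * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
opt.zero_grad()
loss.backward()
opt.step()

Mixing the labels with the same lambda as the hidden states produces soft targets, which is what discourages confident predictions on interpolated representations and, per the abstract, smooths decision boundaries at multiple levels of representation.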

Keywords

Deep Learning

Citation

Verma, V., Lamb, A., Beckham, C., Najafi, A., Mitliagkas, I., Lopez-Paz, D. & Bengio, Y. 2019, 'Manifold Mixup: Better Representations by Interpolating Hidden States', in Proceedings of the 36th International Conference on Machine Learning, Proceedings of Machine Learning Research, vol. 97, JMLR, International Conference on Machine Learning, Long Beach, California, United States, 09/06/2019. <http://proceedings.mlr.press/v97/verma19a.html>