Projective inference in high-dimensional problems: Prediction and feature selection

A1 Original article in a scientific journal
Date
2020-01-01
Language
en
Pages
43 (2155-2197)
Series
Electronic Journal of Statistics, Volume 14, Issue 1
Abstract
This paper reviews predictive inference and feature selection for generalized linear models with scarce but high-dimensional data. We demonstrate that in many cases one can benefit from a decision-theoretically justified two-stage approach: first, construct a possibly non-sparse model that predicts well, and then find a minimal subset of features that characterize the predictions. The model built in the first step is referred to as the reference model and the operation during the latter step as predictive projection. The key characteristic of this approach is that it finds an excellent tradeoff between sparsity and predictive accuracy, and the gain comes from utilizing all available information, including prior information and that coming from the left-out features. We review several methods that follow this principle and provide novel methodological contributions. We present a new projection technique that unifies two existing techniques and is both accurate and fast to compute. We also propose a way of evaluating the feature selection process using fast leave-one-out cross-validation that allows for easy and intuitive model size selection. Furthermore, we prove a theorem that helps to understand the conditions under which the projective approach could be beneficial. The key ideas are illustrated via several experiments using simulated and real-world data.
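The two-stage idea summarized above can be sketched in code. The following is a minimal, illustrative Python sketch for a Gaussian linear model, where projecting the reference model onto a feature subset by KL minimization reduces to regressing the reference fit on the selected columns. It is not the authors' implementation (their methods are provided by the R package projpred); the ridge reference model, the greedy forward search, and all function names here are assumptions made for the example.

```python
# A minimal sketch of the two-stage "reference model + projection" idea for a
# Gaussian linear model, written in plain NumPy. The ridge reference model and
# the function names are illustrative assumptions, not the paper's method as
# implemented by the authors.

import numpy as np

def fit_reference_model(X, y, alpha=1.0):
    """Stage 1: a possibly non-sparse reference model (here: ridge regression)."""
    n, d = X.shape
    w_ref = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)
    mu_ref = X @ w_ref                       # reference predictions
    sigma2_ref = np.mean((y - mu_ref) ** 2)  # reference noise estimate
    return mu_ref, sigma2_ref

def project_onto_subset(X, mu_ref, sigma2_ref, subset):
    """Stage 2: project the reference predictions onto a feature subset.

    In the Gaussian case, minimizing the KL divergence from the reference
    predictive distribution to the submodel amounts to a least-squares fit
    of the reference fit mu_ref on the selected columns.
    """
    Xs = X[:, subset]
    w_proj, *_ = np.linalg.lstsq(Xs, mu_ref, rcond=None)
    mu_proj = Xs @ w_proj
    # the projected noise variance absorbs the discrepancy to the reference fit
    sigma2_proj = sigma2_ref + np.mean((mu_ref - mu_proj) ** 2)
    return w_proj, sigma2_proj

def forward_search(X, mu_ref, sigma2_ref, max_size):
    """Greedy forward selection: add the feature that best preserves the reference fit."""
    remaining = list(range(X.shape[1]))
    subset, path = [], []
    for _ in range(max_size):
        best = min(remaining,
                   key=lambda j: project_onto_subset(X, mu_ref, sigma2_ref, subset + [j])[1])
        subset.append(best)
        remaining.remove(best)
        path.append(list(subset))
    return path

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 60, 200  # scarce but high-dimensional data
    X = rng.standard_normal((n, d))
    y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(n)

    mu_ref, sigma2_ref = fit_reference_model(X, y)
    for subset in forward_search(X, mu_ref, sigma2_ref, max_size=5):
        _, sigma2 = project_onto_subset(X, mu_ref, sigma2_ref, subset)
        print(subset, round(sigma2, 3))
```

In this sketch the model size would then be chosen by comparing the projected submodels to the reference model along the search path; the paper's proposal of using fast leave-one-out cross-validation for that comparison is not reproduced here.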
Keywords
Feature selection, Post-selection inference, Prediction, Projection, Sparsity
Citation
Piironen, J., Paasiniemi, M. & Vehtari, A. 2020, 'Projective inference in high-dimensional problems: Prediction and feature selection', Electronic Journal of Statistics, vol. 14, no. 1, pp. 2155-2197. https://doi.org/10.1214/20-EJS1711