Localized Lasso for High-Dimensional Regression

dc.contributor: Aalto-yliopisto (fi)
dc.contributor: Aalto University (en)
dc.contributor.author: Yamada, Makoto
dc.contributor.author: Koh, Takeuchi
dc.contributor.author: Iwata, Tomoharu
dc.contributor.author: Shawe-Taylor, John
dc.contributor.author: Kaski, Samuel
dc.contributor.department: Department of Computer Science
dc.contributor.editor: Singh, Aarti
dc.contributor.editor: Zhu, Jerry
dc.contributor.groupauthor: Centre of Excellence in Computational Inference, COIN
dc.contributor.groupauthor: Professorship Kaski Samuel
dc.contributor.groupauthor: Helsinki Institute for Information Technology (HIIT)
dc.contributor.groupauthor: Probabilistic Machine Learning
dc.contributor.organization: RIKEN Center for Advanced Intelligence Project
dc.contributor.organization: NTT Communication Science Laboratories
dc.contributor.organization: University College London
dc.date.accessioned: 2019-07-30T07:19:20Z
dc.date.available: 2019-07-30T07:19:20Z
dc.date.issued: 2017-08-01
dc.description.abstract: We introduce the localized Lasso, which learns models that are both interpretable and highly predictive in problems with high dimensionality d and small sample size n. More specifically, we consider a function defined by local sparse models, one at each data point. We introduce sample-wise network regularization to borrow strength across the models, and sample-wise exclusive group sparsity (a.k.a. the l12 norm) to introduce diversity into the choice of feature sets in the local models. The local models are interpretable in terms of the similarity of their sparsity patterns. The cost function is convex, and thus has a globally optimal solution. Moreover, we propose a simple yet efficient iterative least-squares based optimization procedure for the localized Lasso, which needs no tuning parameter and is guaranteed to converge to a globally optimal solution. The solution is empirically shown to outperform alternatives on both simulated and genomic personalized/precision medicine data.
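To make the abstract's cost function concrete, here is a minimal NumPy sketch of the three terms it names: a per-sample squared loss under each local model, a sample-wise network regularizer tying similar samples' models together, and a sample-wise exclusive (l12) sparsity penalty. This is an illustration assembled from the abstract alone, not the authors' code; the function name, argument names, and the exact weighting of the penalties are assumptions.

```python
import numpy as np

def localized_lasso_objective(X, y, W, R, lam_net=1.0, lam_excl=1.0):
    """Sketch of the localized Lasso cost described in the abstract.

    X : (n, d) data matrix, y : (n,) targets,
    W : (n, d) weights, one local sparse model per sample,
    R : (n, n) nonnegative sample-similarity graph (assumed given).
    All names and the penalty weights lam_net/lam_excl are illustrative.
    """
    # Squared loss of each sample under its own local model w_i.
    residuals = y - np.sum(W * X, axis=1)
    loss = np.sum(residuals ** 2)

    # Sample-wise network regularization: penalize differences between
    # the local models of samples connected in the graph R.
    diffs = W[:, None, :] - W[None, :, :]          # shape (n, n, d)
    net = np.sum(R * np.linalg.norm(diffs, axis=2))

    # Sample-wise exclusive group sparsity (l12): squared l1 norm of each
    # local model, encouraging a small, diverse feature set per sample.
    excl = np.sum(np.sum(np.abs(W), axis=1) ** 2)

    return loss + lam_net * net + lam_excl * excl
```

Because every term is convex in W, the sketch is consistent with the abstract's claim that the overall cost has a globally optimal solution; the paper's iterative least-squares procedure is what actually minimizes it.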
dc.description.version: Peer reviewed
dc.format.extent: 9
dc.format.extent: 325-333
dc.format.mimetype: application/pdf
dc.identifier.citation: Yamada, M, Koh, T, Iwata, T, Shawe-Taylor, J & Kaski, S 2017, 'Localized Lasso for High-Dimensional Regression', in A Singh & J Zhu (eds), Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research, vol. 54, JMLR, Fort Lauderdale, FL, USA, pp. 325-333, International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, United States, 20/04/2017. <http://proceedings.mlr.press/v54/yamada17a.html>
dc.identifier.issn: 1938-7228
dc.identifier.other: PURE UUID: aadd9131-ac96-4d1d-9384-9ea9b6a1fe97
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/aadd9131-ac96-4d1d-9384-9ea9b6a1fe97
dc.identifier.other: PURE LINK: http://proceedings.mlr.press/v54/yamada17a.html
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/35131960/yamada17a.pdf
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/39485
dc.identifier.urn: URN:NBN:fi:aalto-201907304540
dc.language.iso: en
dc.publisher: PMLR
dc.relation.ispartof: International Conference on Artificial Intelligence and Statistics
dc.relation.ispartofseries: Proceedings of the 20th International Conference on Artificial Intelligence and Statistics
dc.relation.ispartofseries: Proceedings of Machine Learning Research
dc.relation.ispartofseries: Volume 54
dc.rights: openAccess
dc.title: Localized Lasso for High-Dimensional Regression
dc.type: A4 Article in conference proceedings (fi: A4 Artikkeli konferenssijulkaisussa)
dc.type.version: publishedVersion