Learning GPLVM with arbitrary kernels using the unscented transformation

dc.contributor: Aalto-yliopisto [fi]
dc.contributor: Aalto University [en]
dc.contributor.author: de Souza, Daniel Augusto
dc.contributor.author: Mesquita, Diego
dc.contributor.author: Mattos, Cesar Lincoln
dc.contributor.author: Gomes, Joao Paulo
dc.contributor.department: Department of Computer Science
dc.contributor.editor: Banerjee, A
dc.contributor.editor: Fukumizu, K
dc.contributor.groupauthor: Professorship Kaski Samuel
dc.contributor.organization: Universidade Federal do Ceará
dc.date.accessioned: 2021-09-15T06:41:46Z
dc.date.available: 2021-09-15T06:41:46Z
dc.date.issued: 2021
dc.description.abstract: The Gaussian Process Latent Variable Model (GPLVM) is a flexible framework to handle uncertain inputs in Gaussian Processes (GPs) and to incorporate GPs as components of larger graphical models. Nonetheless, the standard GPLVM variational inference approach is tractable only for a narrow family of kernel functions. The most popular implementations of GPLVM circumvent this limitation using quadrature methods, which may become a computational bottleneck even for relatively low dimensions. For instance, the widely employed Gauss-Hermite quadrature has exponential complexity in the number of dimensions. In this work, we propose using the unscented transformation instead. Overall, this method performs comparably to, if not better than, off-the-shelf GPLVM solutions, and its computational complexity scales only linearly with the number of dimensions. In contrast to Monte Carlo methods, our approach is deterministic and works well with quasi-Newton methods, such as the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. We illustrate the applicability of our method with experiments on dimensionality reduction and multistep-ahead prediction with uncertainty propagation.
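The unscented transformation mentioned in the abstract can be sketched as below. This is a minimal illustration of the classic Julier-Uhlmann formulation with 2D+1 sigma points, which makes the linear-in-dimension cost concrete; the function name, the `kappa` parameter, and the weighting scheme are illustrative assumptions and not necessarily the exact variant used in the paper.

```python
import numpy as np


def unscented_expectation(f, mu, cov, kappa=1.0):
    """Approximate E[f(x)] for x ~ N(mu, cov) via the unscented transformation.

    Uses 2D+1 sigma points, so the cost grows linearly with the dimension D,
    unlike Gauss-Hermite quadrature, whose node count grows exponentially in D.
    """
    D = mu.shape[0]
    # Matrix square root of the scaled covariance via Cholesky factorization.
    L = np.linalg.cholesky((D + kappa) * cov)
    # Sigma points: the mean, plus/minus each column of the Cholesky factor.
    points = [mu] + [mu + L[:, i] for i in range(D)] \
                  + [mu - L[:, i] for i in range(D)]
    # Standard weights: kappa/(D+kappa) on the central point,
    # 1/(2(D+kappa)) on each offset point.
    weights = np.full(2 * D + 1, 1.0 / (2.0 * (D + kappa)))
    weights[0] = kappa / (D + kappa)
    return sum(w * np.asarray(f(p)) for w, p in zip(weights, points))
```

The transformation is exact for polynomials up to third order, e.g. it recovers E[2x+1] for a linear map and E[x^2] = 1 for a standard normal input exactly.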
dc.description.version: Peer reviewed
dc.format.extent: 10
dc.format.mimetype: application/pdf
dc.identifier.citation: de Souza, D A, Mesquita, D, Mattos, C L & Gomes, J P 2021, Learning GPLVM with arbitrary kernels using the unscented transformation. in A Banerjee & K Fukumizu (eds), 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS). Proceedings of Machine Learning Research, vol. 130, Microtome Publishing, International Conference on Artificial Intelligence and Statistics, Virtual, Online, 13/04/2021.
dc.identifier.issn: 2640-3498
dc.identifier.other: PURE UUID: e74b51e1-2807-49bc-be5c-5d93d0e44676
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/e74b51e1-2807-49bc-be5c-5d93d0e44676
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/67372834/Souza_Learning_GPLVM_with_arbitrary_kernels_using_the_unscented_transformation.pdf
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/109968
dc.identifier.urn: URN:NBN:fi:aalto-202109159191
dc.language.iso: en
dc.relation.fundinginfo: We gratefully acknowledge the support from the National Council for Scientific and Technological Development (CNPq grant 302289/2019-4), Delfos Intelligent Maintenance, and Fundação ASTEF (FASTEF grant F0229).
dc.relation.ispartof: International Conference on Artificial Intelligence and Statistics
dc.relation.ispartofseries: 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS)
dc.relation.ispartofseries: Proceedings of Machine Learning Research; Volume 130
dc.rights: openAccess
dc.title: Learning GPLVM with arbitrary kernels using the unscented transformation
dc.type: A4 Article in conference proceedings (fi: A4 Artikkeli konferenssijulkaisussa)
dc.type.version: publishedVersion