Insightful dimensionality reduction with very low rank variable subsets
Conference article in proceedings
Proceedings of the Web Conference, WWW 2021
Abstract: Dimensionality reduction techniques can be employed to produce robust, cost-effective predictive models, and to enhance interpretability in exploratory data analysis. However, the models produced by many of these methods are formulated in terms of abstract factors or are too high-dimensional to facilitate insight and fit within low computational budgets. In this paper we explore an alternative approach to interpretable dimensionality reduction. Given a data matrix, we study the following question: are there subsets of variables that can be primarily explained by a single factor? We formulate this challenge as the problem of finding submatrices close to rank one. Despite its potential, this topic has not been sufficiently addressed in the literature, and there exist virtually no algorithms for this purpose that are simultaneously effective, efficient and scalable. We formalize the task as two problems which we characterize in terms of computational complexity, and propose efficient, scalable algorithms with approximation guarantees. Our experiments demonstrate how our approach can produce insightful findings in data, and show our algorithms to be superior to strong baselines.
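To illustrate the notion of a variable subset "primarily explained by a single factor", a natural proxy (an assumption here, not the paper's exact objective) is the fraction of a submatrix's squared Frobenius norm captured by its best rank-one approximation, computable from the singular values. The sketch below scores column subsets this way; all names and the toy data are illustrative.

```python
import numpy as np

def rank_one_score(X):
    """Fraction of squared Frobenius norm captured by the best
    rank-one approximation of X (1.0 means X is exactly rank one)."""
    s = np.linalg.svd(X, compute_uv=False)
    return s[0] ** 2 / np.sum(s ** 2)

rng = np.random.default_rng(0)
# Columns 0-2 are noisy multiples of one shared factor; column 3 is independent.
factor = rng.normal(size=(100, 1))
near_rank_one = factor @ rng.normal(size=(1, 3)) + 0.01 * rng.normal(size=(100, 3))
unrelated = rng.normal(size=(100, 1))
data = np.hstack([near_rank_one, unrelated])

print(rank_one_score(data[:, :3]))  # close to 1: one factor explains the subset
print(rank_one_score(data))         # lower: the full matrix is not near rank one
```

Exhaustively scoring all column subsets this way is intractable, which is why the paper studies the computational complexity of the problem and proposes scalable algorithms with approximation guarantees.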
Funding Information: This work was supported by the Academy of Finland project AIDA (317085), the EC H2020 RIA project “SoBigData++” (871042), and the Polish National Agency for Academic Exchange within the Bekker programme, number PPN/BEK/2019/1/00133. Publisher Copyright: © 2021 ACM. | openaire: EC/H2020/871042/EU//SoBigData-PlusPlus
Data mining, Dimensionality reduction, Explainability, Variable selection
Ordozgoiti, B, Pai, S & Kolczynska, M 2021, Insightful dimensionality reduction with very low rank variable subsets. in Proceedings of the Web Conference, WWW 2021. ACM, pp. 3066-3075, The Web Conference, Ljubljana, Slovenia, 19/04/2021. https://doi.org/10.1145/3442381.3450067