Learning on Hypergraphs With Sparsity

Type
A1 Original article in a scientific journal
Date
2021-08-01
Language
en
Pages
2710-2722 (13 pages)
Series
IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 43, issue 8
Abstract
A hypergraph is a general way of representing high-order relations on a set of objects; it generalizes a graph, which can represent only pairwise relations. Hypergraphs find applications in various domains where relationships among more than two objects are observed. As on a graph, one wishes to learn a function on a hypergraph that is smooth with respect to its topology, and a fundamental issue is to find suitable smoothness measures for functions on the nodes of a graph or hypergraph. We present a general framework that subsumes previously proposed smoothness measures and also generates new ones. To address irrelevant or noisy data, we incorporate a sparse learning framework into learning on hypergraphs: we propose sparsely smooth formulations that learn smooth functions while inducing sparsity on hypergraphs at both the hyperedge and node levels. We analyze their properties and establish sparse support recovery results. Experiments show that our sparsely smooth models are beneficial for learning with irrelevant and noisy data, and usually give similar or improved performance compared to dense models.
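To make the notion of smoothness on a hypergraph concrete, the sketch below computes one classical smoothness measure: the clique-expansion quadratic form, where each hyperedge contributes the sum of squared differences of the function over all node pairs it contains. This is a minimal illustration of the general idea, not the paper's specific formulations; the function name and the toy hypergraph are assumptions for the example.

```python
import numpy as np

def clique_expansion_smoothness(f, hyperedges):
    """Smoothness of a node function f on a hypergraph.

    Each hyperedge contributes the sum of squared differences of f
    over every pair of its nodes (the clique-expansion quadratic
    form f^T L f). Smaller values mean smoother functions.
    Note: this is one illustrative smoothness measure, not the
    paper's sparsely smooth formulation.
    """
    total = 0.0
    for e in hyperedges:
        nodes = list(e)
        for i in range(len(nodes)):
            for j in range(i + 1, len(nodes)):
                total += (f[nodes[i]] - f[nodes[j]]) ** 2
    return total

# Toy hypergraph: 5 nodes, two hyperedges of size 3 (assumed example).
hyperedges = [{0, 1, 2}, {2, 3, 4}]
f_smooth = np.array([1.0, 1.0, 1.0, 1.0, 1.0])   # constant function
f_rough = np.array([1.0, -1.0, 1.0, -1.0, 1.0])  # oscillating function

print(clique_expansion_smoothness(f_smooth, hyperedges))  # 0.0
print(clique_expansion_smoothness(f_rough, hyperedges))   # 16.0
```

A constant function has zero smoothness penalty, while a function that flips sign within hyperedges is penalized; sparsity-inducing variants of such measures would additionally drive the contributions of irrelevant hyperedges or nodes to exactly zero.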
Keywords
Noise measurement, Data models, Laplace equations, Additives, Machine learning, Computational modeling, Topology, Sparse learning, learning on hypergraphs, learning on graphs, sparsistency, MODEL SELECTION, REGRESSION
Citation
Nguyen, C. H. & Mamitsuka, H. 2021, 'Learning on Hypergraphs With Sparsity', IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 43, no. 8, article 9001176, pp. 2710-2722. https://doi.org/10.1109/TPAMI.2020.2974746