Browsing by Author "Wimalawarne, Kishan"
Now showing 1 - 3 of 3
Item: Convex coupled matrix and tensor completion (2018-11-01)
Wimalawarne, Kishan; Yamada, Makoto; Mamitsuka, Hiroshi; Kyoto University; RIKEN; Department of Computer Science

We propose a set of convex low-rank-inducing norms for coupled matrices and tensors (hereafter referred to as coupled tensors), in which information is shared between the matrices and tensors through common modes. More specifically, we first propose a mixture of the overlapped trace norm and the latent norms with the matrix trace norm, and then propose a completion model regularized using these norms to impute coupled tensors. A key advantage of the proposed norms is that they are convex and can be used to find a globally optimal solution, whereas existing methods for coupled learning are nonconvex. We also analyze the excess risk bounds of the completion model regularized using our proposed norms and show that they can exploit the low-rankness of coupled tensors, leading to better bounds than those obtained using uncoupled norms. Through synthetic and real-data experiments, we show that the proposed completion model compares favorably with existing ones.

Item: Reshaped tensor nuclear norms for higher order tensor completion (Springer Netherlands, 2021-03)
Wimalawarne, Kishan; Mamitsuka, Hiroshi; University of Tokyo; Probabilistic Machine Learning; Department of Computer Science

We investigate optimal conditions for inducing low-rankness of higher-order tensors by using convex tensor norms with reshaped tensors. We propose the reshaped tensor nuclear norm as a generalized approach to reshaping tensors so that they can be regularized by the tensor nuclear norm. Furthermore, we propose the reshaped latent tensor nuclear norm to combine multiple reshaped tensors using the tensor nuclear norm. We analyze the generalization bounds for tensor completion models regularized by the proposed norms and show that the novel reshaping norms lead to lower Rademacher complexities.
Through simulation and real-data experiments, we show that our proposed methods compare favorably with existing tensor norms, consolidating our theoretical claims.

Item: Scaled coupled norms and coupled higher-order tensor completion (MIT Press, 2020-02-01)
Wimalawarne, Kishan; Yamada, Makoto; Mamitsuka, Hiroshi; Bioinformatics Center; RIKEN; Probabilistic Machine Learning; Department of Computer Science

Recently, a set of tensor norms known as coupled norms has been proposed as a convex solution to coupled tensor completion. Coupled norms are designed by combining low-rank-inducing tensor norms with the matrix trace norm. Though coupled norms have shown good performance, they have two major limitations: they provide no way to control the regularization of coupled modes relative to uncoupled modes, and they are not optimal for couplings among higher-order tensors. In this letter, we propose a method that scales the regularization of coupled components against uncoupled components to properly induce low-rankness on the coupled mode. We also propose coupled norms for higher-order tensors by combining the square norm with coupled norms. Using excess-risk-bound analysis, we demonstrate that our proposed methods lead to lower risk bounds than existing coupled norms. We demonstrate the robustness of our methods through simulation and real-data experiments.
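The abstracts above all build on the same basic ingredients: matricizing (unfolding or reshaping) a tensor and penalizing the nuclear norm of the resulting matrix. As a rough illustration only, here is a minimal numpy sketch of two standard constructions of this kind, the overlapped trace norm (sum of nuclear norms over all mode unfoldings) and a nuclear norm of a reshaped tensor; the function names are illustrative and are not taken from the papers' code, and this is not the authors' implementation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: bring axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def overlapped_trace_norm(T):
    """Sum of nuclear norms of all mode unfoldings (overlapped trace norm)."""
    return sum(np.linalg.norm(unfold(T, k), 'nuc') for k in range(T.ndim))

def reshaped_nuclear_norm(T, row_modes):
    """Nuclear norm of the matrix formed by grouping `row_modes` as rows
    and the remaining modes as columns (one possible reshaping)."""
    col_modes = [k for k in range(T.ndim) if k not in row_modes]
    rows = int(np.prod([T.shape[k] for k in row_modes]))
    M = np.transpose(T, list(row_modes) + col_modes).reshape(rows, -1)
    return np.linalg.norm(M, 'nuc')

# Sanity check on a rank-1 tensor: every unfolding is a rank-1 matrix,
# so each nuclear norm equals the tensor's Frobenius norm.
a, b, c = np.ones(2), np.ones(3), np.ones(4)
T = np.einsum('i,j,k->ijk', a, b, c)
fro = np.linalg.norm(T)
print(np.isclose(overlapped_trace_norm(T), 3 * fro))      # True
print(np.isclose(reshaped_nuclear_norm(T, (0, 1)), fro))  # True
```

In a completion model these norms appear as convex regularizers on the estimated tensor, which is what makes globally optimal solutions attainable, in contrast to nonconvex factorization-based approaches.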