Graph Error Effect on Graph Neural Networks: Theoretical Bounds

Sähkötekniikan korkeakoulu (School of Electrical Engineering) | Master's thesis

Date

2022-08-22

Major/Subject

Acoustics and Audio Technology

Mcode

ELEC3030

Degree programme

CCIS - Master’s Programme in Computer, Communication and Information Sciences (TS2013)

Language

en

Pages

45+7

Abstract

Graph neural networks (GNNs) can successfully learn graph signal representations through graph convolution. Graph convolution depends on a graph filter, which encodes the topological dependencies of the data and propagates its features. However, estimation errors in the propagation matrix (e.g., the adjacency matrix) can have a significant impact on graph filters and GNNs. In this thesis, the performance of GNNs is studied when the adjacency matrix is estimated with errors, that is, when it is perturbed by a probabilistic graph error model. It is proven that the error, measured by the spectral norm of the difference between the true adjacency matrix and the mislearned adjacency matrix under the error model, is bounded, and that the upper bound is a function of the graph size and of the error-probability parameters governing erroneously adding or deleting edges in the graph topology. A similar upper bound is derived for the normalized adjacency matrix with self-loops added. Diverse numerical experiments illustrate the derived error bounds on a synthetic dataset and study the error sensitivity of a simple graph convolutional neural network under the probabilistic error model.
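The error model described in the abstract can be sketched numerically. The following is a minimal illustration, not the thesis's actual experiment: it assumes an Erdős–Rényi graph as the synthetic dataset and uses placeholder probability names `eps_add` (a missing edge is erroneously added) and `eps_del` (an existing edge is erroneously deleted), which need not match the thesis's notation. It measures the error as the spectral norm of the difference between the true and perturbed matrices, both for the raw adjacency and for a GCN-style normalized adjacency with self-loops.

```python
import numpy as np

def perturb_adjacency(A, eps_add, eps_del, rng):
    """Probabilistic graph error model (illustrative): each absent edge is
    added with probability eps_add, each present edge deleted with
    probability eps_del. Symmetry is preserved by perturbing only the
    upper triangle and mirroring."""
    n = A.shape[0]
    upper = np.triu(np.ones((n, n), dtype=bool), k=1)
    flip_add = (rng.random((n, n)) < eps_add) & upper & (A == 0)
    flip_del = (rng.random((n, n)) < eps_del) & upper & (A == 1)
    A_err = A.copy()
    A_err[flip_add] = 1
    A_err[flip_del] = 0
    A_err = np.triu(A_err, k=1)
    return A_err + A_err.T

def normalized_adjacency(A):
    """Normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)          # degrees are >= 1 due to self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

rng = np.random.default_rng(0)
n = 200
# Erdős–Rényi graph as a stand-in synthetic dataset.
A = (rng.random((n, n)) < 0.05).astype(int)
A = np.triu(A, k=1)
A = A + A.T

A_err = perturb_adjacency(A, eps_add=0.01, eps_del=0.05, rng=rng)

# Spectral norm (largest singular value) of the error matrices.
err_adj = np.linalg.norm(A_err - A, ord=2)
err_norm = np.linalg.norm(
    normalized_adjacency(A_err) - normalized_adjacency(A), ord=2
)
print(f"adjacency error: {err_adj:.3f}, normalized error: {err_norm:.3f}")
```

Increasing `eps_add`/`eps_del` or the graph size `n` and re-running gives an empirical feel for how the spectral-norm error grows, which is the quantity the thesis's upper bounds control.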

Supervisor

Ollila, Esa

Thesis advisor

Vorobyov, Sergiy A.

Keywords

graph signal processing, graph neural network, probabilistic error model, sensitivity
