A graphon-signal analysis of graph neural networks

R Levie - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
Abstract
We present an approach for analyzing message passing graph neural networks (MPNNs) based on an extension of graphon analysis to a so-called graphon-signal analysis. An MPNN is a function that takes a graph and a signal on the graph (a graph-signal) and returns some value. Since the input space of MPNNs is non-Euclidean, i.e., graphs can be of any size and topology, properties such as generalization are less well understood for MPNNs than for Euclidean neural networks. We claim that one important ingredient missing from past work is a meaningful notion of graph-signal similarity, one that endows the space of inputs to MPNNs with a regular structure. We present such a similarity measure, called the graphon-signal cut distance, which makes the space of all graph-signals a dense subset of a compact metric space: the graphon-signal space. Informally, two deterministic graph-signals are close in cut distance if they "look like" they were sampled from the same random graph-signal model. Hence, our cut distance is a natural notion of graph-signal similarity, which allows comparing any pair of graph-signals of any size and topology. We prove that MPNNs are Lipschitz continuous functions over the graphon-signal metric space. We then give two applications of this result: 1) a generalization bound for MPNNs, and 2) the stability of MPNNs to subsampling of graph-signals. Our results apply to any sufficiently regular MPNN on any distribution of graph-signals, making the analysis rather universal.
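To make the abstract's notion of an MPNN concrete, the sketch below shows a map from a graph-signal (an adjacency matrix together with node features) to a single fixed-size vector, so graphs of any size can be compared. This is a minimal illustration assuming dense NumPy adjacency matrices, mean neighbourhood aggregation, and an averaging readout; the function names, weights, and update rule are illustrative assumptions, not the paper's specific construction.

```python
import numpy as np

def mpnn_layer(adj, x, w_self, w_neigh):
    """One message-passing layer: each node averages its neighbours' features
    and combines them with its own via linear maps and a ReLU.
    adj: (n, n) dense adjacency matrix; x: (n, d) node signal."""
    deg = adj.sum(axis=1, keepdims=True)       # node degrees
    agg = adj @ x / np.maximum(deg, 1.0)       # mean-aggregated neighbour signal
    return np.maximum(x @ w_self + agg @ w_neigh, 0.0)

def mpnn_readout(adj, x, layers):
    """Stack message-passing layers, then average over nodes so the output
    is a single vector whose size does not depend on the number of nodes."""
    for w_self, w_neigh in layers:
        x = mpnn_layer(adj, x, w_self, w_neigh)
    return x.mean(axis=0)

# Usage sketch: a random graph-signal with 10 nodes and 3-dimensional features.
rng = np.random.default_rng(0)
adj = (rng.random((10, 10)) < 0.3).astype(float)
adj = np.triu(adj, 1); adj = adj + adj.T       # symmetric, no self-loops
x = rng.standard_normal((10, 3))
layers = [(rng.standard_normal((3, 8)), rng.standard_normal((3, 8))),
          (rng.standard_normal((8, 8)), rng.standard_normal((8, 8)))]
print(mpnn_readout(adj, x, layers).shape)      # (8,)
```

Because the readout averages over nodes, two graph-signals of different sizes are mapped into the same output space, which is the setting in which the paper's cut-distance Lipschitz and generalization results are stated.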