Chapter 5: The Expressive Power of Graph Neural Networks
Pan Li, Purdue University, panli@purdue.edu
Jure Leskovec, Stanford University, jure@cs.stanford.edu
Abstract
The success of neural networks is based on their strong expressive power, which allows them to approximate complex non-linear mappings from features to predictions. Since the universal approximation theorem of Cybenko (1989), many studies have proved that feed-forward neural networks can approximate any function of interest. However, these results do not directly carry over to graph neural networks (GNNs) due to the inductive bias imposed by additional constraints on the GNN parameter space. New theoretical studies are needed to better understand these constraints and characterize the expressive power of GNNs. In this chapter, we will review the recent progress on the expressive power of GNNs in graph representation learning. We will start by introducing the most widely used GNN framework, message passing, and analyze its power and limitations. We will next introduce some recently proposed techniques to overcome these limitations, such as injecting random attributes, injecting deterministic distance attributes, and building higher-order GNNs. We will present the key insights of these techniques and highlight their advantages and disadvantages.
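To give a flavor of the message-passing framework reviewed in the chapter, the sketch below runs one round of message passing on a toy graph: each node aggregates its neighbors' features (here with SUM aggregation) and combines the result with its own feature. This is an illustrative minimal example; the function name and the trivial combine step are ours, not a definition from the chapter.

```python
# One round of message passing on a toy graph (illustrative sketch).
# Node features are scalars; real GNN layers use vectors plus learned
# combine/aggregate functions.

def message_passing_round(features, adjacency):
    """Update each node by adding the SUM of its neighbors' features
    to its own feature (a simple, parameter-free update)."""
    new_features = {}
    for node, neighbors in adjacency.items():
        aggregated = sum(features[v] for v in neighbors)
        new_features[node] = features[node] + aggregated
    return new_features

# Toy graph: the path 0 - 1 - 2.
adjacency = {0: [1], 1: [0, 2], 2: [1]}
features = {0: 1.0, 1: 2.0, 2: 3.0}
print(message_passing_round(features, adjacency))
# → {0: 3.0, 1: 6.0, 2: 5.0}
```

Stacking such rounds lets information propagate over longer paths, which is exactly the mechanism whose expressive power (and its 1-WL limits) the chapter analyzes.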
Contents
- Introduction
- Graph Representation Learning and Problem Definition
- The Power of Message Passing Graph Neural Networks
- Preliminaries: Neural Networks for Sets
- Message Passing Graph Neural Networks
- The Expressive Power of MP-GNN
- MP-GNN with the Power of the 1-WL Test
- Graph Neural Network Architectures that are More Powerful than the 1-WL Test
- Limitations of MP-GNN
- Injecting Random Attributes
- Injecting Deterministic Distance Attributes
- Higher-order GNNs
- Summary
Citation
@incollection{GNNBook-ch5-li,
author = "Li, Pan and Leskovec, Jure",
editor = "Wu, Lingfei and Cui, Peng and Pei, Jian and Zhao, Liang",
title = "The Expressive Power of Graph Neural Networks",
booktitle = "Graph Neural Networks: Foundations, Frontiers, and Applications",
year = "2022",
publisher = "Springer Singapore",
address = "Singapore",
pages = "63--98",
}
P. Li and J. Leskovec, “The expressive power of graph neural networks,” in Graph Neural Networks: Foundations, Frontiers, and Applications, L. Wu, P. Cui, J. Pei, and L. Zhao, Eds. Singapore: Springer Singapore, 2022, pp. 63–98.