Chapter 6: Graph Neural Networks: Scalability

Hehuan Ma, University of Texas at Arlington
Yu Rong, Tencent AI Lab
Junzhou Huang, University of Texas at Arlington


Over the past decade, Graph Neural Networks (GNNs) have achieved remarkable success in modeling complex graph data. Nowadays, graph data is growing exponentially in both scale and quantity; a social network, for example, may comprise billions of users and relationships. This circumstance raises a crucial question: how can the scalability of Graph Neural Networks be properly extended? Two major challenges remain when scaling the original GNN implementations to large graphs. First, most GNN models compute the entire adjacency matrix and the node embeddings of the whole graph, which demands enormous memory. Second, training a GNN requires recursively updating each node in the graph, which becomes infeasible and inefficient for large graphs. Current studies tackle these obstacles mainly through three sampling paradigms: node-wise sampling, which is executed based on the target nodes in the graph; layer-wise sampling, which is implemented on the convolutional layers; and graph-wise sampling, which constructs sub-graphs for model inference. In this chapter, we introduce several representative studies of each paradigm.
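To make the node-wise paradigm concrete, the following is a minimal sketch of GraphSAGE-style neighbor sampling: instead of aggregating over all neighbors, each hop draws at most a fixed number (a "fanout") of neighbors per node, so the computation graph of a target node stays bounded regardless of graph size. The function name, the adjacency-list representation, and the fanout values are illustrative choices, not part of any specific library's API.

```python
import random

def node_wise_sample(adj, targets, fanouts, seed=0):
    """Build the multi-hop computation frontier for `targets` by sampling
    at most fanouts[k] neighbors per node at hop k (node-wise sampling).

    adj: dict mapping node -> list of neighbor nodes (illustrative format)
    targets: iterable of target (output) nodes
    fanouts: per-hop sample sizes, from the output layer inward
    Returns a list of node sets, one per hop, starting with the targets.
    """
    rng = random.Random(seed)
    frontiers = [set(targets)]
    for fanout in fanouts:
        nxt = set()
        for v in frontiers[-1]:
            nbrs = adj.get(v, [])
            if len(nbrs) > fanout:
                # Subsample without replacement to cap the frontier growth.
                nbrs = rng.sample(nbrs, fanout)
            nxt.update(nbrs)
        frontiers.append(nxt)
    return frontiers

# Toy usage: a star graph with center 0 and leaves 1..3, fanout 2 per hop.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
frontiers = node_wise_sample(adj, targets=[0], fanouts=[2, 2])
```

With fanouts [2, 2], the first frontier contains at most 2 of node 0's three neighbors, so the per-target cost is bounded by the product of the fanouts rather than by the node degrees.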


  • Introduction
  • Preliminary
  • Sampling Paradigms
    • Node-wise Sampling
    • Layer-wise Sampling
    • Graph-wise Sampling
  • Applications of Large-scale Graph Neural Networks on Recommendation Systems
    • Item-item Recommendation
    • User-item Recommendation
  • Future Directions


@incollection{ma2022gnnscalability,
  author    = "Ma, Hehuan and Rong, Yu and Huang, Junzhou",
  editor    = "Wu, Lingfei and Cui, Peng and Pei, Jian and Zhao, Liang",
  title     = "Graph Neural Networks: Scalability",
  booktitle = "Graph Neural Networks: Foundations, Frontiers, and Applications",
  year      = "2022",
  publisher = "Springer Singapore",
  address   = "Singapore",
  pages     = "99--119"
}

H. Ma, Y. Rong, and J. Huang, “Graph neural networks: Scalability,” in Graph Neural Networks: Foundations, Frontiers, and Applications, L. Wu, P. Cui, J. Pei, and L. Zhao, Eds. Singapore: Springer Singapore, 2022, pp. 99–119.