Training Graph Neural Networks with Large-Scale Graphs

Speaker: Jana Vatter
Technical University of Munich (TUM)

Title: Training Graph Neural Networks with Large-Scale Graphs

Date: Friday, 21 March 2025

Time: 2:00pm - 3:00pm

Venue: Room 5510 (via lift 25/26), HKUST

Abstract:

Graphs are ubiquitous, used in diverse domains such as social networks, biological systems, and recommendation engines. To learn from and make predictions on these complex structures, Graph Neural Networks (GNNs) have emerged as a specialized machine learning technique. However, as real-world graphs continue to grow in size, training GNNs efficiently becomes increasingly challenging.

This talk will present two advanced methods for training GNNs on large-scale graphs efficiently. The first combines graph sparsification with sampling during GNN training, reducing computational overhead while preserving key structural information. The second draws on ideas from the waveform relaxation method to refine an existing approach based on historical embeddings. We will explore the theoretical foundations of these approaches, their practical implementations, and empirical results demonstrating their effectiveness.
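To give a flavor of the first method, here is a minimal plain-Python sketch of sparsifying a graph before sampling neighbors for a mini-batch. It assumes random edge dropping as the sparsifier and uniform neighbor sampling; all function names and parameters are illustrative and not taken from the paper's implementation.

```python
import random

def sparsify_edges(edges, keep_ratio, seed=0):
    """One simple sparsification strategy: randomly keep a fraction of edges.
    (The paper studies several sparsifiers; this is only an illustration.)"""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() < keep_ratio]

def sample_neighbors(adj, node, fanout, rng):
    """Uniformly sample up to `fanout` neighbors of `node`."""
    nbrs = adj.get(node, [])
    if len(nbrs) <= fanout:
        return list(nbrs)
    return rng.sample(nbrs, fanout)

def build_minibatch(edges, targets, keep_ratio=0.5, fanout=2, seed=0):
    """Sparsify first, then sample neighbors for each target node,
    so sampling only ever sees the reduced graph."""
    rng = random.Random(seed)
    kept = sparsify_edges(edges, keep_ratio, seed)
    adj = {}
    for u, v in kept:  # build an undirected adjacency list over kept edges
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    return {t: sample_neighbors(adj, t, fanout, rng) for t in targets}
```

Because sparsification runs before sampling, every training iteration works on a smaller edge set, which is the source of the computational savings the talk discusses.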

The talk draws from the following publications:

  • The Evolution of Distributed Systems for Graph Neural Networks and Their Origin in Graph Processing and Deep Learning: A Survey, ACM Computing Surveys, August 2023, https://dl.acm.org/doi/abs/10.1145/3597428
  • Size Does (Not) Matter? Sparsification and Graph Neural Network Sampling for Large-scale Graphs, LSGDA@VLDB 2024, August 2024, https://vldb.org/workshops/2024/proceedings/LSGDA/LSGDA24.06.pdf
  • WaveGAS: Waveform Relaxation for Scaling Graph Neural Networks, Preprint, February 2025, https://arxiv.org/pdf/2502.19986
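For context on the second method, the sketch below illustrates the historical-embedding idea (as in GNNAutoScale) together with waveform-relaxation-style refinement sweeps: out-of-batch neighbors are read from a stale cache, while in-batch embeddings are recomputed and written back several times per batch. This is a toy, plain-Python illustration with invented names, using mean aggregation in place of a learned layer; it is not the WaveGAS implementation.

```python
class HistoricalCache:
    """Keeps the most recently computed embedding for every node,
    initialized from the input features."""
    def __init__(self, feats):
        self.emb = [list(f) for f in feats]

    def pull(self, node):
        return self.emb[node]

    def push(self, node, value):
        self.emb[node] = list(value)

def mean_vecs(vecs):
    """Element-wise mean of a non-empty collection of equal-length vectors."""
    vecs = list(vecs)
    dim = len(vecs[0])
    return [sum(v[d] for v in vecs) / len(vecs) for d in range(dim)]

def relaxed_forward(batch, adj, cache, sweeps=3):
    """A few Jacobi-style relaxation sweeps over the mini-batch: each
    sweep recomputes in-batch embeddings (mean over self and neighbors,
    with out-of-batch neighbors read from the stale cache) and writes
    them back, so in-batch values are iteratively refined."""
    out = {}
    for _ in range(sweeps):
        # all pulls see the cache state from the previous sweep (Jacobi)
        out = {v: mean_vecs(cache.pull(n) for n in adj[v] + [v]) for v in batch}
        for v, h in out.items():
            cache.push(v, h)
    return out
```

The single-sweep case corresponds to using historical embeddings as-is; the extra sweeps are the relaxation step that nudges in-batch embeddings toward what a full-graph forward pass would compute.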

Biography:

Jana Vatter is a research associate at the Technical University of Munich (TUM) at the Chair for Decentralized Information Systems and Data Management. She received her M.Sc. in Computer Science from the Technical University of Darmstadt in 2021, where she worked as a student assistant at the Ubiquitous Knowledge Processing (UKP) Lab. Her current work focuses on efficient GNN training methods for large-scale graphs, and her research interests span graphs and machine learning broadly.