
Paper Accepted to MLSys 2025: Scaling Graph Learning with Pre-Propagation GNNs
This work provides the first in-depth, system-level study of Pre-Propagation GNNs (PP-GNNs), a promising alternative to traditional message-passing GNNs that sidesteps the neighbor-explosion problem by pre-processing aggregated features before training. While PP-GNNs match the accuracy of graph-sampling methods, we uncover previously unexamined performance bottlenecks in data loading and in scaling to large graphs. Our proposed optimizations boost training throughput by an average of 15×, reaching up to 100× speedup on large-scale graphs. These findings reshape the system design space for scalable graph learning.
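To give a flavor of the pre-propagation idea, here is a minimal sketch in NumPy of the general technique (in the style of SGC/SIGN-like models): multi-hop neighborhood aggregation is computed once, offline, so training needs no neighbor sampling. The function name and the toy graph are illustrative assumptions, not the paper's actual code or API.

```python
import numpy as np

def pre_propagate(adj, feats, num_hops=2):
    """Precompute multi-hop aggregated features offline.

    Illustrative sketch of the general pre-propagation idea;
    not the API of the paper's implementation.
    """
    # Symmetrically normalize the adjacency matrix with self-loops.
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    a_norm = a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Each hop is one matrix product, done once before training;
    # this is what removes neighbor sampling from the training loop.
    hops = [feats]
    for _ in range(num_hops):
        hops.append(a_norm @ hops[-1])

    # Concatenate all hops so a downstream model (e.g. an MLP)
    # sees multi-scale neighborhood information directly.
    return np.concatenate(hops, axis=1)

# Toy example: a 4-node path graph with 3-dimensional node features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.random.rand(4, 3)
x = pre_propagate(adj, feats, num_hops=2)
print(x.shape)  # (4, 9): original features plus two propagated hops
```

Because the expensive graph traversal happens entirely in this pre-processing step, the training loop itself touches only dense feature matrices, which is what makes data loading, rather than neighbor sampling, the dominant cost the paper studies.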
Congratulations to the authors!
Check out the implementation here: https://github.com/cornell-zhang/preprop-gnn.
Yixiao Du
publication