Multi-Domain Riemannian Graph Gluing for Building Graph Foundation Models
Li Sun, Zhenhao Huang, Silei Chen, Lanxu Yang, Junda Ye, Sen Su, Philip S. Yu
From a differential geometry perspective, we present a novel framework that merges multi-domain graphs into a unified, smooth manifold with geometric consistency, enabling quantifiable transferability and geometric scaling behavior.
Abstract
Multi-domain graph pre-training integrates knowledge from diverse domains to enhance performance in target domains, which is crucial for building graph foundation models. Despite initial success, existing solutions often fall short of answering a fundamental question: how is knowledge integrated or transferred across domains? This theoretical limitation motivates us to rethink the consistency and transferability between the pre-trained model and target domains. In this paper, we propose a fresh differential geometry perspective, whose core idea is to merge any graph dataset into a unified, smooth Riemannian manifold, enabling a systematic understanding of knowledge integration and transfer. To achieve this, our key contribution is the theoretical establishment of neural manifold gluing, which first characterizes local geometry using an adaptive orthogonal frame and then “glues” the local pieces together into a coherent whole. Building on this theory, we present the GraphGlue framework, which supports batched pre-training with EMA prototyping and provides a transferability measure based on geometric consistency. Extensive experiments demonstrate its superior performance across diverse graph domains. Moreover, we empirically validate GraphGlue’s geometric scaling law, showing that larger quantities of pre-training data improve model transferability by producing a smoother manifold.
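The gluing mechanism can be pictured with a small numerical sketch. The snippet below is a minimal illustration based only on the abstract's description: each domain's local geometry is summarized by an orthonormal frame, and two frames over the same region are aligned by an orthogonal transition map. The function names (`local_frame`, `glue_frames`) and the SVD/QR and Procrustes choices are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of "neural manifold gluing" (illustrative, not the paper's code):
# 1) summarize each domain's local geometry with an orthonormal frame;
# 2) "glue" two frames together via an orthogonal transition map (Procrustes).
import numpy as np

def local_frame(X: np.ndarray, k: int) -> np.ndarray:
    """Orthonormal frame spanning the top-k local directions of embeddings X (n, d)."""
    Xc = X - X.mean(axis=0, keepdims=True)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # principal local directions
    Q, _ = np.linalg.qr(Vt[:k].T)                      # re-orthonormalize for stability
    return Q  # (d, k), orthonormal columns

def glue_frames(Qa: np.ndarray, Qb: np.ndarray) -> np.ndarray:
    """Orthogonal k-by-k map R minimizing ||Qb @ R - Qa||_F (orthogonal Procrustes)."""
    U, _, Vt = np.linalg.svd(Qb.T @ Qa)
    return U @ Vt

# Toy usage: two "domains" sampled from the same 3-dim patch embedded in R^8.
rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.normal(size=(8, 3)))           # shared tangent subspace
Xa = rng.normal(size=(200, 3)) @ W.T + 0.01 * rng.normal(size=(200, 8))
Xb = rng.normal(size=(200, 3)) @ W.T + 0.01 * rng.normal(size=(200, 8))
Qa, Qb = local_frame(Xa, k=3), local_frame(Xb, k=3)
R = glue_frames(Qa, Qb)
print(np.linalg.norm(Qb @ R - Qa))  # near zero: the frames agree after gluing
```

The Procrustes step plays the role of a chart transition map: once every pair of overlapping local pieces admits such a map, the pieces can be assembled into a single coherent manifold, which is the standard gluing construction in differential geometry.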
GraphGlue uses Riemannian geometry to merge multi-domain graphs into a unified manifold, enabling knowledge transfer across graph domains.
- Theoretical establishment of neural manifold gluing, which characterizes local geometry with adaptive orthogonal frames
- Framework supporting batched pre-training with EMA prototyping and a geometric transferability measure (see the sketch after this list)
- Empirical validation of a geometric scaling law: larger pre-training corpora yield a smoother manifold and better transferability
- Supports multi-domain graph pre-training from a differential geometry perspective
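For the second bullet, here is a minimal sketch of what EMA prototyping and a geometry-based transferability score could look like, assuming only the abstract's description. The momentum rule, the spherical-geodesic proxy for geometric consistency, and the names `update_prototype` and `transferability` are illustrative assumptions rather than the paper's definitions.

```python
# Minimal sketch of batched pre-training with EMA prototypes and a
# geometric-consistency transferability score (illustrative assumptions).
import numpy as np

def update_prototype(proto: np.ndarray, batch_emb: np.ndarray,
                     momentum: float = 0.99) -> np.ndarray:
    """EMA update: the prototype drifts slowly toward the current batch mean."""
    return momentum * proto + (1.0 - momentum) * batch_emb.mean(axis=0)

def transferability(proto_src: np.ndarray, proto_tgt: np.ndarray) -> float:
    """Geometric-consistency score between two domain prototypes.

    Uses the angle between unit prototypes (the geodesic distance on the
    sphere); closer geometry yields a score nearer to 1.
    """
    u = proto_src / np.linalg.norm(proto_src)
    v = proto_tgt / np.linalg.norm(proto_tgt)
    angle = np.arccos(np.clip(u @ v, -1.0, 1.0))  # geodesic distance on the sphere
    return float(np.exp(-angle))

# Toy usage: stream batches from two domains, tracking one prototype each.
rng = np.random.default_rng(1)
proto_a, proto_b = np.zeros(16), np.zeros(16)
for _ in range(100):
    proto_a = update_prototype(proto_a, rng.normal(loc=1.0, size=(32, 16)))
    proto_b = update_prototype(proto_b, rng.normal(loc=1.2, size=(32, 16)))
print(transferability(proto_a, proto_b))  # near 1: similar domain geometry
```

The EMA update keeps a slowly drifting summary of each domain during batched pre-training, and the score decays with the geodesic distance between unit prototypes, so geometrically consistent domains receive a transferability value near 1.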
- Differential geometry
- Neural manifold gluing
- Riemannian manifolds
- Graph neural networks
Authors did not state explicit limitations.
Authors did not state explicit future directions.
Author keywords
- Multi-domain graph pre-training
- graph neural network
- graph foundation model
- Riemannian geometry
Related orals
One for Two: A Unified Framework for Imbalanced Graph Classification via Dynamic Balanced Prototype
Unified framework for imbalanced graph classification using dynamic balanced prototypes and prototype load-balancing optimization.
Compactness and Consistency: A Conjoint Framework for Deep Graph Clustering
CoCo framework captures compactness and consistency in graph neural network representations for improved deep graph clustering.
Learning with Dual-level Noisy Correspondence for Multi-modal Entity Alignment
Proposes framework to handle noisy entity-attribute and inter-graph correspondences in multi-modal entity alignment.
Exchangeability of GNN Representations with Applications to Graph Retrieval
Graph embeddings exhibit an exchangeability property, enabling efficient graph retrieval via transport-based similarity approximation with locality-sensitive hashing.
Scaling Laws and Spectra of Shallow Neural Networks in the Feature Learning Regime
Analyzes scaling laws for shallow networks with feature learning via sparse estimation and matrix compression theory.