One for Two: A Unified Framework for Imbalanced Graph Classification via Dynamic Balanced Prototype
Guanjun Wang, Binwu Wang, Jiaming Ma, Zhengyang Zhou, Pengkun Wang, Xu Wang, Yang Wang
Abstract
Graph Neural Networks (GNNs) have advanced graph classification, yet they remain vulnerable to graph-level imbalance, encompassing both class imbalance and topological imbalance. To address both types of imbalance in a unified manner, we propose UniImb, a Unified framework for Imbalanced graph classification. Specifically, UniImb first captures multi-scale topological features and enhances data diversity via learnable personalized graph perturbations. It then employs a dynamic balanced prototype module to learn representative prototypes from graph instances, improving the quality of graph representations. Concurrently, a prototype load-balancing optimization term mitigates dominance by majority samples, equalizing sample influence during training. We justify these design choices theoretically using the Information Bottleneck principle. Extensive experiments on 19 datasets (including AirGraph, a large-scale imbalanced air pollution graph dataset that we release) against 23 baselines demonstrate that UniImb achieves superior performance across various imbalanced scenarios. Our code is available on GitHub.
Unified framework for imbalanced graph classification using dynamic balanced prototypes and prototype load-balancing optimization.
- Dynamic balanced prototype module to learn representative prototypes from graph instances under imbalanced settings
- Prototype load-balancing optimization term to equalize sample influence during training
- Theoretical justification using Information Bottleneck principle for design choices
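The prototype ideas above can be illustrated with a minimal sketch: graph embeddings are softly assigned to a small set of prototypes, and a balancing term penalizes prototypes that absorb a disproportionate share of the batch. All names here (`prototype_assignments`, `load_balancing_loss`, the cosine-similarity assignment, and the KL-to-uniform penalty) are hypothetical illustrations, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def prototype_assignments(embeddings, prototypes, temperature=0.1):
    """Soft-assign each graph embedding to prototypes via
    temperature-scaled cosine similarity (an assumed design choice)."""
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = z @ p.T                        # (batch, num_prototypes)
    return softmax(sim / temperature, axis=1)

def load_balancing_loss(assignments):
    """Penalize uneven prototype usage: the mean assignment per
    prototype should approach the uniform distribution 1/K.
    Implemented as KL(mean_usage || uniform) -- one plausible
    instantiation of a load-balancing term, not the paper's loss."""
    usage = assignments.mean(axis=0)     # average usage per prototype
    k = usage.shape[0]
    return float(np.sum(usage * np.log(usage * k + 1e-12)))

rng = np.random.default_rng(0)
emb = rng.normal(size=(32, 16))          # 32 graph-level embeddings
protos = rng.normal(size=(4, 16))        # 4 learnable prototypes
a = prototype_assignments(emb, protos)
loss = load_balancing_loss(a)            # 0 only when usage is perfectly uniform
```

In training, such a term would be added to the classification loss with a small weight, nudging minority-class samples toward prototypes that majority samples would otherwise monopolize.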
- Graph Neural Networks
- dynamic prototypes
- learnable graph perturbations
- feature mixup
- AirGraph
- 19 benchmark datasets
Current datasets primarily consist of homogeneous graphs, lacking the diversity and complexity of real-world heterogeneous networks
from the paper
Evaluate performance on heterogeneous graphs with multiple types of nodes and edges
from the paper
Extend the comparison of imbalanced graph learning algorithms beyond classification to few-shot learning, dynamic graph learning, and anomaly detection
from the paper
Author keywords
- Graph classification; graph imbalance learning; graph neural networks; graph data mining; long-tail learning
Related orals
Compactness and Consistency: A Conjoint Framework for Deep Graph Clustering
CoCo framework captures compactness and consistency in graph neural network representations for improved deep graph clustering.
Multi-Domain Riemannian Graph Gluing for Building Graph Foundation Models
GraphGlue uses Riemannian geometry to merge multi-domain graphs into unified manifolds, enabling knowledge transfer across graph domains.
Learning with Dual-level Noisy Correspondence for Multi-modal Entity Alignment
Proposes framework to handle noisy entity-attribute and inter-graph correspondences in multi-modal entity alignment.
Exchangeability of GNN Representations with Applications to Graph Retrieval
Graph embeddings exhibit exchangeability property, enabling efficient graph retrieval via transport-based similarity approximation with locality-sensitive hashing.
Scaling Laws and Spectra of Shallow Neural Networks in the Feature Learning Regime
Analyzes scaling laws for shallow networks with feature learning via sparse estimation and matrix compression theory.