ICLR 2026 Orals

Compactness and Consistency: A Conjoint Framework for Deep Graph Clustering

Wei Ju, Siyu Yi, Kangjie Zheng, Yifan Wang, Ziyue Qiao, Li Shen, Yongdao Zhou, Xiaochun Cao, Jiancheng Lv

Graph Learning · Thu, Apr 23 · 10:42 AM–10:52 AM · 203 A/B · Avg rating: 6.80 (range 4–8)

Abstract

Graph clustering is a fundamental task in data analysis that aims to group nodes with similar characteristics into clusters. The problem has been widely explored with graph neural networks (GNNs), which leverage both node attributes and graph topology for effective cluster assignments. However, representations learned through GNNs' local message-passing mechanisms typically struggle to capture global relationships between nodes. Moreover, the redundancy and noise inherent in graph data can leave node representations lacking compactness and robustness. To address these issues, we propose CoCo, a conjoint framework that captures compactness and consistency in the learned node representations for deep graph clustering. Technically, CoCo leverages graph convolutional filters to learn robust node representations from both local and global views, then encodes them into low-rank compact embeddings, removing redundancy and noise and uncovering the intrinsic underlying structure. To further enrich node semantics, we develop a consistency learning strategy on the compact embeddings that facilitates knowledge transfer between the two views. Experimental results indicate that CoCo outperforms state-of-the-art counterparts on various datasets. The code is available at https://github.com/juweipku/CoCo.
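The pipeline the abstract describes — two filtered views, low-rank compaction, and cross-view consistency — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: `two_view_features`, `compact_embedding`, and `consistency_loss` are stand-ins (propagation hop counts chosen arbitrarily, truncated SVD for the low-rank step, and a cosine-alignment loss for consistency).

```python
import numpy as np

def normalized_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def two_view_features(A, X, hops_local=1, hops_global=4):
    # Local view: few propagation hops; global view: many hops
    # (a crude proxy for local vs. global graph filters).
    P = normalized_adj(A)
    local = np.linalg.matrix_power(P, hops_local) @ X
    glob = np.linalg.matrix_power(P, hops_global) @ X
    return local, glob

def compact_embedding(H, rank=2):
    # Low-rank compaction via truncated SVD (a stand-in for the
    # paper's compact-embedding step).
    U, S, Vt = np.linalg.svd(H, full_matrices=False)
    return U[:, :rank] * S[:rank]

def consistency_loss(Z1, Z2):
    # Simplified consistency objective: 1 - mean cosine similarity
    # between the two views' embeddings (lower = more consistent).
    Z1n = Z1 / (np.linalg.norm(Z1, axis=1, keepdims=True) + 1e-8)
    Z2n = Z2 / (np.linalg.norm(Z2, axis=1, keepdims=True) + 1e-8)
    return float(1.0 - np.mean(np.sum(Z1n * Z2n, axis=1)))

# Toy usage on a 4-node path graph with random 3-dim features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.RandomState(0).randn(4, 3)
local, glob = two_view_features(A, X)
Z1, Z2 = compact_embedding(local), compact_embedding(glob)
loss = consistency_loss(Z1, Z2)
```

The two compact embeddings `Z1` and `Z2` would then feed a clustering head, with the consistency loss pulling the views toward agreement.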

One-sentence summary · Auto-generated by claude-haiku-4-5-20251001

CoCo framework captures compactness and consistency in graph neural network representations for improved deep graph clustering.

Contributions
  • Proposes CoCo, a conjoint framework leveraging graph convolutional filters with local and global views for deep graph clustering
  • Encodes learned representations into low-rank compact embeddings via GMM to remove redundancy and noise
  • Develops consistency learning strategy based on compact embeddings to enrich node semantics
  • Demonstrates state-of-the-art performance across diverse graph types with lower time and space complexity
Methods used
  • Graph convolutional networks
  • Gaussian mixture models
  • Low-rank matrix factorization
  • Contrastive learning
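Of the methods listed, the Gaussian-mixture step can be illustrated with a minimal EM update for soft cluster assignment. This is a generic sketch under simplifying assumptions (spherical components with shared unit variance, hand-picked initial means), not CoCo's actual clustering routine.

```python
import numpy as np

def gmm_em_step(Z, means, var=1.0):
    # E-step: responsibilities under spherical Gaussians with a
    # shared variance (log-space for numerical stability).
    d2 = ((Z[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    logp = -d2 / (2.0 * var)
    logp -= logp.max(axis=1, keepdims=True)
    R = np.exp(logp)
    R /= R.sum(axis=1, keepdims=True)
    # M-step: re-estimate means from the soft assignments.
    means = (R.T @ Z) / R.sum(axis=0)[:, None]
    return R, means

# Toy usage: two well-separated 2-D blobs, two components.
rng = np.random.RandomState(0)
Z = np.vstack([rng.randn(20, 2) + 5.0, rng.randn(20, 2) - 5.0])
means = np.array([[1.0, 1.0], [-1.0, -1.0]])  # hand-picked init
for _ in range(5):
    R, means = gmm_em_step(Z, means)
labels = R.argmax(axis=1)  # hard cluster assignments
```

In a full pipeline the responsibilities `R` act as soft cluster memberships over the compact node embeddings.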
Datasets used
  • Homophilic graphs
  • Heterophilic graphs
  • Noisy graphs
Limitations (author-stated)
  • Feature reconstruction is performed outside the gradient flow, so residual connections must be injected to maintain trainability
Future work (author-stated)
  • Extend the model to temporal graph clustering and single-cell genomic clustering to analyze dynamic structures and genetic profiles

Author keywords

  • Graph Neural Networks
  • Graph Clustering
  • Representation Learning
  • Compactness Learning
  • Consistency Learning
