ICLR 2026 Orals

Information Shapes Koopman Representation

Xiaoyuan Cheng, Wenxuan Yuan, Yiming Yang, Yuanzhao Zhang, Sibo Cheng, Yi He, Zhuo Sun

Uncategorized · Thu, Apr 23 · 10:30 AM–10:40 AM · Room 204 A/B · Avg rating: 5.50 (range 4–6)
Author-provided TL;DR

Because the Koopman operator is infinite-dimensional, identifying tractable finite-dimensional subspaces is challenging. We aim to construct these subspaces through information theory.

Abstract

The Koopman operator provides a powerful framework for modeling dynamical systems and has attracted growing interest from the machine learning community. However, its infinite-dimensional nature makes identifying suitable finite-dimensional subspaces challenging, especially for deep architectures. We argue that these difficulties stem from suboptimal representation learning, where latent variables fail to balance expressivity and simplicity. This tension is closely related to the information bottleneck (IB) dilemma: constructing compressed representations that are both compact and predictive. Rethinking Koopman learning through this lens, we demonstrate that latent mutual information promotes simplicity, yet an overemphasis on simplicity may cause the latent space to collapse onto a few dominant modes. Expressiveness, in contrast, is sustained by the von Neumann entropy, which prevents such collapse and encourages mode diversity. This insight leads us to an information-theoretic Lagrangian formulation that explicitly balances the tradeoff, and to a new algorithm based on this formulation that encourages both simplicity and expressiveness, yielding a stable and interpretable Koopman representation. Beyond quantitative evaluations, we visualize the learned manifolds under our representations and observe empirical results consistent with our theoretical predictions. Finally, we validate our approach across a diverse range of dynamical systems, demonstrating improved performance over existing Koopman learning methods.
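The collapse-vs.-diversity tradeoff in the abstract can be illustrated with a toy sketch (not the authors' implementation): treat the normalized latent covariance as a density matrix and compute the von Neumann entropy of its eigenvalue spectrum. A flat spectrum (many active modes) gives high entropy; collapse onto a few dominant modes drives it toward zero. All names, dimensions, and thresholds below are illustrative assumptions.

```python
import numpy as np

def von_neumann_entropy(z: np.ndarray) -> float:
    """Entropy of the normalized latent covariance spectrum.

    The covariance is scaled by its trace so that its eigenvalues
    form a probability distribution (a density-matrix analogue).
    """
    zc = z - z.mean(axis=0)                 # center the latents
    cov = zc.T @ zc / len(z)                # sample covariance
    rho = cov / np.trace(cov)               # unit-trace normalization
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]                  # drop numerically-zero modes
    return float(-(eig * np.log(eig)).sum())

# Toy latents: diverse modes vs. collapse onto one dominant mode
rng = np.random.default_rng(0)
diverse = rng.normal(size=(1000, 8))                      # all 8 modes active
collapsed = np.outer(rng.normal(size=1000), np.ones(8))   # rank-1 latents

print(von_neumann_entropy(diverse))    # near log(8), the maximum for 8 modes
print(von_neumann_entropy(collapsed))  # near 0
```

In a Lagrangian of the kind the abstract describes, such an entropy term would be weighted against a compression (mutual-information) term, penalizing exactly the rank-collapse that the second case exhibits.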

One-sentence summary · Auto-generated by claude-haiku-4-5-20251001

Proposes an information-theoretic Lagrangian formulation to balance simplicity and expressiveness in Koopman representation learning for dynamical systems.

Contributions · Auto-generated by claude-haiku-4-5-20251001
  • Information-theoretic perspective on Koopman representation balancing latent mutual information and von Neumann entropy
  • Lagrangian formulation that prevents latent space collapse while maintaining predictive power
  • Algorithm that improves performance across diverse dynamical systems tasks
Methods used · Auto-generated by claude-haiku-4-5-20251001
  • Koopman operator
  • Information bottleneck theory
  • Von Neumann entropy
  • Kernel methods
Limitations (author-stated) · Auto-generated by claude-haiku-4-5-20251001
  • Framework does not address sample complexity or non-asymptotic convergence of Koopman representation
Future work (author-stated) · Auto-generated by claude-haiku-4-5-20251001
  • Explore more rigorous theoretical analyses on sample complexity and non-asymptotic convergence
  • Extend conventional kernel techniques in Koopman theory through information-theoretic perspective

Author keywords

  • Koopman Operator
  • Latent subspace reconstruction
  • representation for physical systems
