ICLR 2026 Orals

Universal Inverse Distillation for Matching Models with Real-Data Supervision (No GANs)

Nikita Maksimovich Kornilov, David Li, Tikhon Mavrin, Aleksei Leonov, Nikita Gushchin, Evgeny Burnaev, Iaroslav Sergeevich Koshelev, Alexander Korotin

Diffusion & Flow Matching · Thu, Apr 23 · 11:18 AM–11:28 AM · 201 A/B · Avg rating: 6.00 (4–8)

Abstract

While achieving exceptional generative quality, modern diffusion, flow, and other matching models suffer from slow inference, as they require many steps of iterative generation. Recent distillation methods address this by training efficient one-step generators under the guidance of a pre-trained teacher model. However, these methods are often constrained to a single framework, e.g., only diffusion or only flow models. Furthermore, these methods are naturally data-free: to benefit from real data, they require additional, complex adversarial training with an extra discriminator model. In this paper, we present **RealUID**, a universal distillation framework for all matching models that seamlessly incorporates real data into the distillation procedure without GANs. Our **RealUID** approach offers a simple theoretical foundation that covers previous distillation methods for Flow Matching and Diffusion models, and also extends to their modifications, such as Bridge Matching and Stochastic Interpolants. The code can be found at https://github.com/David-cripto/RealUID.
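To make the distillation setup concrete: the abstract describes training a one-step generator under the guidance of a pre-trained multi-step teacher. Below is a minimal toy sketch of the *naive* regression-based version of this idea (not RealUID's actual objective): a teacher flow maps noise to data by integrating a velocity field over many Euler steps, and a one-step student is trained to reproduce the teacher's outputs directly. The linear teacher matrix `A` and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "teacher": a pre-trained flow mapping noise z to data x = A z.
# A is a hand-picked ground-truth transform for illustration only.
A = np.array([[2.0, 0.3],
              [0.0, 0.5]])

def teacher_sample(z, n_steps=100):
    """Many-step Euler integration along straight (rectified-flow-style)
    paths x_t = (1 - t) z + t (A z), whose velocity is (A - I) z."""
    v = z @ (A - np.eye(2)).T          # constant velocity per sample
    x = z.copy()
    dt = 1.0 / n_steps
    for _ in range(n_steps):
        x = x + dt * v                 # slow: n_steps network calls in practice
    return x

# One-step "student": a single linear map W, trained by regressing
# onto teacher outputs -- the simplest form of distillation.
W = np.eye(2)
lr = 0.1
for step in range(500):
    z = rng.standard_normal((64, 2))   # fresh noise batch
    target = teacher_sample(z)         # expensive multi-step teacher
    pred = z @ W.T                     # cheap one-step student
    grad = 2.0 * (pred - target).T @ z / len(z)   # MSE gradient w.r.t. W
    W -= lr * grad

print(np.round(W, 2))                  # W converges toward A
```

After training, the student generates in a single step what the teacher needed 100 integration steps to produce. Real distillation methods (and RealUID in particular) replace this naive output regression with more sophisticated objectives, but the one-step-student / many-step-teacher structure is the same.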

One-sentence summary · Auto-generated by claude-haiku-4-5-20251001

RealUID provides universal distillation for matching models without GANs, incorporating real data into one-step generator training.

Contributions · Auto-generated by claude-haiku-4-5-20251001
  • Universal distillation framework applicable to diffusion, flow, bridge matching and stochastic interpolant models
  • Seamless incorporation of real data into distillation without requiring discriminator networks
  • Simple theoretical foundation unifying previous distillation methods and extending to model variants
Methods used · Auto-generated by claude-haiku-4-5-20251001
  • Knowledge distillation
  • Flow matching
  • Diffusion models
  • Inverse distillation
Datasets used · Auto-generated by claude-haiku-4-5-20251001
  • CIFAR-10
  • CelebA
Limitations (author-stated) · Auto-generated by claude-haiku-4-5-20251001

Authors did not state explicit limitations.

Future work (author-stated) · Auto-generated by claude-haiku-4-5-20251001

Authors did not state explicit future directions.

Author keywords

  • Diffusion models
  • Flow Matching
  • Acceleration of diffusion/flow models
  • Distillation of diffusion/flow models

Related orals

Generative Human Geometry Distribution

Introduces a distribution-over-distribution model that combines geometry distributions with two-stage flow matching for 3D human generation.

Avg rating: 5.50 (2–8) · Xiangjun Tang et al.