ICLR 2026 Orals

Pareto-Conditioned Diffusion Models for Offline Multi-Objective Optimization

Jatan Shrestha, Santeri Heiskanen, Kari Hepola, Severi Rissanen, Pekka Jääskeläinen, Joni Pajarinen

Diffusion & Flow Matching Fri, Apr 24 · 4:15 PM–4:25 PM · 201 A/B Avg rating: 6.50 (6–8)
Author-provided TL;DR

We propose Pareto-Conditioned Diffusion (PCD), a novel framework for offline multi-objective optimization.

Abstract

Multi-objective optimization (MOO) arises in many real-world applications where trade-offs between competing objectives must be carefully balanced. In the offline setting, where only a static dataset is available, the main challenge is generalizing beyond observed data. We introduce Pareto-Conditioned Diffusion (PCD), a novel framework that formulates offline MOO as a conditional sampling problem. By conditioning directly on desired trade-offs, PCD avoids the need for explicit surrogate models. To effectively explore the Pareto front, PCD employs a reweighting strategy that focuses on high-performing samples and a reference-direction mechanism to guide sampling towards novel, promising regions beyond the training data. Experiments on standard offline MOO benchmarks show that PCD achieves highly competitive performance and, importantly, demonstrates greater consistency across diverse tasks than existing offline MOO approaches.
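The abstract's core idea, treating offline MOO as conditional sampling, can be illustrated with a toy DDPM-style loop. This is a hedged sketch, not the paper's implementation: `eps_model` is a stand-in for a trained denoiser, and the noise schedule and preference encoding are illustrative assumptions.

```python
import numpy as np

def eps_model(x, t, w):
    # Placeholder for a trained denoiser conditioned on the trade-off
    # vector w; a real model would be a neural network (assumption).
    return 0.1 * x + 0.01 * np.dot(w, np.arange(1, w.size + 1))

def sample_conditioned(w, dim=8, T=50, seed=0):
    """DDPM-style ancestral sampling steered by a desired trade-off w."""
    rng = np.random.default_rng(seed)
    betas = np.linspace(1e-4, 0.02, T)       # illustrative noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    x = rng.standard_normal(dim)             # start from pure noise
    for t in reversed(range(T)):
        eps = eps_model(x, t, w)             # w conditions every step
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        noise = rng.standard_normal(dim) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x

# Two different trade-offs yield two different candidate designs.
x_a = sample_conditioned(np.array([0.9, 0.1]))
x_b = sample_conditioned(np.array([0.1, 0.9]))
```

Because the denoiser receives `w` at every step, changing the desired trade-off changes the generated design without fitting an explicit surrogate model.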

One-sentence summary·Auto-generated by claude-haiku-4-5-20251001

Pareto-Conditioned Diffusion formulates offline multi-objective optimization as a conditional sampling problem, avoiding explicit surrogate models.

Contributions·Auto-generated by claude-haiku-4-5-20251001
  • Formulates offline MOO as a conditional sampling problem, conditioning directly on desired trade-offs
  • Employs a reweighting strategy that focuses on high-performing samples and a reference-direction mechanism for exploring the Pareto front
  • Demonstrates more consistent performance across diverse tasks than existing offline MOO approaches
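The reweighting idea above can be sketched with a rank-based scheme (an illustrative assumption; the paper's exact weighting is not reproduced here): dataset samples on earlier non-dominated fronts receive larger training weights, so the model sees high-performing designs more often.

```python
import numpy as np

def pareto_rank(Y):
    """Rank 0 = non-dominated; higher ranks peel off successive fronts.
    Y: (n, m) objective values, larger assumed better."""
    n = Y.shape[0]
    ranks = np.full(n, -1)
    remaining = np.arange(n)
    r = 0
    while remaining.size:
        Yr = Y[remaining]
        # A point is dominated if some point is >= in all objectives
        # and strictly > in at least one.
        dominated = np.array([
            np.any(np.all(Yr >= y, axis=1) & np.any(Yr > y, axis=1))
            for y in Yr
        ])
        ranks[remaining[~dominated]] = r
        remaining = remaining[dominated]
        r += 1
    return ranks

def rank_weights(Y, temperature=1.0):
    """Exponentially down-weight later fronts; normalize to sum to 1."""
    w = np.exp(-pareto_rank(Y) / temperature)
    return w / w.sum()

# Three mutually non-dominated points plus one dominated point.
Y = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.2, 0.2]])
w = rank_weights(Y)
```

In this toy example the dominated point `[0.2, 0.2]` receives a strictly smaller weight than the three front points.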
Methods used·Auto-generated by claude-haiku-4-5-20251001
  • Diffusion models
  • Conditional sampling
  • Pareto optimization
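The reference-direction mechanism from the abstract can likewise be sketched: choose a direction in objective space and place the conditioning target just beyond the best observed point along it, encouraging extrapolation past the dataset's front. The `step` scale and maximization convention are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def extrapolated_target(Y, direction, step=0.1):
    """Condition target pushed slightly beyond the observed front.
    Y: (n, m) objective values, larger assumed better."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    proj = Y @ d                      # score along this trade-off direction
    best = Y[np.argmax(proj)]         # best observed point that way
    return best + step * d            # target just beyond the front

Y = np.array([[1.0, 0.0], [0.0, 1.0], [0.6, 0.6]])
target = extrapolated_target(Y, direction=[1.0, 1.0])
```

Conditioning the sampler on such out-of-dataset targets is one way to steer generation towards novel regions, as the abstract describes.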
Limitations (author-stated)·Auto-generated by claude-haiku-4-5-20251001
  • Performance is limited on extremely high-dimensional continuous tasks (approximately 10,000 dimensions) because the MLP denoiser is applied directly to the parameter space
  • The framework is designed for continuous optimization; extending it to discrete tasks requires different approaches
Future work (author-stated)·Auto-generated by claude-haiku-4-5-20251001
  • Extend PCD with Latent Diffusion Models to operate in latent space for high-dimensional problems
  • Adopt a Transformer-based denoiser for generating high-dimensional neural network parameters
  • Extend the framework to discrete optimization tasks using continuous-space diffusion methods for categorical data
  • Extend to combinatorial optimization problems (e.g., TSP, CVRP) using constrained diffusion frameworks

Author keywords

  • Multi-Objective Optimization
  • Conditional Diffusion Models

Related orals

Generative Human Geometry Distribution

Introduces a distribution-over-distribution model combining geometry distributions with two-stage flow matching for human 3D generation.

Avg rating: 5.50 (2–8) · Xiangjun Tang et al.