Pareto-Conditioned Diffusion Models for Offline Multi-Objective Optimization
Jatan Shrestha, Santeri Heiskanen, Kari Hepola, Severi Rissanen, Pekka Jääskeläinen, Joni Pajarinen
We propose Pareto-Conditioned Diffusion (PCD), a novel framework for offline multi-objective optimization.
Abstract
Multi-objective optimization (MOO) arises in many real-world applications where trade-offs between competing objectives must be carefully balanced. In the offline setting, where only a static dataset is available, the main challenge is generalizing beyond observed data. We introduce Pareto-Conditioned Diffusion (PCD), a novel framework that formulates offline MOO as a conditional sampling problem. By conditioning directly on desired trade-offs, PCD avoids the need for explicit surrogate models. To effectively explore the Pareto front, PCD employs a reweighting strategy that focuses on high-performing samples and a reference-direction mechanism to guide sampling towards novel, promising regions beyond the training data. Experiments on standard offline MOO benchmarks show that PCD achieves highly competitive performance and, importantly, demonstrates greater consistency across diverse tasks than existing offline MOO approaches.
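The abstract mentions a reweighting strategy that focuses on high-performing samples. The paper does not specify the exact scheme here, but one plausible instantiation is to weight training samples by their Pareto non-domination rank, down-weighting dominated points exponentially. The sketch below (function names and the temperature parameter are illustrative assumptions, not the paper's method) shows such a rank-based reweighting for a maximization problem:

```python
import numpy as np

def pareto_ranks(Y):
    """Non-domination ranks for objective vectors Y of shape (n, m), maximization.

    Rank 0 is the non-dominated (Pareto) front; higher ranks are successively
    peeled-off dominated fronts.
    """
    n = Y.shape[0]
    ranks = np.full(n, -1)
    remaining = np.arange(n)
    rank = 0
    while remaining.size > 0:
        front = []
        for i in remaining:
            dominated = any(
                j != i and np.all(Y[j] >= Y[i]) and np.any(Y[j] > Y[i])
                for j in remaining
            )
            if not dominated:
                front.append(int(i))
        ranks[front] = rank
        remaining = np.array([i for i in remaining if int(i) not in set(front)])
        rank += 1
    return ranks

def rank_weights(Y, temperature=1.0):
    """One plausible reweighting: exponentially down-weight dominated samples."""
    r = pareto_ranks(Y)
    w = np.exp(-r / temperature)
    return w / w.sum()  # normalized sample weights for the training loss
```

Samples on the first front receive equal, maximal weight; each deeper front is discounted by a factor of exp(-1/temperature).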
Pareto-Conditioned Diffusion formulates offline multi-objective optimization as a conditional sampling problem, avoiding explicit surrogate models.
- Formulates offline MOO as a conditional sampling problem by conditioning directly on desired trade-offs
- Employs a reweighting strategy that focuses on high-performing samples and a reference-direction mechanism for exploring the Pareto front
- Demonstrates more consistent performance across diverse tasks than existing offline MOO approaches
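Framing offline MOO as conditional sampling means the reverse diffusion process generates candidate designs given a desired trade-off. The toy loop below illustrates the idea with a standard DDPM-style reverse process whose denoiser takes a preference vector as an extra input; the schedule, step count, and `denoiser` interface are illustrative assumptions, not PCD's actual architecture:

```python
import numpy as np

def sample_conditioned(denoiser, pref, dim, steps=50, rng=None):
    """Toy reverse-diffusion loop conditioned on a trade-off vector `pref`.

    `denoiser(x, t, pref)` is assumed to predict the noise in x at step t;
    in PCD it would be a learned network, here it is caller-supplied.
    """
    rng = np.random.default_rng(rng)
    betas = np.linspace(1e-4, 0.02, steps)       # linear noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    x = rng.standard_normal(dim)                 # start from pure noise
    for t in reversed(range(steps)):
        eps = denoiser(x, t, pref)               # preference-conditioned noise estimate
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        noise = rng.standard_normal(dim) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise     # ancestral sampling step
    return x
```

Varying `pref` across reference directions on the simplex would then trace out different regions of the Pareto front from a single trained model.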
- Diffusion models
- Conditional sampling
- Pareto optimization
Limitations
- Performance is limited on extremely high-dimensional continuous tasks (approximately 10,000 dimensions), since the MLP denoiser is applied directly to the parameter space
- The framework is designed for continuous optimization; extending it to discrete tasks requires different approaches
Future directions
- Extend PCD with latent diffusion models that operate in a latent space for high-dimensional problems
- Adopt a Transformer-based denoiser for generating high-dimensional neural network parameters
- Extend the framework to discrete optimization tasks using continuous-space diffusion methods for categorical data
- Extend to combinatorial optimization problems (e.g., TSP, CVRP) using constrained diffusion frameworks
Author keywords
- Multi-Objective Optimization
- Conditional Diffusion Models
Related orals
Universal Inverse Distillation for Matching Models with Real-Data Supervision (No GANs)
RealUID provides universal distillation for matching models without GANs, incorporating real data into one-step generator training.
GLASS Flows: Efficient Inference for Reward Alignment of Flow and Diffusion Models
GLASS Flows samples Markov transitions via inner flow matching models to improve inference-time reward alignment in flow and diffusion models.
Neon: Negative Extrapolation From Self-Training Improves Image Generation
Neon inverts model degradation from self-training by extrapolating away from it, improving generative models with minimal compute.
Generative Human Geometry Distribution
Introduces a distribution-over-distribution model combining geometry distributions with two-stage flow matching for human 3D generation.
Cross-Domain Lossy Compression via Rate- and Classification-Constrained Optimal Transport
Cross-domain lossy compression unifies rate and classification constraints via optimal transport framework.