ICLR 2026 Orals

Hyperparameter Trajectory Inference with Conditional Lagrangian Optimal Transport

Harry Amad, Mihaela van der Schaar

Reinforcement Learning & Agents · Thu, Apr 23 · 11:30 AM–11:40 AM · 204 A/B · Avg rating: 4.00 (2–6)

Abstract

Neural networks (NNs) often have critical behavioural trade-offs that are set at design time with hyperparameters, such as reward weights in reinforcement learning or quantile targets in regression. Post-deployment, however, user preferences can evolve, making the initial settings undesirable and necessitating potentially expensive retraining. To circumvent this, we introduce the task of Hyperparameter Trajectory Inference (HTI): learning, from observed data, how an NN's conditional output distribution changes with its hyperparameters, and constructing a surrogate model that approximates the NN at unobserved hyperparameter settings. HTI requires extending existing trajectory inference approaches to incorporate conditions, which exacerbates the challenge of ensuring inferred paths are feasible. We propose an approach based on conditional Lagrangian optimal transport, jointly learning the Lagrangian function governing hyperparameter-induced dynamics along with the associated optimal transport maps and geodesics between observed marginals, which together form the surrogate model. We incorporate inductive biases based on the manifold hypothesis and least-action principles into the learned Lagrangian, improving surrogate model feasibility. We demonstrate empirically that our approach reconstructs NN outputs across various hyperparameter spectra better than existing alternatives.
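The least-action idea behind the learned Lagrangian can be illustrated with a toy computation (a sketch of the general principle, not the paper's model; the potential U and all names here are illustrative assumptions): discretize a path between two endpoints, define an action under a Lagrangian L(x, v) = 0.5·||v||² + U(x) whose potential penalises leaving a stand-in "data manifold" (the unit circle), and run gradient descent on the action so that the path bends toward the low-cost region instead of cutting straight across.

```python
import numpy as np

def U(x):
    # hypothetical potential: squared distance from the unit circle,
    # a stand-in for staying near a low-dimensional data manifold
    return (np.linalg.norm(x, axis=-1) - 1.0) ** 2

def action(path, dt):
    # discretized action of L(x, v) = 0.5*||v||^2 + U(x)
    v = np.diff(path, axis=0) / dt                   # finite-difference velocities
    return 0.5 * np.sum(v ** 2) * dt + np.sum(U(path[:-1])) * dt

def grad_action(path, dt):
    g = np.zeros_like(path)                          # endpoints stay fixed
    # kinetic term: discrete Laplacian of the path
    g[1:-1] = (2 * path[1:-1] - path[:-2] - path[2:]) / dt
    # potential term: pulls each interior point toward the circle
    r = np.linalg.norm(path[1:-1], axis=1, keepdims=True)
    g[1:-1] += 2 * (r - 1.0) * path[1:-1] / r * dt
    return g

x0, x1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # endpoints on the circle
K, dt = 21, 1.0 / 20
path = np.linspace(x0, x1, K)                        # straight-chord initialisation

for _ in range(1500):
    path -= 0.01 * grad_action(path, dt)             # gradient descent on the action

# the optimised path bows outward toward the unit circle rather than
# taking the straight chord through the circle's interior
```

The straight chord minimises kinetic energy alone; adding the manifold potential changes the minimiser to a curve hugging the circle, which is the feasibility bias the abstract describes in miniature.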

One-sentence summary · Auto-generated by claude-haiku-4-5-20251001

Hyperparameter Trajectory Inference uses conditional Lagrangian optimal transport to reconstruct neural network outputs across hyperparameter spectra without expensive retraining.

Contributions
  • Proposes Hyperparameter Trajectory Inference (HTI) task to learn how conditional output distributions change with hyperparameters
  • Develops approach based on conditional Lagrangian optimal transport jointly learning Lagrangian function and optimal transport maps
  • Incorporates inductive biases from manifold hypothesis and least-action principles to improve surrogate model feasibility
Methods used
  • Conditional Lagrangian optimal transport
  • Trajectory inference
  • Optimal transport maps
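How an optimal transport map can interpolate between output distributions observed at two hyperparameter settings is easiest to see in one dimension (a toy displacement-interpolation sketch, not the paper's conditional Lagrangian method; the distributions and all names are illustrative assumptions): in 1-D the Monge map is the quantile-to-quantile map between sorted samples, and linearly interpolating along it yields a surrogate distribution for unobserved intermediate settings.

```python
import numpy as np

rng = np.random.default_rng(0)
y_lo = rng.normal(0.0, 1.0, size=5000)   # NN outputs observed at hyperparameter h = 0
y_hi = rng.normal(3.0, 2.0, size=5000)   # NN outputs observed at hyperparameter h = 1

# 1-D Monge map via order statistics: the i-th smallest sample of y_lo
# is transported to the i-th smallest sample of y_hi
src, dst = np.sort(y_lo), np.sort(y_hi)

def surrogate(h):
    """Approximate samples at an unobserved hyperparameter h in [0, 1]
    by displacement interpolation along the transport map."""
    return (1.0 - h) * src + h * dst

mid = surrogate(0.5)
# for Gaussian marginals the interpolant is again Gaussian, with linearly
# interpolated mean and standard deviation: here roughly N(1.5, 1.5^2)
```

The straight-line interpolation corresponds to a purely kinetic Lagrangian; the paper's learned Lagrangian replaces these straight transport paths with geodesics shaped by manifold and least-action biases.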
Limitations (author-stated)
  • HTI will be challenging when the underlying dynamics are chaotic, making inference from sparse samples inherently difficult
  • Method applicable only for varying a single, continuous hyperparameter
  • Relatively simple settings demonstrated; further investigation across a wider range of hyperparameter landscapes is warranted
Future work (author-stated)
  • Explore extensions to handle multiple hyperparameters simultaneously

Author keywords

  • hyperparameter
  • optimal transport
  • trajectory inference
  • manifold learning
  • interpolation
