ICLR 2026 Orals

Pinet: Optimizing hard-constrained neural networks with orthogonal projection layers

Panagiotis D. Grontas, Antonio Terpin, Efe C. Balta, Raffaello D'Andrea, John Lygeros

Theory & Optimization Sat, Apr 25 · 3:51 PM–4:01 PM · 202 A/B Avg rating: 6.50 (6–8)

Abstract

We introduce an output layer for neural networks that ensures satisfaction of convex constraints. Our approach, $\Pi$net, leverages operator splitting for rapid and reliable projections in the forward pass, and the implicit function theorem for backpropagation. We deploy $\Pi$net as a feasible-by-design optimization proxy for parametric constrained optimization problems and obtain modest-accuracy solutions faster than traditional solvers when solving a single problem, and significantly faster for a batch of problems. We surpass state-of-the-art learning approaches by orders of magnitude in terms of training time, solution quality, and robustness to hyperparameter tuning, while maintaining similar inference times. Finally, we tackle multi-vehicle motion planning with non-convex trajectory preferences and provide $\Pi$net as a GPU-ready package implemented in JAX.
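The abstract's core idea, a projection onto a convex set computed by operator splitting in the forward pass, can be illustrated with a minimal sketch. The code below is not the paper's $\Pi$net implementation; it projects onto the intersection of a box and a hyperplane using Dykstra's alternating projections, a simple operator-splitting scheme, and all names and constraint choices here are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's Pi-net code): project a network output
# onto {x : lo <= x <= hi, a^T x = b} via Dykstra's algorithm, a simple
# operator-splitting scheme built from the two individual projections.
import jax
import jax.numpy as jnp

def proj_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n
    return jnp.clip(x, lo, hi)

def proj_hyperplane(x, a, b):
    # Euclidean projection onto the hyperplane {x : a^T x = b}
    return x - a * (a @ x - b) / (a @ a)

def dykstra_projection(y, lo, hi, a, b, num_iters=100):
    """Project y onto the intersection of the box and the hyperplane."""
    p = jnp.zeros_like(y)  # Dykstra correction for the box
    q = jnp.zeros_like(y)  # Dykstra correction for the hyperplane
    def body(_, state):
        x, p, q = state
        u = proj_box(x + p, lo, hi)
        p = x + p - u
        x_new = proj_hyperplane(u + q, a, b)
        q = u + q - x_new
        return x_new, p, q
    x, _, _ = jax.lax.fori_loop(0, num_iters, body, (y, p, q))
    return x

# Usage: treat the projection as a feasibility-enforcing output layer.
y = jnp.array([2.0, -1.0, 0.5])          # raw network output
x = dykstra_projection(y, lo=0.0, hi=1.0, a=jnp.ones(3), b=1.0)
```

The returned `x` satisfies both constraints (up to solver tolerance) regardless of the raw output `y`, which is the feasible-by-design property the abstract describes.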

One-sentence summary·Auto-generated by claude-haiku-4-5-20251001

Enforces convex output constraints via an operator-splitting projection layer, enabling fast solution of parametric optimization problems.

Contributions·Auto-generated by claude-haiku-4-5-20251001
  • Introduces the $\Pi$net output layer, which ensures satisfaction of convex constraints via operator splitting
  • Leverages implicit function theorem for efficient backpropagation
  • Obtains modest-accuracy solutions faster than traditional solvers for single problems
  • Achieves orders of magnitude faster training than learning approaches with similar inference times
Methods used·Auto-generated by claude-haiku-4-5-20251001
  • Operator splitting
  • Convex optimization
  • Implicit function theorem
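The implicit function theorem listed above lets gradients bypass the iterative solver entirely: for a fixed point $x^\*(\theta)$ of $x = T(\theta, x)$, it gives $\mathrm{d}x^\*/\mathrm{d}\theta = (I - \partial T/\partial x)^{-1}\,\partial T/\partial \theta$, evaluated only at the converged point. The sketch below demonstrates this on a hypothetical scalar fixed point, not $\Pi$net's projection operator, and checks the result against naive differentiation through the unrolled iterations.

```python
# Minimal sketch of implicit-function-theorem backpropagation on a toy
# scalar fixed point x = T(theta, x) (hypothetical example, not Pi-net's
# projection): the gradient needs only the converged point, not the solver.
import jax
import jax.numpy as jnp

def T(theta, x):
    # A contraction in x (|dT/dx| <= 0.5), so the fixed point exists.
    return jnp.tanh(theta + 0.5 * x)

def solve(theta, num_iters=200):
    # Plain fixed-point iteration; the forward "solver".
    x = 0.0
    for _ in range(num_iters):
        x = T(theta, x)
    return x

theta = 0.3
x_star = solve(theta)

# Implicit gradient: dx*/dtheta = (1 - dT/dx)^{-1} dT/dtheta at x*.
dT_dx = jax.grad(T, argnums=1)(theta, x_star)
dT_dtheta = jax.grad(T, argnums=0)(theta, x_star)
implicit_grad = dT_dtheta / (1.0 - dT_dx)

# Reference: differentiate through all unrolled solver iterations.
unrolled_grad = jax.grad(solve)(theta)
```

The two gradients agree to numerical precision, but the implicit route costs one linear solve at the fixed point instead of storing and backpropagating through every solver iteration, which is what makes this strategy attractive for an iterative projection layer.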
Limitations (author-stated)·Auto-generated by claude-haiku-4-5-20251001
  • Requires convex constraint sets; many applications involve only convex constraints, but future work should relax this assumption
Future work (author-stated)·Auto-generated by claude-haiku-4-5-20251001
  • Investigate sequential convexification for non-convex constraints
  • Apply to neural PDE solvers, scheduling, and robotics applications
  • Integrate hard constraints into large-scale models

Author keywords

  • hard constrained neural networks
  • network architecture
  • implicit layers
  • operator splitting
  • optimization
