Self-Certifying Primal-Dual Optimization Proxies for Large-Scale Batch Economic Dispatch

ArXiv cs.AI

arXiv:2510.15850v2 Announce Type: replace-cross Abstract: Recent research has shown that optimization proxies can be trained to high fidelity, achieving average optimality gaps under 1% on large-scale problems. However, worst-case analyses show that there exist in-distribution queries with optimality gaps orders of magnitude higher, making it difficult to trust the predictions in practice. This paper aims at striking a balance between classical solvers and optimization proxies in order …
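The gap between average-case and worst-case behavior that motivates the paper can be illustrated with a small sketch. This is a hypothetical, synthetic example (not from the paper): it fabricates a batch of queries where a proxy's predicted costs are almost always within 1% of the solver's optimum, with a handful of outliers, and shows how the mean gap hides the worst case.

```python
import numpy as np

# Hypothetical illustration: average vs. worst-case relative optimality gap
# of a learned proxy over a batch of in-distribution queries.
rng = np.random.default_rng(0)

optimal_cost = rng.uniform(1e3, 1e4, size=10_000)  # solver objective values
rel_error = rng.uniform(0.0, 0.01, size=10_000)    # most gaps are under 1%
rel_error[:5] = rng.uniform(0.5, 1.0, size=5)      # a few rare large-gap queries
proxy_cost = optimal_cost * (1.0 + rel_error)      # proxy's (suboptimal) cost

gap = (proxy_cost - optimal_cost) / optimal_cost
print(f"average gap:    {gap.mean():.2%}")
print(f"worst-case gap: {gap.max():.2%}")
```

The average gap stays well under 1% while the maximum gap is orders of magnitude larger, which is exactly why a self-certifying mechanism (detecting when to fall back to a classical solver) is useful.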

Published 14 Apr 2026