Neural Global Optimization via Iterative Refinement from Noisy Samples
arXiv cs.AI
arXiv:2604.03614v1 Announce Type: cross

Abstract: Global optimization of black-box functions from noisy samples is a fundamental challenge in machine learning and scientific computing. Traditional methods such as Bayesian optimization often converge to local minima on multi-modal functions, while gradient-free methods require many function evaluations. We present a novel neural approach that learns to find global minima through iterative refinement. Our model takes noisy function samples and the
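The abstract contrasts the proposed neural model with classical gradient-free baselines that refine a search distribution from noisy samples. The paper's architecture is not described in the excerpt, so as an illustration of the baseline setting only, here is a minimal sketch of a classical cross-entropy method (not the paper's neural model) finding the global minimum of a multi-modal function from noisy evaluations; the test function and all parameters are invented for this example:

```python
import numpy as np

def f(x):
    # Illustrative multi-modal test function; its global minimum
    # lies near x ≈ -0.31, with shallower local minima elsewhere.
    return x**2 + 2.0 * np.sin(5.0 * x)

rng = np.random.default_rng(0)

def noisy_eval(x, sigma=0.1):
    # Black-box access: we only see function values corrupted by noise.
    return f(x) + rng.normal(0.0, sigma, size=np.shape(x))

# Cross-entropy method: iteratively refine a Gaussian sampling
# distribution toward regions with the lowest (noisy) values.
mu, std = 3.0, 2.0                              # start far from the optimum
for _ in range(30):
    xs = rng.normal(mu, std, size=200)          # draw candidate points
    ys = noisy_eval(xs)                         # noisy function samples
    elite = xs[np.argsort(ys)[:20]]             # keep the best 10%
    mu, std = elite.mean(), elite.std() + 1e-3  # refit the distribution

print(f"estimated global minimizer: x ≈ {mu:.2f}")
```

Like the methods the abstract critiques, this baseline spends many function evaluations (6,000 here) and can stall in a local basin if the elite fraction or noise level is unfavorable, which is the gap a learned refinement model aims to close.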