Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers

Abstract

We present two software packages for derivative-free optimization (DFO): DFO-LS for nonlinear least-squares problems and Py-BOBYQA for general objectives, both with optional bound constraints. Inspired by the Gauss–Newton method, DFO-LS constructs simplified linear regression models for the residuals and allows flexible initialization for expensive problems, whereby it can begin making progress after as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for a single gradient evaluation. DFO-LS has improved robustness to noise, allowing sample averaging, regression-based model construction, and multiple restart strategies together with an auto-detection mechanism. Our extensive numerical experimentation shows that restarting the solver when stagnation is detected is a cheap and effective mechanism for achieving robustness, with superior performance over sampling and regression techniques. The package Py-BOBYQA is a Python implementation of BOBYQA (Powell 2009), with novel features including strategies for robustness to noise. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers on noisy problems. In our comparisons, we introduce an adaptive accuracy measure for data profiles of noisy functions, striking a balance between measuring improvement in the true objective and in the noisy objective.
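As a usage illustration, the following is a minimal sketch of calling both solvers on a small noisy problem. It assumes the packages are installed (e.g., via pip as DFO-LS and Py-BOBYQA) and expose the solve entry points with the objfun_has_noise flag described in their documentation; the Rosenbrock-style residuals and the noise model are illustrative choices, not test problems from the paper.

    import numpy as np
    import dfols
    import pybobyqa

    rng = np.random.default_rng(0)

    # Rosenbrock-style residuals with multiplicative noise; an
    # illustrative test problem, not one of the paper's benchmarks.
    def residuals(x):
        r = np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
        return r * (1.0 + 1e-2 * rng.standard_normal(r.shape))

    x0 = np.array([-1.2, 1.0])

    # DFO-LS: objfun returns the residual vector r(x), and the solver
    # minimizes ||r(x)||^2. Per the package documentation,
    # objfun_has_noise=True enables the noise-robust settings
    # (e.g., the restart strategies described above).
    soln = dfols.solve(residuals, x0, objfun_has_noise=True)
    print(soln)

    # Py-BOBYQA: objfun returns a scalar, here the sum of squared
    # residuals, so both solvers target the same problem.
    def objective(x):
        r = residuals(x)
        return float(np.dot(r, r))

    soln2 = pybobyqa.solve(objective, x0, objfun_has_noise=True)
    print(soln2)

Note that DFO-LS works with the residual vector directly, exploiting the least-squares structure, while Py-BOBYQA sees only scalar objective values.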

Publication
ACM Transactions on Mathematical Software