Derivative-Free Optimization with Convex Constraints [slides available]


When the objective function is computationally expensive and/or noisy, gradient information is often impractical to obtain or too inaccurate to be useful. In such settings, so-called ‘derivative-free’ optimization (DFO) methods are a suitable alternative. In this talk, I will show how existing methods for interpolation-based DFO can be extended to nonconvex problems with convex constraints, where the feasible set is accessed only through projections. I will introduce a worst-case complexity analysis and show how existing geometric considerations of model accuracy (from the unconstrained setting) can be generalized to the constrained case. I will then show numerical results for nonlinear least-squares optimization. This is joint work with Matthew Hough (University of Queensland and University of Waterloo).
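As a rough illustration of the "access only through projections" idea (a sketch, not the algorithm from the talk), the following assumes the feasible set is a Euclidean ball, for which the projection has a closed form; a trial point produced by a local model step is simply projected back onto the set. The function names and the choice of constraint set are illustrative assumptions.

```python
import numpy as np

def proj_ball(x, center, radius):
    """Euclidean projection onto the ball {y : ||y - center|| <= radius}.

    Points inside the ball are left unchanged; points outside are
    scaled back onto the boundary along the line to the center.
    """
    d = x - center
    n = np.linalg.norm(d)
    if n <= radius:
        return x
    return center + (radius / n) * d

def projected_trial_point(xk, g, step, center, radius):
    """Take a step along the model's descent direction -g, then project
    the candidate back onto the feasible set (here, a ball)."""
    return proj_ball(xk - step * g, center, radius)

# Example: a step from near the boundary is projected back to be feasible.
xk = np.array([0.9, 0.0])
g = np.array([-1.0, 0.0])          # model gradient pointing "inward-negative"
x_trial = projected_trial_point(xk, g, 0.5, np.zeros(2), 1.0)
```

The key point is that the optimization method never needs an algebraic description of the constraint set, only the ability to call a projection operator like `proj_ball`.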

14 Dec 2021
Lindon Roberts
MSI Fellow

My research is in numerical analysis, particularly nonconvex and derivative-free optimization.