Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems [slides & video available]

Abstract

Derivative-free optimisation (DFO) methods are an important class of optimisation routines with applications in areas such as image analysis and data science. However, in model-based DFO methods, the computational cost of constructing local models can be high, particularly for large-scale problems. Considering nonlinear least-squares problems, we improve on state-of-the-art DFO by performing dimensionality reduction in the observational space using sketching methods, avoiding the construction of a full local model. Our approach has a per-iteration computational cost that is linear in the problem dimension in a big data regime, and numerical evidence demonstrates that, compared to existing software, it has dramatically improved runtime performance on overdetermined least-squares problems. This is joint work with Coralia Cartis and Tyler Ferguson (University of Oxford).
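To give a rough feel for the observational-space sketching idea, here is a minimal, self-contained Python sketch. It is an illustration under simplifying assumptions, not the method or software from the talk: it uses a plain Gaussian sketch, a linear residual function, a coordinate-sample interpolation model, and arbitrary illustrative problem sizes (n, m, k).

```python
# Minimal illustration (not the authors' implementation): for an overdetermined
# nonlinear least-squares problem min_x ||r(x)||^2 with r: R^n -> R^m and m >> n,
# apply a random sketch S in R^{k x m} to the residuals and build the local
# derivative-free model for S r(x) (k outputs) instead of r(x) (m outputs),
# reducing the per-iteration cost of model construction.
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 5, 10_000, 20          # parameters, observations, sketch size (illustrative)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def r(x):
    """Residual vector of an overdetermined (here linear, for simplicity) problem."""
    return A @ x - b

# Gaussian sketch, scaled so that ||S r||^2 approximates ||r||^2 in expectation.
S = rng.standard_normal((k, m)) / np.sqrt(k)

def linear_interp_model(f, x0, delta=1e-3):
    """Fit a linear model f(x0 + s) ~ f(x0) + J s from n+1 sample points,
    using only evaluations of f (no derivatives), as a model-based DFO method would."""
    f0 = f(x0)
    J = np.empty((f0.size, x0.size))
    for i in range(x0.size):
        e = np.zeros_like(x0)
        e[i] = delta
        J[:, i] = (f(x0 + e) - f0) / delta
    return f0, J

x0 = np.zeros(n)

# Full model: an m x n Jacobian estimate (expensive to form and store when m is large).
r0, J_full = linear_interp_model(r, x0)

# Sketched model: only k x n, built from the k-dimensional sketched residuals.
sr0, J_sketch = linear_interp_model(lambda x: S @ r(x), x0)

# Gauss-Newton steps from the two models agree reasonably well,
# but the sketched model is far cheaper to construct.
step_full = np.linalg.lstsq(J_full, -r0, rcond=None)[0]
step_sketch = np.linalg.lstsq(J_sketch, -sr0, rcond=None)[0]
print("full-model step:    ", step_full)
print("sketched-model step:", step_sketch)
```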

Date
1 Dec 2020
Lindon Roberts
Lecturer

My research is in numerical analysis, particularly nonconvex and derivative-free optimization.