Improving the Flexibility and Robustness of Derivative-Free Optimisation Solvers

Abstract

Classical nonlinear optimisation algorithms require gradient evaluations to construct local approximations to the objective and to test for convergence. When the objective is expensive to evaluate or noisy, gradient evaluations may be too costly or too inaccurate to use, and we must turn to optimisation methods which do not require gradient information, so-called derivative-free optimisation (DFO). DFO has applications in areas such as climate modelling, hyperparameter tuning and generating adversarial examples in deep learning. In this talk, I will introduce DFO and discuss two new software packages: one for nonlinear least-squares problems and one for general minimisation problems. I will describe their novel features aimed at expensive and/or noisy problems, and show their state-of-the-art performance. Time permitting, I will also present a heuristic method which improves the ability of these methods to escape local minima, and show its favourable performance on global optimisation problems.
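
To give a concrete sense of how a derivative-free least-squares solver is driven by function values alone, the short sketch below minimises the classical Rosenbrock residuals without ever supplying a gradient or Jacobian. The abstract does not name the packages discussed in the talk; the example assumes DFO-LS, an open-source derivative-free least-squares solver whose dfols.solve(objfun, x0) interface takes only a residual function and a starting point, so it should be read as an illustration of the problem class rather than as the exact software presented.

import numpy as np
import dfols  # assumed example solver (pip install DFO-LS); not necessarily the package from the talk

# Residuals of the Rosenbrock test problem: minimise ||r(x)||^2 using only
# evaluations of r(x); the solver never sees derivative information.
def rosenbrock_residuals(x):
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

x0 = np.array([-1.2, 1.0])  # standard starting point for this test problem
soln = dfols.solve(rosenbrock_residuals, x0)

print(soln.x)  # approximate minimiser, close to (1, 1)
print(soln.f)  # final sum-of-squares objective value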

Date
8 Mar 2019
Event
Cambridge Image Analysis/Medical Imaging in Healthcare Seminar
Location
University of Cambridge
Lindon Roberts
Lecturer

My research is in numerical analysis, particularly nonconvex and derivative-free optimization.