Derivative-Free Algorithms for Nonconvex Optimisation


Classical nonconvex optimisation algorithms require gradient evaluations to construct local approximations to the objective function and to test for convergence. When the objective is expensive to evaluate and/or noisy, computing its gradient may be prohibitively expensive or inaccurate, so we must turn to optimisation methods that do not require gradient information, so-called derivative-free optimisation (DFO). DFO has applications in areas such as finance, climate modelling and machine learning. In this talk, I will introduce DFO methods and discuss recent progress in their theory and implementation, with a particular focus on least-squares problems (e.g. parameter fitting).
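To make the idea concrete, here is a minimal sketch of derivative-free least-squares parameter fitting using a simple coordinate (compass) search. This is an illustrative toy method chosen for brevity, not the model-based DFO algorithms the talk covers; the function names, data and tolerances are all assumptions for the example.

```python
def coordinate_search(f, x0, step=1.0, tol=1e-8, max_iter=10000):
    """Derivative-free coordinate (compass) search: poll +/- step along each
    coordinate using only function values; halve the step when no poll point
    improves the objective. A toy illustration of a DFO method."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5  # refine the poll radius once all directions fail
            if step < tol:
                break
    return x, fx

# Parameter fitting: recover (a, b) in the model y = a*t + b from data,
# minimising the sum of squared residuals without any gradient evaluations.
data = [(t / 10.0, 2.0 * (t / 10.0) + 1.0) for t in range(11)]
f = lambda p: sum((p[0] * t + p[1] - y) ** 2 for t, y in data)
params, val = coordinate_search(f, [0.0, 0.0])
print(params)  # close to the true parameters [2.0, 1.0]
```

Each iteration needs only objective values at the poll points, which is why such methods remain usable when gradients are unavailable or too noisy to trust.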

14 Oct 2019
Australian National University
Lindon Roberts

My research is in numerical analysis, particularly nonconvex and derivative-free optimisation.