Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization [slides available]

Abstract

When optimizing functions that are computationally expensive and/or noisy, gradient information is often impractical to obtain or inaccurate. As a result, so-called ‘derivative-free’ optimization methods are a suitable alternative. However, in existing techniques for derivative-free optimization, the linear algebra cost of constructing function approximations grows rapidly with the problem dimension. As a result, these algorithms are not as suitable for large-scale problems as derivative-based methods. In this talk, I will discuss a new derivative-free algorithm based on exploration of random subspaces, its worst-case complexity bounds, and some numerical results. This is joint work with Coralia Cartis (Oxford).
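To illustrate the idea behind random-subspace derivative-free least-squares methods, below is a minimal Python sketch (not the algorithm presented in the talk, whose details differ): at each iteration it draws a random low-dimensional subspace, estimates a reduced Jacobian there from residual evaluations only, and takes a safeguarded Gauss-Newton step. All names and defaults (`p`, `h`, `delta`) are illustrative assumptions.

```python
import numpy as np

def random_subspace_dfo_ls(r, x0, p=2, h=1e-6, delta=0.1, iters=100, rng=None):
    """Hedged sketch of a random-subspace derivative-free Gauss-Newton loop.

    Minimizes f(x) = 0.5 * ||r(x)||^2 using only evaluations of the
    residual vector r(x); no gradients are required.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        # Draw a random p-dimensional subspace: orthonormal basis Q (n x p).
        Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
        rx = r(x)
        # Approximate the reduced Jacobian J(x) @ Q by finite differences
        # along the p subspace directions -- p extra evaluations, not n.
        Js = np.column_stack([(r(x + h * Q[:, j]) - rx) / h for j in range(p)])
        # Gauss-Newton step in the subspace, crudely safeguarded by
        # clipping its length to a fixed trust-region radius delta.
        s, *_ = np.linalg.lstsq(Js, -rx, rcond=None)
        ns = np.linalg.norm(s)
        if ns > delta:
            s *= delta / ns
        x_new = x + Q @ s
        # Accept only if the residual norm decreases (a simple stand-in
        # for a proper trust-region acceptance and radius update).
        if np.linalg.norm(r(x_new)) < np.linalg.norm(rx):
            x = x_new
    return x
```

For a toy problem such as the Rosenbrock residuals, `r = lambda x: np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])`, each iteration costs only p + 1 residual evaluations and linear algebra in p dimensions rather than n, which is the source of the scalability gains discussed in the talk.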

Date
30 Sep 2021
Event
University of Leicester Applied Mathematics Seminar
Lindon Roberts
Lecturer

My research is in numerical analysis and data science, particularly nonconvex and derivative-free optimization.