Large-scale derivative-free optimization using random subspace methods

Abstract

Many standard optimization algorithms require cheap and accurate derivatives of the objective and/or constraint functions. However, when these functions are noisy, computationally expensive, or available only as black boxes, derivative information may be inaccurate or impractical to compute. Derivative-Free Optimization (DFO) encompasses a variety of techniques for nonlinear optimization in the absence of derivatives. Such techniques can nonetheless struggle on large-scale problems, for reasons including high linear algebra costs and the strong dimension dependence of worst-case complexity bounds. In this talk, I will discuss model-based and direct search DFO algorithms based on iterative searches in randomly drawn subspaces, and show how these methods can be used to improve the scalability of DFO. This is joint work with Coralia Cartis (Oxford) and Clément Royer (Paris Dauphine-PSL).
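
To make the subspace idea concrete, the following is a minimal sketch of a direct-search step carried out in a randomly drawn low-dimensional subspace. It is an illustrative toy rather than the algorithms analysed in the talk; the subspace dimension p, the sufficient-decrease constant, the step-size update factors, and the test function are assumptions made for this example.

# Schematic illustration of direct search restricted to a random subspace.
# Not the speaker's algorithm: p, the decrease test, and the step-size
# update rules below are illustrative choices only.
import numpy as np

def random_subspace_direct_search(f, x0, p=2, alpha=1.0, max_iters=200,
                                  gamma_inc=2.0, gamma_dec=0.5, tol=1e-8):
    """Poll along +/- directions spanning a randomly drawn p-dimensional subspace."""
    x = np.array(x0, dtype=float)
    n = x.size
    fx = f(x)
    rng = np.random.default_rng(0)
    for _ in range(max_iters):
        # Draw a random n-by-p subspace basis (scaled Gaussian sketch).
        Q = rng.standard_normal((n, p)) / np.sqrt(p)
        improved = False
        # Poll the 2p points x +/- alpha * Q e_j; accept on sufficient decrease.
        for j in range(p):
            for sign in (+1.0, -1.0):
                trial = x + sign * alpha * Q[:, j]
                f_trial = f(trial)
                if f_trial < fx - 1e-4 * alpha**2:
                    x, fx, improved = trial, f_trial, True
                    break
            if improved:
                break
        # Expand the step size on success, shrink it on failure.
        alpha = gamma_inc * alpha if improved else gamma_dec * alpha
        if alpha < tol:
            break
    return x, fx

if __name__ == "__main__":
    # Example: minimise a simple quadratic in 100 dimensions without derivatives.
    f = lambda x: float(np.sum(x**2))
    x_opt, f_opt = random_subspace_direct_search(f, np.ones(100))
    print(f_opt)

Because each iteration only generates and polls directions in a p-dimensional subspace, the per-iteration work scales with p rather than the full dimension n, which is the scalability motivation behind the subspace approach.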

Date
19 Sep 2022
Event
Curtin Centre for Optimisation and Decision Science Colloquium
Lindon Roberts
Lecturer

My research is in numerical analysis, particularly nonconvex and derivative-free optimization.