Large-Scale Derivative-Free Optimization using Subspace Methods [slides available]

Abstract

In existing techniques for model-based derivative-free optimization, the computational cost of constructing local models and Lagrange polynomials can be high. As a result, these algorithms are not as suitable for large-scale problems as derivative-based methods. In this talk, I will discuss a model-based derivative-free algorithm based on exploration of random subspaces, its worst-case complexity bounds, and some numerical results.
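To make the subspace idea concrete, here is a minimal sketch (my own illustrative toy, not the algorithm from the talk) of model-based derivative-free optimization restricted to a random subspace: each iteration draws a random low-dimensional subspace, builds a linear model of the objective there using only a few extra function evaluations, and steps along the model's descent direction. The function name, parameters, and backtracking safeguard are all assumptions for illustration.

```python
import numpy as np

def random_subspace_dfo(f, x0, p=2, iters=200, h=1e-6, step=0.5, seed=0):
    """Toy random-subspace derivative-free optimizer (illustrative sketch).

    Each iteration builds a linear model of f restricted to a random
    p-dimensional subspace via finite differences, so the per-iteration
    cost is p extra evaluations rather than n.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    for _ in range(iters):
        # Draw a random subspace basis (scaled Gaussian sketch matrix).
        Q = rng.standard_normal((n, p)) / np.sqrt(p)
        # Linear model of f along the subspace directions only.
        g = np.array([(f(x + h * Q[:, j]) - fx) / h for j in range(p)])
        d = -Q @ g  # model descent direction lifted back to R^n
        # Simple backtracking on the step length (a crude stand-in for
        # the trust-region machinery a real method would use).
        t = step
        while t > 1e-12:
            x_new = x + t * d
            f_new = f(x_new)
            if f_new < fx:
                x, fx = x_new, f_new
                break
            t *= 0.5
    return x, fx
```

Restricting the model to a p-dimensional subspace is what keeps the per-iteration cost low in large dimensions; a practical method would of course manage the subspace and step sizes far more carefully.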

Date
20 Jul 2021
Lindon Roberts
Lecturer

My research is in numerical analysis and data science, particularly nonconvex and derivative-free optimization.