Randomized Subspace Derivative-Free Optimization with Quadratic Models and Second-Order Convergence

Abstract

We consider model-based derivative-free optimization (DFO) for large-scale problems, based on iterative minimization in random subspaces. We provide the first worst-case complexity bounds for such methods for convergence to approximate second-order critical points, and show that these bounds have significantly improved dimension dependence compared to standard full-space methods, provided low-accuracy solutions are desired and/or the problem has low effective rank. We also introduce a practical subspace model-based method suitable for general objective minimization, based on iterative quadratic interpolation in subspaces, and show that it can solve significantly larger problems than state-of-the-art full-space methods, while also having comparable performance on medium-scale problems when allowed to use full-dimensional subspaces.
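To make the algorithmic template concrete, the following is a minimal illustrative sketch, not the paper's method: each iteration draws a random subspace via a scaled Gaussian sketch, fits a quadratic interpolation model of the objective restricted to that subspace by least squares, and takes a crude (Cauchy-point) trust-region step. All names (`f`, `x0`, `p`, `delta`) and the specific sampling, fitting, and acceptance rules are assumptions for illustration only.

```python
# Illustrative sketch of random-subspace model-based DFO (assumed design,
# not the authors' implementation): Gaussian sketch + least-squares
# quadratic model + Cauchy-point trust-region step.
import numpy as np

def quadratic_model(f, x, P, delta, rng):
    """Fit m(s) ~= f(x + P s) = c + g.s + 0.5 s.H.s by least squares on
    random sample points s_i in [-delta, delta]^p, where P is (n, p)."""
    p = P.shape[1]
    q = (p + 1) * (p + 2) // 2                       # no. of quadratic coeffs
    S = rng.uniform(-delta, delta, size=(2 * q, p))  # oversampled points
    iu = np.triu_indices(p)
    # design-matrix rows: [1, s, upper-triangular entries of s s^T]
    A = np.hstack([np.ones((len(S), 1)), S,
                   np.array([np.outer(s, s)[iu] for s in S])])
    y = np.array([f(x + P @ s) for s in S])          # subspace evaluations
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    c, g = coef[0], coef[1:1 + p]
    H = np.zeros((p, p))
    H[iu] = coef[1 + p:]
    H = H + H.T          # symmetric H so that 0.5 s.H.s matches the fit
    return c, g, H

def subspace_dfo(f, x0, p=2, delta=1.0, iters=100, seed=0):
    """Minimize f: R^n -> R by iterating over random p-dimensional subspaces."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        P = rng.standard_normal((n, p)) / np.sqrt(p)  # random subspace basis
        c, g, H = quadratic_model(f, x, P, delta, rng)
        gn = np.linalg.norm(g)
        if gn < 1e-12:                # model gradient negligible: shrink
            delta *= 0.5
            continue
        # Cauchy point along -g (a real solver would minimize the model
        # over the trust region more accurately)
        t = delta / gn
        gHg = g @ H @ g
        if gHg > 0:
            t = min(t, gn**2 / gHg)
        s = -t * g
        pred = -(g @ s + 0.5 * s @ H @ s)             # predicted decrease
        actual = f(x) - f(x + P @ s)                  # actual decrease
        if pred > 0 and actual / pred >= 0.1:         # accept and expand
            x = x + P @ s
            delta = min(2.0 * delta, 1e3)
        else:                                         # reject and shrink
            delta *= 0.5
    return x

# Example: a 500-dimensional quadratic solved in 3-dimensional subspaces.
x_min = subspace_dfo(lambda z: np.sum(z**2), np.ones(500), p=3, iters=200)
```

In this template, the subspace dimension p, the sketch distribution, the model-fitting scheme, and the trust-region management are the key design choices; the versions above are deliberately simplistic stand-ins for the analyzed method.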