Lindon Roberts

MSI Fellow

Australian National University

Biography

Lindon Roberts is an MSI Fellow at the Mathematical Sciences Institute, Australian National University. His research interests are in numerical analysis and data science, particularly nonconvex and derivative-free optimization.

Interests

  • Derivative-Free Optimization
  • Nonconvex Optimization
  • Numerical Analysis
  • Data Science

Education

  • PhD in Mathematics, 2020 (passed, not yet awarded)

    University of Oxford

  • Bachelor of Computational Sciences (Honours), 2011

    Australian National University

Research

Derivative-Free Optimization (DFO)

Optimization—finding the maximum or minimum of a function—is one of the most important classes of problem in computational mathematics, arising often in scientific and industrial applications. My focus is on nonlinear optimization, where the function to be optimized (the ‘objective’ function) is some nonlinear, nonconvex function with unknown structure.

Generally speaking, to optimize a nonlinear objective, you approximate it locally by some simpler function (such as a low-order Taylor series). To construct this simpler function, you need to evaluate the objective and its derivatives at some set of points. Evaluating the derivative of the objective can be done in several ways:

  • If you know the analytic form of the objective, you can compute its derivatives using calculus.
  • If you have access to computer code for evaluating the objective, automatic differentiation could be used to compute analytic derivatives.
  • Otherwise, you have to approximate the derivative, e.g. using finite differencing (see the sketch after this list).

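For illustration, here is a minimal sketch (in Python with NumPy, using hypothetical function names) of the last option: approximating a gradient by forward differences. Each component of the gradient costs one extra evaluation of the objective, so a full gradient needs n+1 evaluations in n dimensions.

    import numpy as np

    def forward_difference_gradient(f, x, h=1e-7):
        """Approximate the gradient of f at x using forward differences."""
        fx = f(x)
        grad = np.zeros_like(x)
        for i in range(len(x)):
            x_step = x.copy()
            x_step[i] += h  # perturb one coordinate at a time
            grad[i] = (f(x_step) - fx) / h
        return grad

    # Example: a smooth quadratic objective with known gradient 2*x
    f = lambda x: np.dot(x, x)
    x = np.array([1.0, -2.0, 0.5])
    print(forward_difference_gradient(f, x))  # approximately [2.0, -4.0, 1.0]
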
However, if the objective is a black box, expensive to evaluate or noisy (e.g. it is the output of a Monte Carlo simulation, or its evaluation involves truncating an iterative procedure after finitely many steps), these approaches may be impractical or inaccurate. Derivative-free optimization (DFO) is the field devoted to nonlinear optimization when you only have access to (possibly inaccurate) evaluations of the objective.
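
To see why noise causes trouble, consider a hedged continuation of the sketch above: if each evaluation carries noise of size roughly 1e-3, a forward difference with a small step divides that noise by the step size, and the derivative estimate is swamped.

    import numpy as np

    rng = np.random.default_rng(0)

    # The same quadratic objective, but every evaluation is corrupted by noise
    noisy_f = lambda x: np.dot(x, x) + 1e-3 * rng.standard_normal()

    x = np.array([1.0, -2.0, 0.5])
    h = 1e-7  # a step size that works well in the noise-free case
    e0 = np.array([1.0, 0.0, 0.0])
    fd = (noisy_f(x + h * e0) - noisy_f(x)) / h
    print(fd)  # true partial derivative is 2.0, but the noise term ~1e-3/1e-7 dominates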

My research

I develop and study model-based DFO algorithms, a class of DFO methods which incorporate features of derivative-based methods by building local approximations to the objective (e.g. via polynomial interpolation). I have developed several algorithms and software packages for solving least-squares problems with DFO, along with techniques which make model-based DFO more robust to noise, better able to escape local minima, and able to tackle large-scale problems via dimensionality reduction.
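
As a usage sketch, the snippet below calls one of these packages, DFO-LS, on a small least-squares problem (the Rosenbrock residuals). It assumes the dfols.solve(objfun, x0) entry point, where objfun returns the vector of residuals; consult the package documentation for the full interface and options.

    import numpy as np
    import dfols

    # Minimise ||r(x)||^2 where r(x) are the Rosenbrock residuals
    objfun = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    x0 = np.array([-1.2, 1.0])      # standard starting point
    soln = dfols.solve(objfun, x0)  # derivative-free least-squares solve
    print(soln.x)                   # should be close to [1.0, 1.0]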

Employment

[2019 – present] MSI Fellow, Australian National University

[2015 – 2019] Doctoral Student, University of Oxford

[2012 – 2015] Senior Analyst, Macquarie Group

  • Worked in the Quantitative Applications Division of the Risk Management Group. Responsible for implementing risk management models (particularly for market risk) and reviewing pricing models.
  • Macquarie Group is Australia’s largest investment bank, headquartered in Sydney.

Software

Click on each package for more details

trustregion

Python routines for solving trust-region subproblems

PyCUTEst

Python interface to CUTEst optimization testing package

DFO-LS

Derivative-free solver for nonlinear least-squares problems

Py-BOBYQA

General-purpose derivative-free optimization solver

DFO-GN

Derivative-Free Gauss-Newton solver

Teaching

Australian National University

University of Oxford

Qualifications

  • 2018, Professional Development Framework Supporting Learning Award, Staff and Educational Development Association — this is a teaching qualification accredited by the UK professional association for higher education, assessed by the University of Oxford.

Contact

  • +61 2 6125 4678
  • Hanna Neumann Building 145, Science Road, Australian National University, Canberra, ACT, 2601, Australia
  • Office 4.87