Lindon Roberts


University of Sydney


I am a lecturer at the School of Mathematics and Statistics, University of Sydney. My research interests are in numerical analysis and data science, particularly nonconvex and derivative-free optimization.

Details of my CV, publications, talks and software are below (or look at Google Scholar and Github). For general optimization resources, see below or my nonlinear optimization resources page.

Recent news:

  • (Aug-22) Our paper Model-Based Derivative-Free Methods for Convex-Constrained Optimization with Matthew Hough (University of Waterloo) has been accepted by SIAM Journal on Optimization.
  • (Jul-22) I will be speaking about large-scale DFO at the ARC OPTIMA training centre seminar series. Details including Zoom information here. Update: watch the recording here.
  • (Jul-22) I am excited to have started a new position as a lecturer in the School of Mathematics and Statistics at the University of Sydney.
  • (Jun-22) Our paper Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization with Coralia Cartis (University of Oxford) has been published in Mathematical Programming. Read the paper here.
  • (Apr-22) New paper out, Direct search based on probabilistic descent in reduced spaces with Clément Royer (Université Paris Dauphine-PSL), showing how using randomized subspaces can make direct search methods more efficient at scale. Read it here.
  • (Feb-22) Some thoughts I have about giving research seminars are now available on the ANU MSI website. They are based on Pierre Portal’s and my experiences and are aimed at students in any area of mathematics. Of course, they are just our opinions and shouldn’t be taken as definitive!
  • (Dec-21) Our paper Does Model Calibration Reduce Uncertainty in Climate Projections?, led by Simon Tett (University of Edinburgh), has been accepted by the Journal of Climate. Read the paper here, or my brief summary for mathematicians here.
  • (Dec-21) I will be speaking about convex-constrained DFO at WoMBaT and AustMS 2021.
  • (Nov-21) I am lead organizer for the Workshop on the Intersections of Computation and Optimisation on 22-25 November at ANU. This is the first such workshop, a new initiative of the AustMS special interest group MoCaO.
  • (Nov-21) New paper out, Model-Based Derivative-Free Methods for Convex-Constrained Optimization with Matthew Hough (University of Waterloo), where we show how to use interpolation in general convex sets to do model-based derivative-free optimization. Read it here, and download the new version 1.3 of DFO-LS, which can now solve problems with general convex constraints!
  • (Oct-21) I will be speaking about large-scale DFO at the INFORMS Annual Meeting, as well as organising two sessions on DFO.
  • (Oct-21) I will be speaking about bilevel learning at the Machine Intelligence and Learning Systems seminar at Université Paris Dauphine-PSL.

News archive



Interests

  • Derivative-Free Optimization
  • Nonconvex Optimization
  • Numerical Analysis
  • Data Science


Education

  • DPhil in Mathematics, 2019

    University of Oxford

  • Bachelor of Computational Science (Honours), 2011

    Australian National University



Optimization—finding the maximum or minimum of a function—is one of the most important classes of problems in computational mathematics, arising frequently in scientific and industrial applications. My focus is on nonlinear optimization, where the function to be optimized (the ‘objective’ function) is some nonlinear, possibly nonconvex function, usually with little known structure.

Nonlinear optimization resources

Together with Coralia Cartis and Jaroslav Fowkes (University of Oxford), I maintain a page of resources for nonlinear optimization, including a collection of software and test problems.

Derivative-Free Optimization (DFO)

Generally speaking, to optimize a nonlinear objective, you approximate it locally by some simpler function (such as a low-order Taylor series). To construct this simpler function, you need to evaluate the objective and its derivatives at some set of points. Evaluating the derivative of the objective can be done in several ways:

  • If you know the analytic form of the objective, you can compute its derivatives using calculus.
  • If you have access to computer code for evaluating the objective, automatic differentiation could be used to compute analytic derivatives.
  • Otherwise, you have to approximate the derivative, e.g. using finite differencing.
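As an illustration of the last option, the forward-difference approximation can be sketched as follows (a minimal example for illustration only, not taken from any particular package):

```python
import numpy as np

def forward_difference_gradient(f, x, h=1e-8):
    """Approximate the gradient of f at x using forward differences.

    Each partial derivative is estimated as (f(x + h*e_i) - f(x)) / h,
    which costs n+1 evaluations of f for an n-dimensional x.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    grad = np.zeros_like(x)
    for i in range(x.size):
        x_step = x.copy()
        x_step[i] += h
        grad[i] = (f(x_step) - fx) / h
    return grad

# Example: f(x) = x_1^2 + 3*x_2 has exact gradient (2*x_1, 3)
f = lambda x: x[0] ** 2 + 3 * x[1]
g = forward_difference_gradient(f, [1.0, 2.0])  # approximately [2.0, 3.0]
```

Note that the choice of step size h matters: too large gives a poor Taylor approximation, while too small amplifies rounding (or measurement) errors in the evaluations of f — which is exactly why this approach struggles on noisy objectives.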

However, if the objective is a black box, expensive to evaluate, or noisy (e.g. it is the output of a Monte Carlo simulation, or involves the finite termination of an iterative procedure), these approaches may be impractical or inaccurate. Derivative-free optimization (DFO) is the field devoted to nonlinear optimization of objectives when you only have access to (possibly inaccurate) evaluations of the objective.

My research

I develop and study model-based DFO algorithms. This is a class of DFO methods which incorporates features of derivative-based methods by building local approximations to the objective (e.g. by polynomial interpolation). I have developed several algorithms and software packages for solving least-squares problems with DFO methods, as well as techniques which make model-based DFO more robust to noise, better able to escape local minima, and able to tackle large-scale problems via dimensionality reduction.
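To give a rough flavour of the interpolation idea (a simplified sketch for illustration, not the algorithm in any of the packages below): given evaluations of the objective at a set of sample points, one can fit a local linear model by solving a small linear system, and then use the model’s gradient in place of the unavailable true derivative.

```python
import numpy as np

def linear_interpolation_model(f, x0, delta=0.1):
    """Fit a linear model m(s) = f(x0) + g @ s that interpolates f
    at x0 and at x0 + delta*e_i for each coordinate direction e_i.

    With this axis-aligned sample set the system reduces to forward
    differences, but the same interpolation-conditions viewpoint
    extends to general sample sets and to quadratic models, as used
    in model-based DFO methods.
    """
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    f0 = f(x0)
    # Interpolation conditions: m(y_i) = f(x0 + y_i) for each sample
    # displacement y_i (here the rows of Y).
    Y = delta * np.eye(n)
    rhs = np.array([f(x0 + Y[i]) - f0 for i in range(n)])
    g = np.linalg.solve(Y, rhs)  # model gradient
    return f0, g

# Example: f(x) = x_1^2 + x_2^2 near x0 = (1, 1); the model gradient
# (2.1, 2.1) is within O(delta) of the true gradient (2, 2)
f = lambda x: x[0] ** 2 + x[1] ** 2
f0, g = linear_interpolation_model(f, [1.0, 1.0])
```

A trust-region method would then minimize this model within a ball around x0, evaluate the objective at the resulting point, and update the sample set and model — all without ever computing a derivative of f.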


Experience

[2022 – present] Lecturer, University of Sydney

[2019 – 2022] MSI Fellow, Australian National University

[2015 – 2019] Doctoral Student, University of Oxford

[2012 – 2015] Senior Analyst, Macquarie Group

  • Worked in the Quantitative Applications Division of the Risk Management Group. Responsible for implementing risk management models (particularly for market risk) and reviewing pricing models.
  • Macquarie Group is Australia’s largest investment bank, headquartered in Sydney.


Github | Click on each package for more details. Other useful packages not developed by me are listed on the nonlinear optimization resources webpage.


DFBGN: Derivative-free block Gauss-Newton method (for large-scale nonlinear least-squares problems)


trustregion: Python routines for solving trust-region subproblems


PyCUTEst: Python interface to the CUTEst optimization testing package


DFO-LS: Derivative-free solver for nonlinear least-squares problems


Py-BOBYQA: General-purpose derivative-free optimization solver


DFO-GN: Derivative-Free Gauss-Newton solver


University of Sydney

Australian National University

University of Oxford


  • 2018, Professional Development Framework Supporting Learning Award, Staff and Educational Development Association — this is a teaching qualification accredited by the UK professional association for higher education (more details here), assessed by the University of Oxford.