I am a lecturer and ARC DECRA Fellow at the School of Mathematics and Statistics, University of Sydney. My research interests are in numerical analysis and data science, particularly nonconvex and derivative-free optimization.
Details of my CV, publications, talks and software are below (or see my Google Scholar and GitHub profiles). For general optimization resources, see below or my nonlinear optimization resources page.
Qualifications
DPhil in Mathematics, 2019 (University of Oxford)
Bachelor of Computational Science (Honours), 2011 (Australian National University)
Optimization—finding the maximum or minimum of a function—is one of the most important classes of problem in computational mathematics, arising often in scientific and industrial applications. My focus is on nonlinear optimization, where the function to be optimized (the ‘objective’ function) is some nonlinear, possibly nonconvex function usually with little known structure.
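In symbols, the generic problem can be written as follows (stated for minimization, since maximizing f is the same as minimizing -f):

```latex
\min_{x \in \mathbb{R}^n} f(x)
```

where the objective f maps n input variables to a single real value.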
Nonlinear optimization resources
Together with Coralia Cartis and Jaroslav Fowkes (University of Oxford), I maintain a page of resources for nonlinear optimization, including a collection of software and test problems.
Generally speaking, to optimize a nonlinear objective, you approximate it locally by some simpler function (such as a low-order Taylor series). To construct this simpler function, you need to evaluate the objective and its derivatives at some set of points. Evaluating the derivative of the objective can be done in several ways:
- Analytically, by hand or using symbolic computation;
- By algorithmic (automatic) differentiation of the code that evaluates the objective; or
- Approximately, by finite differencing of objective values.
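As a concrete illustration of the last approach, here is a minimal sketch (in Python; the test function and step size are illustrative only) of estimating a gradient by forward differences, at the cost of one extra objective evaluation per coordinate:

```python
import numpy as np

def fd_gradient(f, x, h=1e-7):
    """Forward-difference estimate of the gradient of f at x.

    Costs n extra evaluations of f for x in R^n; the truncation error
    is O(h), but any noise in f is amplified by a factor of 1/h.
    """
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# Example: f(x) = x0^2 + 3*x1, whose true gradient at (1, 1) is (2, 3)
f = lambda x: x[0]**2 + 3.0 * x[1]
print(fd_gradient(f, np.array([1.0, 1.0])))  # approximately [2., 3.]
```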
However, if the objective is a black box, expensive to evaluate, or noisy (e.g. the output of a Monte Carlo simulation, or of an iterative procedure terminated after finitely many steps), these approaches may be impractical or inaccurate. Derivative-free optimization (DFO) is the field devoted to nonlinear optimization when you only have access to (possibly inaccurate) evaluations of the objective.
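To see why noise is a problem for finite differencing in particular, consider this small sketch (the quadratic objective and noise level are hypothetical): the forward difference divides the evaluation noise by the step size h, so even tiny noise can swamp the true derivative.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=1e-4):
    # Smooth quadratic plus Monte Carlo-style evaluation noise
    return float(np.sum(x**2)) + sigma * rng.standard_normal()

x, h = np.ones(2), 1e-7
e0 = np.array([h, 0.0])
# The true partial derivative df/dx0 at x = (1, 1) is 2.0, but the noise
# contributes on the order of sigma/h = 1e3 to the estimate below.
print((noisy_f(x + e0) - noisy_f(x)) / h)
```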
My research
I develop and study model-based DFO algorithms. This is a class of DFO methods which tries to incorporate features of derivative-based methods, ultimately by building local approximations to the objective (e.g. by polynomial interpolation). I have developed several algorithms and software packages for solving least-squares problems with DFO methods, and developed techniques which make model-based DFO more robust to noise, better able to escape local minima, and able to tackle large-scale problems via dimensionality reduction.
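For concreteness, here is a minimal sketch of the model-based trust-region idea (a generic toy version, not any particular published algorithm): interpolate a linear model of the objective through n+1 sample points, minimize it within a trust region, and grow or shrink the region according to how well the model predicted the actual decrease.

```python
import numpy as np

def model_based_dfo(f, x0, delta=0.5, max_iters=100, tol=1e-8):
    """Toy model-based DFO method using derivative-free trust regions.

    Builds a linear interpolation model from n+1 points (the current
    iterate plus coordinate perturbations of size delta), steps to the
    trust-region boundary along the model's steepest descent direction,
    and updates delta based on the agreement between the predicted and
    actual reduction in f.
    """
    x, fx = np.asarray(x0, dtype=float), f(x0)
    n = x.size
    for _ in range(max_iters):
        # Interpolation set: rows are x + delta * e_i
        Y = x + delta * np.eye(n)
        fY = np.array([f(y) for y in Y])
        g = (fY - fx) / delta               # model gradient
        gnorm = np.linalg.norm(g)
        if gnorm < tol or delta < tol:
            break
        s = -(delta / gnorm) * g            # step to trust-region boundary
        f_new = f(x + s)
        predicted = delta * gnorm           # model reduction: -g.T @ s
        rho = (fx - f_new) / predicted      # actual vs predicted reduction
        if rho > 0.1:                       # successful step: accept
            x, fx = x + s, f_new
            if rho > 0.7:
                delta *= 2.0                # very successful: expand region
        else:
            delta *= 0.5                    # unsuccessful: shrink region
    return x, fx

# Example: minimize the (nonconvex) Rosenbrock function
rosen = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
print(model_based_dfo(rosen, np.array([-1.2, 1.0])))
```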
[2022 – present] Lecturer, University of Sydney
[2019 – 2022] MSI Fellow, Australian National University
[2015 – 2019] Doctoral Student, University of Oxford
[2012 – 2015] Senior Analyst, Macquarie Group
All code is available on my GitHub; click on each package for more details. Other useful packages not developed by me are listed on the nonlinear optimization resources webpage.