# News Archive

- (Jan-23) New paper *Analyzing Inexact Hypergradients for Bilevel Learning* with Matthias Ehrhardt (University of Bath). We look at how to efficiently compute the gradients for bilevel learning using both classical and modern techniques.
- (Dec-22) I will be talking about efficient hypergradient evaluation for bilevel optimisation at the AustMS Annual Meeting.
- (Nov-22) I will be giving an overview of black-box optimisation techniques at the Biarri Applied Mathematics Conference. My slides are available here.
- (Nov-22) New paper *On the selection of the weighting parameter value in optimizing Eucalyptus globulus pulp yield models based on NIR spectra* with Yi Zhen (University of Melbourne), and Tu Ho, Laurence Schimleck and Arijit Sinha (Oregon State University) has been accepted by Wood Science and Technology. We study how to select NIR wavelengths for predicting wood yield without overfitting.
- (Nov-22) New paper *Optimizing illumination patterns for classical ghost imaging* with Andrew Kingston and Alaleh Aminzadeh (ANU), Daniele Pelliccia (Instruments and Data Tools Pty Ltd), and Imants Svalbe and David Paganin (Monash University). We give advice on how to choose and fabricate masks for classical ghost imaging, or, for mathematicians, how to select linear operators to accurately measure objects with linear least-squares regression under practical experimental setups.
- (Oct-22) New paper *PyCUTEst: an open source Python package of optimization test problems* with Jaroslav Fowkes (Rutherford Appleton Laboratory) and Árpád Bűrmen (University of Ljubljana) has been accepted by the Journal of Open Source Software. This is a short summary paper outlining the PyCUTEst software package.
- (Sep-22) I will be speaking about large-scale DFO at the Curtin Centre for Optimisation and Decision Science Colloquium.
- (Aug-22) New paper *A Simplified Convergence Theory for Byzantine Resilient Stochastic Gradient Descent* with Edward Smyth (Australian National University) has been accepted by the EURO Journal on Computational Optimization. We significantly simplify existing theory for distributed SGD in the presence of adversarial nodes.
- (Aug-22) Our paper *Model-Based Derivative-Free Methods for Convex-Constrained Optimization* with Matthew Hough (University of Waterloo) has been accepted by the SIAM Journal on Optimization.
- (Jul-22) I will be speaking about large-scale DFO at the ARC OPTIMA training centre seminar series. Update: a recording of the talk is available on YouTube.
- (Jul-22) I am excited to have started a new position as a lecturer in the School of Mathematics and Statistics at the University of Sydney.
- (Jun-22) Our paper *Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization* with Coralia Cartis (University of Oxford) has been published in Mathematical Programming.
- (Apr-22) New paper out, *Direct search based on probabilistic descent in reduced spaces* with Clément Royer (Université Paris Dauphine-PSL), showing how using randomized subspaces can make direct search methods more efficient at scale.
- (Feb-22) Some thoughts I have about giving research seminars are now available on the ANU MSI website. They are based on Pierre Portal’s and my experiences and are aimed at students in any area of mathematics. Of course, they are just our opinions and shouldn’t be taken as definitive!
- (Dec-21) Our paper *Does Model Calibration Reduce Uncertainty in Climate Projections?*, led by Simon Tett (University of Edinburgh), has been accepted by the Journal of Climate. If this isn’t your area, perhaps my brief summary for mathematicians might help.
- (Dec-21) I will be speaking about convex-constrained DFO at WoMBaT and AustMS 2021.
- (Nov-21) I am lead organizer for the Workshop on the Intersections of Computation and Optimisation on 22-25 November at ANU. This is the first such workshop, a new initiative of the AustMS special interest group MoCaO.
- (Nov-21) New paper out, *Model-Based Derivative-Free Methods for Convex-Constrained Optimization* with Matthew Hough (University of Waterloo), where we show how to use interpolation in general convex sets to do model-based derivative-free optimization. You can now download the new version 1.3 of DFO-LS, which can solve problems with general convex constraints!
- (Oct-21) I will be speaking about large-scale DFO at the INFORMS Annual Meeting, as well as organising two sessions on DFO.
- (Oct-21) I will be speaking about bilevel learning at the Machine Intelligence and Learning Systems seminar at Université Paris Dauphine-PSL.
- (Sep-21) I will be speaking about large-scale DFO at the University of Leicester’s CSE Mathematics Seminar.
- (Jul-21) I will be speaking about large-scale DFO at EUROPT, EURO and SIOPT.
- (Jul-21) I was interviewed by Channel 9 National News about projections for Australia’s COVID-19 vaccine rollout.
- (Jun-21) Delighted to be announced as the first prize winner of the 20th IMA Leslie Fox Prize for Numerical Analysis for my recent paper on scalable DFO! You can watch my talk on YouTube.
- (Jun-21) New paper submitted, *Does Model Calibration Reduce Uncertainty in Climate Projections?*, led by Simon Tett (University of Edinburgh). The study shows that performing structured parameter tuning of climate models helps to significantly reduce the uncertainties in their predictions. It also shows that my DFO-LS code is an effective solver for parameter fitting of climate models. Preprint coming soon!
- (May-21) Very happy to be shortlisted for the IMA Leslie Fox Prize for Numerical Analysis for my recent paper on scalable DFO. I will be talking about my work at the Fox Prize event on 21 June.
- (Apr-21) Excited to be interviewed by Channel 9 National News about the progress of Australia’s COVID-19 vaccine rollout on 1 April and 16 April.
- (Mar-21) I will be speaking about bilevel learning at SIAM CSE.
- (Feb-21) New paper *Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization* with Coralia Cartis (University of Oxford). This introduces a general framework for derivative-free optimization in random subspaces and specializes it to nonlinear least-squares problems (with an efficient implementation).
- (Jan-21) My paper *Escaping local minima with local derivative-free methods* with Coralia Cartis and Oliver Sheridan-Methven (University of Oxford) has been accepted by *Optimization*.
- (Dec-20) My paper *Inexact Derivative-Free Optimization for Bilevel Learning* with Matthias Ehrhardt (University of Bath) has been accepted by the *Journal of Mathematical Imaging and Vision*.
- (Dec-20) I will be speaking about large-scale DFO methods at both WoMBaT and the optimization stream of AustMS.
- (Nov-20) New paper *Efficient Hyperparameter Tuning with Dynamic Accuracy Derivative-Free Optimization* with Matthias Ehrhardt (University of Bath) has been accepted for the OPT2020 workshop at NeurIPS 2020. We introduce an efficient hyperparameter tuning algorithm with convergence guarantees.
- (Oct-20) I’m delighted to receive the Reddick Prize from the InFoMM CDT at the University of Oxford for my doctoral research! Read more here.
- (Oct-20) New software package DFBGN is now available. It solves large-scale nonlinear least-squares problems without derivatives.
- (Jun-20) New paper *Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems* with Coralia Cartis and Tyler Ferguson (Oxford) has been accepted for the ICML workshop Beyond First-Order Methods in ML Systems.
- (Jun-20) New paper *Inexact Derivative-Free Optimization for Bilevel Learning* with Matthias Ehrhardt (University of Bath)! We introduce a new algorithm for learning variational regularization parameters, applicable to problems such as image denoising and MRI reconstruction.
- (May-20) My nonlinear optimization resources page is now public. This has been built over several years with Coralia Cartis (University of Oxford) and Jaroslav Fowkes (STFC Rutherford Appleton Laboratory).
- (Apr-20) I will be (virtually) presenting at the UNSW Applied Mathematics Seminar. It will be recorded, so email me if you want to watch the recording.
- (Feb-20) My paper *A derivative-free Gauss-Newton method* has been awarded the best paper of 2019 for the journal *Mathematical Programming Computation*!
- (Jan-20) I will be attending the Mathematics in Industry Study Group at the University of Newcastle.
- (Dec-19) I will be presenting at the first Data Science Down Under workshop at the University of Newcastle.
- (Nov-19) I will be an invited speaker at ANU’s Uncertainty Quantification workshop.