News Archive
- (Oct-24) New paper High-resolution x-ray scanning with a diffuse, Huffman-patterned probe to minimise radiation damage with Alaleh Aminzadeh and Andrew Kingston (ANU), and David Paganin, Tim Petersen and Imants Svalbe (Monash University). We demonstrate how to computationally construct aperiodic masks for ghost imaging and show their practical experimental performance.
- (Sep-24) I will be talking about randomized subspace derivative-free optimization algorithms at the UNSW Applied Mathematics Seminar.
- (Aug-24) Our paper Expected decrease for derivative-free algorithms using random subspaces with Warren Hare (University of British Columbia) and Clément Royer (Université Paris Dauphine-PSL) has been accepted by Mathematics of Computation.
- (Jul-24) New paper Black-box Optimization Algorithms for Regularized Least-squares Problems with Yanjun Liu (Princeton University) and Kevin Lam (Australian National University). We introduce a DFO method for nonlinear least-squares problems with nonsmooth regularizers.
- (Jun-24) I will be visiting Clément Royer at the Université Paris Dauphine-PSL and then attending the 2nd Derivative-Free Optimization Symposium at the University of Padova.
- (May-24) I will be hosting Hung Phan (University of Massachusetts Lowell) as part of SMRI’s International Visitor Program.
- (Apr-24) A significantly revised version of an earlier preprint, now titled An adaptively inexact first-order method for bilevel optimization with application to hyperparameter learning, is available. This is joint work with Mohammad Sadegh Salehi, Matthias Ehrhardt (University of Bath) and Subhadip Mukherjee (IIT Kharagpur).
- (Mar-24) New paper Model Construction for Convex-Constrained Derivative-Free Optimization. This develops an approximation theory for quadratic interpolation in general convex-constrained sets, extending earlier work on linear interpolation.
- (Feb-24) Hosted Shane Henderson (Cornell University) as part of SMRI’s International Visitor Program.
- (Feb-24) Our paper Non-Uniform Smoothness for Gradient Descent with Albert Berahas (University of Michigan) and Fred Roosta (University of Queensland) has been accepted by Transactions on Machine Learning Research.
- (Feb-24) I will be speaking about the paper Dynamic Bilevel Learning with Inexact Line Search as an invited speaker at the first SigmaOpt workshop.
- (Jan-24) Happy to receive a CNRS International Emerging Actions grant with Clément Royer (Université Paris Dauphine-PSL) to work on random subspace methods for DFO.
- (Dec-23) I am a co-organiser of the joint WoMBaT/WICO workshops on optimisation and computational maths, 11-15 December at the University of Sydney. I am also speaking about expected decrease analysis for random subspace methods.
- (Nov-23) New paper Non-Uniform Smoothness for Gradient Descent with Albert Berahas (University of Michigan) and Fred Roosta (University of Queensland). We introduce a new local first-order smoothness oracle for automatic tuning of stepsizes for gradient descent.
- (Nov-23) Our paper Analyzing Inexact Hypergradients for Bilevel Learning with Matthias Ehrhardt (University of Bath) has been accepted by the IMA Journal of Applied Mathematics.
- (Nov-23) I am on the judging panel for the first SigmaOpt Student Best Paper Prize (for Australian students working in optimization), which will be presented at the SigmaOpt workshop after ANZIAM 2024. See the workshop page for details.
- (Oct-23) I will be an invited speaker at the first SigmaOpt workshop associated with the ANZIAM meeting in February 2024. I am also on the judging panel for the best student paper prize which will be awarded at the workshop.
- (Oct-23) I gave a One School seminar (i.e. school colloquium) for the USyd School of Mathematics and Statistics.
- (Sep-23) I will be speaking about large-scale DFO at the Simons Collaboration on Hidden Symmetries and Fusion Energy.
- (Aug-23) Very excited to receive an ARC Discovery Early Career Researcher Award (2024-2026). See a short project description on the ARC page of all funded projects or the university news item.
- (Aug-23) New paper Dynamic Bilevel Learning with Inexact Line Search with Mohammad Sadegh Salehi, Matthias Ehrhardt (University of Bath) and Subhadip Mukherjee (IIT Kharagpur). We introduce a linesearch algorithm suitable for bilevel learning, based on function and (hyper)gradient evaluations with controllable accuracy.
- (Aug-23) New paper Expected decrease for derivative-free algorithms using random subspaces with Warren Hare (University of British Columbia) and Clément Royer (Université Paris Dauphine-PSL). We study why random subspace methods work best with very low-dimensional subspaces.
- (Aug-23) Our paper Direct search based on probabilistic descent in reduced spaces with Clément Royer (Université Paris Dauphine-PSL) has been accepted by SIAM Journal on Optimization.
- (Jul-23) Our paper Mask design, fabrication, and experimental ghost imaging applications for patterned X-ray illumination with Alaleh Aminzadeh, Benjamin Young and Cheng-I Chiang (ANU), Imants Svalbe and David Paganin (Monash University) and Andrew Kingston (ANU) has been accepted by Optics Express.
- (Jun-23) I will be speaking about hypergradient estimation for bilevel optimization at the SIAM Conference on Optimization in Seattle, and am co-organising a series of minisymposia on derivative-free optimization with Clément Royer (Université Paris Dauphine-PSL), Warren Hare (UBC) and Sébastien Le Digabel (Polytechnique Montréal).
- (May-23) Together with Geordie Williamson (USyd) I spoke to Associated Press about the role of AI in mathematics, featured for example in the SBS News in Depth podcast.
- (May-23) New paper Mask design, fabrication, and experimental ghost imaging applications for patterned X-ray illumination with Alaleh Aminzadeh, Benjamin Young and Cheng-I Chiang (ANU), Imants Svalbe and David Paganin (Monash University) and Andrew Kingston (ANU). This is an experimental follow-up to our previous, more computational ghost imaging paper.
- (Feb-23) Our paper Optimizing illumination patterns for classical ghost imaging with Andrew Kingston and Alaleh Aminzadeh (ANU), Daniele Pelliccia (Instruments and Data Tools Pty Ltd), Imants Svalbe and David Paganin (Monash University) has been accepted by Physical Review A.
- (Jan-23) New paper Analyzing Inexact Hypergradients for Bilevel Learning with Matthias Ehrhardt (University of Bath). We look at how to efficiently compute the gradients for bilevel learning using both classical and modern techniques.
- (Dec-22) I will be talking about efficient hypergradient evaluation for bilevel optimisation at the AustMS Annual Meeting.
- (Nov-22) I will be giving an overview of black-box optimisation techniques at the Biarri Applied Mathematics Conference. My slides are available here.
- (Nov-22) New paper On the selection of the weighting parameter value in optimizing Eucalyptus globulus pulp yield models based on NIR spectra with Yi Zhen (University of Melbourne), and Tu Ho, Laurence Schimleck and Arijit Sinha (Oregon State University) has been accepted by Wood Science and Technology. We study how to select NIR wavelengths for predicting wood yield without overfitting.
- (Nov-22) New paper Optimizing illumination patterns for classical ghost imaging with Andrew Kingston and Alaleh Aminzadeh (ANU), Daniele Pelliccia (Instruments and Data Tools Pty Ltd), Imants Svalbe and David Paganin (Monash University). We give advice on how to choose and fabricate masks for classical ghost imaging, or, for mathematicians, how to select linear operators to accurately measure objects with linear least-squares regression under practical experimental setups.
- (Oct-22) New paper PyCUTEst: an open source Python package of optimization test problems with Jaroslav Fowkes (Rutherford Appleton Laboratory) and Árpád Bűrmen (University of Ljubljana) has been accepted by the Journal of Open Source Software. This is a short summary paper outlining the PyCUTEst software package; a brief usage sketch appears at the end of this page.
- (Sep-22) I will be speaking about large-scale DFO at the Curtin Centre for Optimisation and Decision Science Colloquium.
- (Aug-22) New paper A Simplified Convergence Theory for Byzantine Resilient Stochastic Gradient Descent with Edward Smyth (Australian National University) has been accepted by the EURO Journal on Computational Optimization. We significantly simplify existing theory for distributed SGD in the presence of adversarial nodes.
- (Aug-22) Our paper Model-Based Derivative-Free Methods for Convex-Constrained Optimization with Matthew Hough (University of Waterloo) has been accepted by SIAM Journal on Optimization.
- (Jul-22) I will be speaking about large-scale DFO at the ARC OPTIMA training centre seminar series. Update: a recording of the talk is available on YouTube.
- (Jul-22) I am excited to have started a new position as a lecturer in the School of Mathematics and Statistics at the University of Sydney.
- (Jun-22) Our paper Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization with Coralia Cartis (University of Oxford) has been published in Mathematical Programming.
- (Apr-22) New paper out, Direct search based on probabilistic descent in reduced spaces with Clément Royer (Université Paris Dauphine-PSL), showing how using randomized subspaces can make direct search methods more efficient at scale.
- (Feb-22) Some thoughts I have about giving research seminars are now available on the ANU MSI website. They are based on my own and Pierre Portal's experiences and are aimed at students in any area of mathematics. Of course, they are just our opinions and shouldn't be taken as definitive!
- (Dec-21) Our paper Does Model Calibration Reduce Uncertainty in Climate Projections?, led by Simon Tett (University of Edinburgh) has been accepted to the Journal of Climate. If this isn’t your area, perhaps my brief summary for mathematicians might help.
- (Dec-21) I will be speaking about convex-constrained DFO at WoMBaT and AustMS 2021.
- (Nov-21) I am lead organizer for the Workshop on the Intersections of Computation and Optimisation on 22-25 November at ANU. This is the first such workshop, a new initiative of the AustMS special interest group MoCaO.
- (Nov-21) New paper out, Model-Based Derivative-Free Methods for Convex-Constrained Optimization with Matthew Hough (University of Waterloo), where we show how to use interpolation in general convex sets to do model-based derivative-free optimization. You can now download version 1.3 of DFO-LS, which can solve problems with general convex constraints! A brief usage sketch also appears at the end of this page.
- (Oct-21) I will be speaking about large-scale DFO at the INFORMS Annual Meeting, as well as organising two sessions on DFO.
- (Oct-21) I will be speaking about bilevel learning at the Machine Intelligence and Learning Systems seminar at Université Paris Dauphine-PSL.
- (Sep-21) I will be speaking about large-scale DFO at the University of Leicester’s CSE Mathematics Seminar.
- (Jul-21) I will be speaking about large-scale DFO at EUROPT, EURO and SIOPT.
- (Jul-21) I was interviewed by Channel 9 National News about projections for Australia’s COVID-19 vaccine rollout.
- (Jun-21) Delighted to be announced as the first prize winner of the 20th IMA Leslie Fox Prize for Numerical Analysis for my recent paper on scalable DFO! You can watch my talk on YouTube.
- (Jun-21) New paper submitted, Does Model Calibration Reduce Uncertainty in Climate Projections?, led by Simon Tett (University of Edinburgh). The study shows that performing structured parameter tuning of climate models helps to significantly reduce the uncertainties in their predictions. It also shows that my DFO-LS code is an effective solver for parameter fitting of climate models. Preprint coming soon!
- (May-21) Very happy to be shortlisted for the IMA Leslie Fox Prize for Numerical Analysis for my recent paper on scalable DFO. I will be talking about my work at the Fox Prize event on 21 June.
- (Apr-21) Excited to be interviewed by Channel 9 National News about the progress of Australia’s COVID-19 vaccine rollout on 1 April and 16 April.
- (Mar-21) I will be speaking about bilevel learning at SIAM CSE.
- (Feb-21) New paper Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization with Coralia Cartis (University of Oxford). This introduces a general framework for derivative-free optimization in random subspaces and specializes it to nonlinear least-squares problems (with an efficient implementation).
- (Jan-21) My paper Escaping local minima with local derivative-free methods with Coralia Cartis and Oliver Sheridan-Methven (University of Oxford) has been accepted by Optimization.
- (Dec-20) My paper Inexact Derivative-Free Optimization for Bilevel Learning with Matthias Ehrhardt (University of Bath) has been accepted by the Journal of Mathematical Imaging and Vision.
- (Dec-20) I will be speaking about large-scale DFO methods at both WoMBaT and the optimization stream of AustMS.
- (Nov-20) New paper Efficient Hyperparameter Tuning with Dynamic Accuracy Derivative-Free Optimization with Matthias Ehrhardt (University of Bath) has been accepted for the OPT2020 workshop at NeurIPS 2020. We introduce an efficient hyperparameter tuning algorithm with convergence guarantees.
- (Oct-20) I’m delighted to receive the Reddick Prize from the InFoMM CDT at the University of Oxford for my doctoral research! Read more here.
- (Oct-20) New software package DFBGN is now available. It solves large-scale nonlinear least-squares problems without derivatives.
- (Jun-20) New paper Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems with Coralia Cartis and Tyler Ferguson (Oxford) has been accepted for the ICML workshop Beyond First-Order Methods in ML Systems.
- (Jun-20) New paper Inexact Derivative-Free Optimization for Bilevel Learning with Matthias Ehrhardt (University of Bath)! We introduce a new algorithm for learning variational regularization parameters, applicable to problems such as image denoising and MRI reconstruction.
- (May-20) My nonlinear optimization resources page is now public. This has been built over several years with Coralia Cartis (University of Oxford) and Jaroslav Fowkes (STFC Rutherford Appleton Laboratory).
- (Apr-20) I will be (virtually) presenting at the UNSW Applied Mathematics Seminar. It will be recorded, so email me if you want to watch the recording.
- (Feb-20) My paper A derivative-free Gauss-Newton method has been awarded the 2019 best paper prize by the journal Mathematical Programming Computation!
- (Jan-20) I will be attending the Mathematics in Industry Study Group at the University of Newcastle.
- (Dec-19) I will be presenting at the first Data Science Down Under workshop at the University of Newcastle.
- (Nov-19) I will be an invited speaker at ANU’s Uncertainty Quantification workshop.
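
For readers who want to try the two software packages announced above, here are two short usage sketches. First, a minimal PyCUTEst sketch (see the Oct-22 item). It assumes a working CUTEst installation; the problem name ROSENBR is just an illustrative choice from the standard CUTEst test set, and the package documentation remains the authoritative reference.

```python
import pycutest

# Load a problem from the CUTEst test set (compiled on first use).
p = pycutest.import_problem('ROSENBR')

# Evaluate the objective and its gradient at the standard starting point.
f, g = p.obj(p.x0, gradient=True)
print(p.name, f, g)
```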
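
Second, a minimal sketch of solving a convex-constrained nonlinear least-squares problem with DFO-LS (see the Nov-21 item). The `projections` argument (a list of Euclidean projection operators, one per convex set) reflects my reading of the v1.3 interface for general convex constraints, and the residual function and ball radius are illustrative choices; check the DFO-LS documentation before relying on this.

```python
import numpy as np
import dfols

# Residual vector for the classic Rosenbrock test problem,
# so the least-squares objective is ||r(x)||^2.
def rosenbrock_residuals(x):
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

# Constraint set: the Euclidean ball of radius 0.8 about the origin,
# passed to DFO-LS as its projection operator (assumed v1.3 'projections' API).
def ball_projection(x):
    norm = np.linalg.norm(x)
    return x if norm <= 0.8 else 0.8 * x / norm

x0 = np.array([-1.2, 1.0])
soln = dfols.solve(rosenbrock_residuals, x0, projections=[ball_projection])
print(soln)
```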