Many standard optimization algorithms require the ability to cheaply and accurately compute derivatives of the objective and/or …

When optimizing functions which are computationally expensive and/or noisy, gradient information is often impractical to obtain or …

When variational regularisation methods are used to solve inverse problems, they suffer from the drawback of having potentially many …

In existing techniques for model-based derivative-free optimization, the computational cost of constructing local models and Lagrange …

Derivative-free optimisation (DFO) methods are an important class of optimisation routines with applications in areas such as image …

Least-squares problems (such as parameter estimation) are ubiquitous across quantitative disciplines. Optimisation algorithms for …

Derivative-free optimization (DFO) methods are an important class of optimization routines for many problems in data science, such as …

In classical nonlinear optimisation, the availability of first-order information is crucial to constructing accurate local models for …

Classical nonconvex optimisation algorithms require the availability of gradient evaluations for constructing local approximations to …

Classical nonlinear optimisation algorithms require the availability of gradient evaluations for constructing local approximations to …

We present DFO-LS, a software package for derivative-free optimization (DFO) for nonlinear least-squares problems, which has simplified …

We present DFO-GN, a derivative-free version of the Gauss-Newton method for solving nonlinear least-squares problems. As is common in …
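The abstract above is truncated, but the Gauss-Newton structure it refers to can be illustrated. Note that DFO-GN itself builds linear interpolation models of the residual vector inside a trust region; the sketch below instead approximates the Jacobian by forward differences, on a hypothetical two-parameter exponential-fitting problem with made-up data. It is an illustration of the Gauss-Newton iteration for nonlinear least squares only, not the DFO-GN algorithm.

```python
import math

# Hypothetical fitting problem: fit y = a * exp(b * t) to synthetic data
# (illustrative data, not taken from the paper).
DATA = [(0.0, 1.0), (1.0, 2.0), (2.0, 4.1)]

def residuals(x):
    a, b = x
    return [a * math.exp(b * t) - y for t, y in DATA]

def fd_jacobian(f, x, h=1e-7):
    """Forward-difference Jacobian estimate (illustration only: DFO-GN
    builds interpolation models rather than using finite differences)."""
    r0 = f(x)
    cols = []
    for j in range(len(x)):
        xp = list(x)
        xp[j] += h
        cols.append([(ri - r0i) / h for ri, r0i in zip(f(xp), r0)])
    rows = [list(row) for row in zip(*cols)]  # one row per residual
    return rows, r0

def gauss_newton_step(f, x):
    """One Gauss-Newton step for a 2-parameter problem:
    solve the normal equations (J^T J) d = -J^T r explicitly."""
    J, r = fd_jacobian(f, x)
    A = [[sum(Ji[p] * Ji[q] for Ji in J) for q in range(2)] for p in range(2)]
    g = [sum(Ji[p] * ri for Ji, ri in zip(J, r)) for p in range(2)]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    d0 = (-g[0] * A[1][1] + g[1] * A[0][1]) / det
    d1 = (g[0] * A[1][0] - g[1] * A[0][0]) / det
    return [x[0] + d0, x[1] + d1]

x = [1.0, 0.5]
for _ in range(10):
    x = gauss_newton_step(residuals, x)
# x ends up near a ≈ 1, b ≈ 0.7 for this data
```

Because the residuals at the solution are small, the Gauss-Newton approximation to the Hessian is accurate here and the iteration converges rapidly; this fast local convergence on small-residual problems is the property a derivative-free Gauss-Newton method aims to retain without derivatives.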

Derivative-free optimisation (DFO) algorithms are a category of optimisation methods for situations where one is unable to compute or …
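As a minimal concrete illustration of this setting (not any specific method from the abstracts above), a basic compass search minimises a function using nothing but evaluations of f:

```python
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimal compass (coordinate) search: probe +/- step along each
    coordinate, move to any improving point, otherwise halve the step.
    Uses only function values, never derivatives."""
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for j in range(len(x)):
            for s in (step, -step):
                y = list(x)
                y[j] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step /= 2.0
    return x, fx

# Example: minimise a smooth quadratic from function values alone.
xmin, fmin = compass_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                            [0.0, 0.0])
```

Direct-search methods like this need no derivative information at all, at the cost of many more evaluations than gradient-based methods; model-based DFO methods (such as those described above) reduce that cost by building and minimising local surrogate models of f.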