We present DFO-LS, a software package for derivative-free optimization (DFO) of nonlinear least-squares problems, which features simplified model construction, flexible initialization, and improved robustness to noise. Inspired by the Gauss-Newton method, DFO-LS builds simplified linear regression models for the residuals. DFO-LS also offers improved flexibility for expensive problems: it can begin making progress after as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for a single gradient evaluation. For noisy problems, DFO-LS provides a wide variety of sample-averaging methodologies, the construction of highly overdetermined regression models, and restart strategies. Our extensive numerical experiments show that restarting the solver when stagnation is detected is a cheap and effective mechanism for achieving robustness, outperforming the sample-averaging and regression alternatives. We also discuss our package Py-BOBYQA, a Python implementation of BOBYQA (Powell, 2009) that implements some of these features for general objective problems.