No more gradients for your ML models with Nevergrad, an open source Python toolkit from Facebook
Gradient-free optimization used to rely on custom implementations. Now Facebook has open sourced Nevergrad, a Python3 toolkit for derivative-free optimization of machine learning parameters and hyperparameters. Optimize your models faster than ever with these tested algorithms!
Fine-tuning machine learning models takes a lot of trial and error. Now, thanks to a recently open sourced toolkit from Facebook, developers can rely on derivative-free optimization to tune their parameters and hyperparameters.
Nevergrad is a Python3 library stocked full of algorithms that allow developers to avoid gradient computation for machine learning model optimization. Presented in a standard ask-and-tell Python framework, these algorithms make optimizing models faster and easier than ever for developers.
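The ask-and-tell pattern can be sketched in plain Python. The toy optimizer below is a hypothetical (1+1)-style hill climber written for illustration; the class and method names are not Nevergrad's actual API, only the ask/tell shape of the interaction is the point.

```python
import random

class OnePlusOneSketch:
    """Toy (1+1)-style optimizer exposing an ask-and-tell interface."""

    def __init__(self, dimension, sigma=1.0):
        self.best = [0.0] * dimension   # current best point
        self.best_loss = float("inf")
        self.sigma = sigma              # mutation step size

    def ask(self):
        # Propose a candidate by mutating the current best point.
        return [x + random.gauss(0.0, self.sigma) for x in self.best]

    def tell(self, candidate, loss):
        # Keep the candidate if it improves on the best loss seen so far.
        if loss < self.best_loss:
            self.best, self.best_loss = candidate, loss

def square(x):
    # A simple test function: no gradient is ever computed.
    return sum(xi ** 2 for xi in x)

random.seed(0)
opt = OnePlusOneSketch(dimension=2)
for _ in range(500):
    candidate = opt.ask()
    opt.tell(candidate, square(candidate))
```

The loop only ever evaluates the objective, which is why the same pattern works for any black-box function.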
Gradient-free optimization is crucial for machine learning tasks where estimating the gradient is impractical, such as when the domain is non-continuous or the function f is slow to compute. Gradient-free use cases for machine learning include power grid management, aeronautics, lens design, and more.
Below is an example of how an evolutionary algorithm works. Various points are sampled in the function space, a population of best points is selected, and then new points are proposed around the existing points to try to improve the current population.
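The sample-select-mutate loop described above can be sketched as a minimal evolutionary algorithm. Everything here (population size, mutation scale, the test function) is a hypothetical choice for illustration, not Nevergrad's implementation.

```python
import random

def evolve(f, dimension, population=20, elites=5, generations=50, sigma=0.5):
    """Minimal evolutionary loop: sample, select the best, mutate around them."""
    # Start from random points in the function space.
    points = [[random.uniform(-5, 5) for _ in range(dimension)]
              for _ in range(population)]
    for _ in range(generations):
        # Keep the best-scoring points as the parent population.
        parents = sorted(points, key=f)[:elites]
        # Propose new points around the parents to improve the population.
        points = [[x + random.gauss(0.0, sigma) for x in random.choice(parents)]
                  for _ in range(population)]
    return min(points, key=f)

random.seed(1)
best = evolve(lambda x: sum(xi ** 2 for xi in x), dimension=2)
```

On a smooth test function like the sphere above, the population drifts toward the minimum within a few dozen generations.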
Nevergrad comes with standard testing and evaluation tools. This Python library makes it easy to compare results from different implementations, thanks to a number of standardized optimizers. These different methods include:
- Differential evolution.
- Sequential quadratic programming.
- Covariance matrix adaptation.
- Population control methods for noise management.
- Particle swarm optimization.
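As one concrete example from the list, the core mutate-crossover-select step of differential evolution (the DE/rand/1/bin variant) can be sketched as follows; all parameter values are illustrative.

```python
import random

def differential_evolution(f, dim, pop_size=20, F=0.8, CR=0.9, generations=60):
    """Minimal DE/rand/1/bin: scaled difference vectors, binomial crossover."""
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than the current one.
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Mutation a + F*(b - c), mixed with the current point by crossover.
            j_rand = random.randrange(dim)
            trial = [a[j] + F * (b[j] - c[j])
                     if (random.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            # Greedy selection: keep the trial if it scores at least as well.
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

random.seed(2)
best = differential_evolution(lambda x: sum(xi ** 2 for xi in x), dim=2)
```

Note that only function evaluations drive the search; the difference vectors play the role a gradient would in gradient-based methods.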
Out-of-the-box features include gradient/derivative-free optimization algorithms, including algorithms able to handle noise. There are also tools to instrument any code, so developers can optimize their parameters or hyperparameters without worrying about whether variables are continuous or discrete. Test functions exercise the optimization algorithms, and benchmarking routines make it easy to compare algorithms.
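The idea behind instrumenting mixed variables can be sketched by mapping a plain continuous vector onto a function's discrete and continuous arguments. The helper and toy loss below are hypothetical illustrations, not Nevergrad's instrumentation module.

```python
import math
import random

CHOICES = ["sgd", "adam", "rmsprop"]  # hypothetical discrete hyperparameter

def instrumented(z):
    """Map a continuous vector z onto (discrete choice, learning rate)."""
    # Discrete variable: the largest of the first len(CHOICES) entries wins,
    # a softmax-style encoding of a categorical choice.
    idx = max(range(len(CHOICES)), key=lambda i: z[i])
    # Continuous variable: squash the last entry into a positive learning rate.
    return CHOICES[idx], math.exp(z[-1])

def loss(z):
    # Toy loss: pretend "adam" with a learning rate near 0.001 is best.
    choice, lr = instrumented(z)
    penalty = 0.0 if choice == "adam" else 1.0
    return penalty + (math.log(lr) - math.log(0.001)) ** 2

# Random search over the continuous representation handles both kinds of
# variables at once -- no gradients required.
random.seed(3)
best_z = min(([random.gauss(0, 3) for _ in range(4)] for _ in range(2000)),
             key=loss)
best_choice, best_lr = instrumented(best_z)
```

Any gradient-free optimizer can then search the continuous representation directly, with the mapping handling the discrete side.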
Other features include:
- optimization: implements the optimization algorithms.
- instrumentation: tooling to convert code into a well-defined function to optimize.
- functions: implements both simple and complex benchmark functions.
- benchmark: runs experiments comparing the algorithms on benchmark functions.
- common: a set of tools used throughout the package.
The initial release for Nevergrad has a rather basic artificial test function, but more functions are on the way (including some functions that represent physical models).
Want to try it out? Nevergrad is available for free on GitHub now! It can be installed via pip or by cloning the repository; note that Python3 is required. For more information, head over to the detailed announcement page!