This documentation is a work in progress: feel free to help us update, improve, and restructure it!
``nevergrad`` is a Python 3.6+ library. It can be installed with:

.. code-block:: bash

    pip install nevergrad
You can find other installation options (including for Windows users) in the :ref:`Getting started section <getting_started>`.
Feel free to join the Nevergrad users Facebook group.
Minimizing a function using an optimizer (here ``OnePlusOne``) can easily be run with:

.. literalinclude:: ../nevergrad/optimization/test_doc.py
    :language: python
    :dedent: 4
    :start-after: DOC_SIMPLEST_0
    :end-before: DOC_SIMPLEST_1
*Convergence of a population of points to the minima with two-points DE.*
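For intuition, here is a minimal plain-Python sketch of the (1+1) idea — this is an illustration, not nevergrad's actual implementation. A single parent is repeatedly mutated with Gaussian noise, replaced whenever the child is at least as good, and the step size adapts to successes and failures:

```python
import random

def one_plus_one(f, dim, budget, sigma=1.0, seed=0):
    """Sketch of a (1+1) evolution strategy (illustrative only)."""
    rng = random.Random(seed)
    best = [0.0] * dim          # start from the origin
    best_val = f(best)
    for _ in range(budget):
        # mutate the single parent with Gaussian noise
        candidate = [x + rng.gauss(0, sigma) for x in best]
        val = f(candidate)
        if val <= best_val:     # keep the child if it is at least as good
            best, best_val = candidate, val
            sigma *= 1.1        # widen the search after a success
        else:
            sigma *= 0.9        # narrow it after a failure
    return best

def square(x):
    return sum((xi - 0.5) ** 2 for xi in x)

result = one_plus_one(square, dim=2, budget=500)
```

Since the parent is only ever replaced by an equal-or-better child, the best value decreases monotonically toward the minimum at ``(0.5, 0.5)``.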
``nevergrad`` also supports bounded continuous variables, discrete variables, and mixtures of those. To do this, one can specify the input space:
.. literalinclude:: ../nevergrad/parametrization/test_param_doc.py
    :language: python
    :dedent: 4
    :start-after: DOC_README_0
    :end-before: DOC_README_1
Learn more about parametrization in the :ref:`Parametrization section <parametrizing>`!
.. toctree::
    :maxdepth: 3
    :caption: Contents

    getting_started.rst
    optimization.rst
    parametrization.rst
    machinelearning.rst
    optimizers_ref.rst
    parametrization_ref.rst
    benchmarking.rst
    benchmarks.rst
    pyomo.rst
    windows.md
    contributing.rst
    opencompetition2020.md
Nevergrad can be cited with the following BibTeX entry:

.. code-block:: bibtex

    @misc{nevergrad,
        author = {J. Rapin and O. Teytaud},
        title = {{Nevergrad - A gradient-free optimization platform}},
        year = {2018},
        publisher = {GitHub},
        journal = {GitHub repository},
        howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
    }
``nevergrad`` is released under the MIT license. See LICENSE for additional details, as well as our Terms of Use and Privacy Policy.