
Overflow when setting bounds for NGOpt #1592

@nepfaff

Description


Steps to reproduce

import sys

import nevergrad as ng
import numpy as np

# Bound every coordinate by the largest finite float64 value.
x_array = ng.p.Array(init=np.ones(10)).set_bounds(
    -sys.float_info.max, sys.float_info.max
)
optimizer = ng.optimizers.NGOpt(parametrization=x_array, budget=1000)

def cost(x):
    return np.linalg.norm(x)

optimizer.minimize(cost)

Observed Results

I'm getting the following:

RuntimeWarning: overflow encountered in subtract
return trans.Affine(self.bounds[1] - self.bounds[0], self.bounds[0]).reverted() # type: ignore
UserWarning: Failed to find bounds for Array{(10,),Cl(-179769313486231570814527423731704356798070567525844996598917476803157260780028538760589558632766878171540458953514382464234321326889464182768467546703537516986049910576551282076245490090389328944075868508455133942304583236903222948165808559332123348274797826204144723168738177180919299881250404026184124858368,179769313486231570814527423731704356798070567525844996598917476803157260780028538760589558632766878171540458953514382464234321326889464182768467546703537516986049910576551282076245490090389328944075868508455133942304583236903222948165808559332123348274797826204144723168738177180919299881250404026184124858368,b)}:[1. 1. 1. 1. 1. 1. 1. 1. 1. 1.], quasi-random optimizer may be inefficient.
Please open an issue on Nevergrad github
warnings.warn(
python3.10/site-packages/scipy/optimize/_optimize.py:2989: RuntimeWarning: overflow encountered in scalar multiply
w = xb - ((xb - xc) * tmp2 - (xb - xa) * tmp1) / denom
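
For what it's worth, the first warning appears to come from the bounds transform computing self.bounds[1] - self.bounds[0]: with bounds of +/- sys.float_info.max, that difference is not representable in float64, so numpy overflows to inf. A minimal sketch of just that subtraction, outside nevergrad (my own reconstruction, not nevergrad code):

import sys

import numpy as np

lower = np.full(10, -sys.float_info.max)
upper = np.full(10, sys.float_info.max)

# The range upper - lower exceeds the largest finite float64, so numpy
# emits "RuntimeWarning: overflow encountered in subtract" and returns inf.
print(upper - lower)  # [inf inf ... inf]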

Changing the bounds to something more reasonable, like [-1e10, 1e10], avoids the first two warnings but still results in the following:

python3.10/site-packages/scipy/optimize/_optimize.py:2989: RuntimeWarning: overflow encountered in scalar multiply
w = xb - ((xb - xc) * tmp2 - (xb - xa) * tmp1) / denom

Reducing the budget to 100 gives the following warning instead:

python3.10/site-packages/nevergrad/parametrization/data.py:309: RuntimeWarning: overflow encountered in subtract
return reference._to_reduced_space(self._value - reference._value)

Lastly, using bounds [-np.inf, np.inf] works for some optimizers but not others. For example, NGOpt with a budget of 100 works just fine, but NGOpt with a budget of 1000 returns nan without warning.
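
For reference, this is roughly the variant I'm running (same cost function as in the script above); with budget=100 it completes without issues, while with budget=1000 the recommendation comes back as NaN and no warning is emitted:

# Same setup as the reproduction script, but with infinite bounds.
x_inf = ng.p.Array(init=np.ones(10)).set_bounds(-np.inf, np.inf)
optimizer = ng.optimizers.NGOpt(parametrization=x_inf, budget=1000)
recommendation = optimizer.minimize(cost)
print(recommendation.value)  # all-NaN array with budget=1000, no warning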

Expected Results

I want to be able to set large bounds without these overflow warnings. In particular, I have a large vector of decision variables, some of which have bounds while others do not. Ideally, I would set bounds with set_bounds for the entire vector, using some large value or np.inf for the variables without bounds.
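
Concretely, something like the following sketch is what I'd like to write (hypothetical values; assuming set_bounds accepts per-coordinate arrays, which it already does for finite bounds):

import numpy as np
import nevergrad as ng

# Mixed bounded/unbounded decision variables in one Array parameter;
# unbounded coordinates are marked with +/- np.inf (or a very large value).
lower = np.array([0.0, -1.0, -np.inf, -np.inf, 0.0])
upper = np.array([1.0, 1.0, np.inf, np.inf, np.inf])

x_mixed = ng.p.Array(init=np.zeros(5)).set_bounds(lower, upper)
optimizer = ng.optimizers.NGOpt(parametrization=x_mixed, budget=1000)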
