Steps to reproduce
Observed Results
I'm getting the following:
Changing the bounds to something more reasonable like [-1e10, 1e10] doesn't produce the first errors but still results in the following:
Reducing the budget to 100 gives the following error instead:
python3.10/site-packages/nevergrad/parametrization/data.py:309: RuntimeWarning: overflow encountered in subtract
  return reference._to_reduced_space(self._value - reference._value)
Lastly, using bounds [-np.inf, np.inf] works for some optimizers but not others. For example, NGOpt with a budget of 100 works just fine, but NGOpt with a budget of 1000 returns nan without any warning.
Expected Results
I want to be able to set large bounds without running into overflow errors or silent nan results. In particular, I have a large vector of decision variables, some of which are bounded while others are not. Ideally, I would set bounds with set_bounds for the entire vector, where the unbounded variables get some very large value or np.inf.
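The overflow and the silent nan are consistent with plain float64 arithmetic when a value is normalized against its bound range. The following is an illustrative sketch only — a hypothetical normalizer, not nevergrad's actual `_to_reduced_space`:

```python
# Illustration only: why huge or infinite bounds can overflow or yield nan
# when a value is normalized against its bound range. This is a hypothetical
# normalizer, not nevergrad's actual implementation.

def to_unit_interval(value: float, lower: float, upper: float) -> float:
    span = upper - lower        # overflows to inf for huge finite bounds
    return (value - lower) / span

# Huge finite bounds: the span itself exceeds the float64 max (~1.8e308).
span = 1e308 - (-1e308)
print(span)  # inf

# Infinite bounds: inf / inf is nan, so downstream results silently become nan.
print(to_unit_interval(0.0, float("-inf"), float("inf")))  # nan
```

This matches the observed behavior: finite-but-huge bounds trigger the "overflow encountered in subtract" RuntimeWarning, while fully infinite bounds can propagate nan with no warning at all.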
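Until this is resolved, one possible workaround for the mixed bounded/unbounded case is to cap infinite bounds at large but still safe finite values before passing them to set_bounds. A sketch of the bound preparation — FINITE_CAP is an arbitrary assumption here, not a nevergrad API; pick a magnitude suited to your problem:

```python
# Workaround sketch: replace infinite bounds with large finite caps before
# handing them to set_bounds, since np.inf and very large magnitudes can
# overflow. FINITE_CAP is a hypothetical, problem-dependent choice.
FINITE_CAP = 1e6

lower = [0.0, float("-inf"), -5.0, float("-inf")]
upper = [1.0, float("inf"), 5.0, float("inf")]

capped_lower = [max(b, -FINITE_CAP) for b in lower]
capped_upper = [min(b, FINITE_CAP) for b in upper]
print(capped_lower)  # [0.0, -1000000.0, -5.0, -1000000.0]
print(capped_upper)  # [1.0, 1000000.0, 5.0, 1000000.0]
```

The capped arrays can then be applied to the whole vector at once, keeping the genuine bounds untouched while avoiding the overflow-prone infinities.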