Add wrapper for Nevergrad PSO optimizer #579
Conversation
Hello @janosg, is there anything left to implement in this PR? I would appreciate it if the CI could be approved so it can run.
@r3kste, I am also working on a similar PR about adding the OnePlusOne algorithm from nevergrad. I am curious whether we can also support non-linear constraints in the wrapper, as suggested here: https://facebookresearch.github.io/nevergrad/optimization.html#optimization-with-constraints. Can you please look into this? (I currently have end-semester exams, so I am a little busy.)
Hi @r3kste, I had not looked at your PR yet because you had not requested a review (see the contributor guide). I approved the CI and will look at the PR as soon as possible.
Hi @r3kste, thanks for the PR. The changes look really good. I just made a few minor stylistic comments.
Currently the tests are not passing, but I don't think this is caused by a problem in your code. It rather seems that nevergrad is not yet fully compatible with numpy 2.0.
Can you pin numpy to <2.0 in the environment to confirm this? If that works, we can decide how to deal with the situation.
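For reference, one way to do this is to pin numpy in the project's conda environment file (assuming the repo uses an `environment.yml`; the exact file name and layout may differ):

```yaml
# environment.yml (fragment) - pin numpy below 2.0 to test the hypothesis
dependencies:
  - numpy<2.0
```

or, in an already existing environment, `pip install "numpy<2.0"`.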
Hello, thanks for the review. I have committed the suggested changes. I want to mention that there are a few errors raised by mypy in pre-commit, but I believe these are unrelated to the PR:

```
src/optimagic/optimization/history.py:412: error: Unused "type: ignore" comment [unused-ignore]
src/optimagic/optimization/history.py:533: error: Unused "type: ignore" comment [unused-ignore]
src/optimagic/logging/logger.py:193: error: Unused "type: ignore" comment [unused-ignore]
Found 3 errors in 2 files (checked 1 source file)
```

This was caused because …
I have been trying to fix the issue with one of the failing tests. Optimizing the problem from the test directly in nevergrad, I get a 'close' solution most of the time, but occasionally it returns a sub-optimal solution (differing in the first decimal place):

```
Python 3.10.17 | packaged by conda-forge | (main, Apr 10 2025, 22:19:12) [GCC 13.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import nevergrad as ng
>>> import numpy as np
>>> ng.__version__
'1.0.11'
>>>
>>> instrum = ng.p.Instrumentation(
...     ng.p.Array(shape=(2,), lower=np.array([0.2, -0.5]), upper=np.array([1, 0.5])),
... )
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([ 0.2      , -0.0069145])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.2       , 0.02596084])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.2       , 0.07377472])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([ 0.2      , -0.0101106])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.2       , 0.01947061])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.21244835, 0.1338858 ])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.21057671, 0.04487235])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([ 0.2       , -0.10922376])
```

It would be very helpful to know how to proceed further in such cases.
Hi @r3kste, this is quite common for global algorithms and cannot be fixed completely. Our first approach is usually to work on the tuning parameters of the algorithm to make failures less likely at the default values (e.g. increase …).

Another thing I realized: you currently don't expose a seed argument, as we do for example in the pygmo optimizers. It would be good to add a `seed` argument here as well.

Once we have the seed argument, we can think about having separate tests for stochastic optimizers.
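To illustrate why a seed argument helps with testing stochastic optimizers, here is a minimal, purely hypothetical sketch (plain random search, not nevergrad's PSO): threading a `seed` through to the random number generator makes repeated runs reproducible. In nevergrad itself, the analogous mechanism is seeding the parametrization's `random_state` before optimizing.

```python
import numpy as np


def random_search(fun, lower, upper, budget, seed=None):
    # Hypothetical stand-in for a stochastic optimizer: with the same seed,
    # the generator produces the same candidate sequence, so the result is
    # deterministic; with seed=None, results vary across runs.
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    for _ in range(budget):
        x = rng.uniform(lower, upper)
        f = fun(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x


lower = np.array([0.2, -0.5])
upper = np.array([1.0, 0.5])

# Two runs with the same seed return identical solutions.
a = random_search(lambda x: x @ x, lower, upper, budget=1000, seed=0)
b = random_search(lambda x: x @ x, lower, upper, budget=1000, seed=0)
print(a, b)
```

With such an argument in the wrapper, tests for stochastic optimizers can assert exact reproducibility for a fixed seed instead of relying on loose tolerances.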
Implements the Particle Swarm Optimization algorithm from nevergrad. Partially fixes #560.