Add wrapper for Nevergrad PSO optimizer #579


Open · wants to merge 6 commits into base: main

Conversation

@r3kste commented Apr 7, 2025

Implements the Particle Swarm Optimization algorithm from nevergrad.
Partially fixes #560
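
A rough sketch of how the wrapped optimizer might be called from optimagic once merged (the algorithm name "nevergrad_pso" and the exact interface are assumptions; the PR's diff is not shown here):

```python
# Hypothetical usage sketch, assuming the wrapper is registered as "nevergrad_pso".
import numpy as np
import optimagic as om

res = om.minimize(
    fun=lambda x: x @ x,  # simple sum-of-squares objective
    params=np.array([0.5, 0.5]),
    bounds=om.Bounds(lower=np.array([-1.0, -1.0]), upper=np.array([1.0, 1.0])),
    algorithm="nevergrad_pso",  # assumed name of the new wrapper
)
print(res.params)
```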

@r3kste (Author) commented Apr 19, 2025

Hello @janosg, is there anything left to implement in this PR? I would also appreciate it if the CI could be approved so the checks can run.

@gulshan-123 commented Apr 19, 2025

@r3kste, I am also working on a similar PR that adds the OnePlusOne algorithm from nevergrad. I am curious whether we can also support non-linear constraints in the wrapper, as suggested here: https://facebookresearch.github.io/nevergrad/optimization.html#optimization-with-constraints. Can you please look into this? (I currently have end-semester exams, so I am a little busy.)
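
For context, nevergrad's documented approach to cheap constraints registers a callable on the parametrization that returns True (or a non-negative float) for feasible candidates. A minimal sketch, with a hypothetical constraint (this is nevergrad's own API, not something the wrapper exposes yet):

```python
# Minimal sketch of nevergrad's cheap-constraint mechanism.
import nevergrad as ng
import numpy as np

param = ng.p.Array(shape=(2,), lower=-1.0, upper=1.0)
# Hypothetical non-linear constraint: require x[0] + x[1] >= 0.
param.register_cheap_constraint(lambda x: x[0] + x[1] >= 0)

optimizer = ng.optimizers.PSO(parametrization=param, budget=500)
recommendation = optimizer.minimize(lambda x: float(np.sum(x**2)))
print(recommendation.value)
```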

@janosg (Member) commented Apr 21, 2025

Hi @r3kste, I hadn't looked at your PR yet because you hadn't requested a review (see the contributor guide). I approved the CI and will look at it as soon as possible.

@janosg self-requested a review April 21, 2025 16:30

@janosg (Member) left a comment

Hi @r3kste, thanks for the PR. The changes look really good. I just made a few minor stylistic comments.

Currently the tests are not passing but I think this is not related to a problem in your code. It rather seems like nevergrad has a few things that are not yet compatible with numpy 2.0.

Can you pin numpy to <2.0 in the environment to confirm this? If it works, we can see how to deal with the situation.

codecov bot commented Apr 22, 2025

Codecov Report

Attention: Patch coverage is 96.15385% with 3 lines in your changes missing coverage. Please review.

Files with missing lines                            Patch %    Lines
src/optimagic/config.py                             60.00%     2 Missing ⚠️
src/optimagic/optimizers/nevergrad_optimizers.py    97.50%     1 Missing ⚠️

Files with missing lines                            Coverage Δ
src/optimagic/algorithms.py                         85.85% <100.00%> (+0.16%) ⬆️
src/optimagic/optimizers/nevergrad_optimizers.py    97.50% <97.50%> (ø)
src/optimagic/config.py                             70.14% <60.00%> (-0.82%) ⬇️

@r3kste (Author) commented Apr 22, 2025

Hello, thanks for the review. I have committed the suggested changes.

I want to mention that there are a few errors raised by mypy in pre-commit, but I believe these are unrelated to the PR.

src/optimagic/optimization/history.py:412: error: Unused "type: ignore" comment  [unused-ignore]
src/optimagic/optimization/history.py:533: error: Unused "type: ignore" comment  [unused-ignore]
src/optimagic/logging/logger.py:193: error: Unused "type: ignore" comment  [unused-ignore]
Found 3 errors in 2 files (checked 1 source file)

> Currently the tests are not passing but I think this is not related to a problem in your code. It rather seems like nevergrad has a few things that are not yet compatible with numpy 2.0.

This happened because the nevergrad package in the conda repository is outdated, so I edited environment.yml so that nevergrad is installed via pip instead.
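
For reference, a minimal sketch of the kind of environment.yml change described above (the surrounding entries are placeholders, not the file's actual contents):

```yaml
# Hypothetical excerpt of environment.yml: nevergrad moved from the conda
# dependencies into the pip section so the current PyPI release is installed.
dependencies:
  - numpy
  - pip
  - pip:
      - nevergrad
```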

@r3kste (Author) commented Apr 28, 2025

I have been trying to fix the issue with one of the failing tests (tests/optimagic/optimization/test_many_algorithms.py::test_global_algorithms_on_sum_of_squares). However, I think this is not an implementation issue, but rather that the Nevergrad PSO algorithm itself is sometimes unable to find the optimal solution.

Optimizing the problem from the test directly in nevergrad, I get a 'close' solution most of the time, but occasionally it returns a sub-optimal solution (differing in the first decimal place):

Python 3.10.17 | packaged by conda-forge | (main, Apr 10 2025, 22:19:12) [GCC 13.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import nevergrad as ng
>>> import numpy as np
>>> ng.__version__
'1.0.11'
>>> 
>>> instrum = ng.p.Instrumentation(
...     ng.p.Array(shape=(2,), lower=np.array([0.2, -0.5]), upper=np.array([1, 0.5])),
... )
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([ 0.2      , -0.0069145])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.2       , 0.02596084])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.2       , 0.07377472])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([ 0.2      , -0.0101106])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.2       , 0.01947061])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.21244835, 0.1338858 ])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([0.21057671, 0.04487235])
>>> ng.optimizers.PSO(parametrization=instrum, budget=1000).minimize(lambda x: x @ x).value[0][0]
array([ 0.2       , -0.10922376])

It would be very helpful to know how to proceed further in such cases.

@janosg (Member) commented Apr 28, 2025

> I have been trying to fix the issue with one of the failing tests [...] I think this is not an implementation issue, but rather that the Nevergrad PSO algorithm itself is sometimes unable to find the optimal solution.

Hi @r3kste, this is quite common for global algorithms and cannot be fixed completely.

Our first approach is usually to work on the tuning parameters of the algorithm to make the failures less likely at default values (e.g. increase stopping_maxfun, try out other defaults for transform, etc.).

Another thing I realized: you currently don't expose a seed argument, as we do for example in the pygmo optimizers. It would be good to add a seed option and use it as described in the last example here. That is, we want to avoid changing the global numpy seed and instead rely on the random state of the parametrization.
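
A minimal sketch of that seeding approach, assuming nevergrad's documented parametrization.random_state mechanism (the seed value is illustrative):

```python
# Seed nevergrad through the parametrization's own random state rather than
# np.random.seed, so the global numpy RNG is left untouched.
import nevergrad as ng
import numpy as np

param = ng.p.Array(shape=(2,), lower=np.array([0.2, -0.5]), upper=np.array([1, 0.5]))
param.random_state.seed(42)  # only this parametrization's RNG is seeded

optimizer = ng.optimizers.PSO(parametrization=param, budget=1000)
result = optimizer.minimize(lambda x: float(x @ x))
print(result.value)
```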

Once we have the seed argument, we can think about having separate tests for stochastic optimizers.
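
As a rough illustration of what a separate test for stochastic optimizers could look like (the algorithm name, the seed option, and the tolerance are assumptions, since the PR's final interface is not shown here):

```python
# Hypothetical sketch of a dedicated test for a stochastic global optimizer:
# fix the seed for reproducibility and compare against a loose tolerance.
import numpy as np
import optimagic as om

def test_nevergrad_pso_on_sum_of_squares():
    res = om.minimize(
        fun=lambda x: x @ x,
        params=np.array([0.6, 0.0]),
        bounds=om.Bounds(lower=np.array([0.2, -0.5]), upper=np.array([1, 0.5])),
        algorithm="nevergrad_pso",  # assumed registered name
        algo_options={"seed": 42},  # assumed option added per this review
    )
    np.testing.assert_allclose(res.params, [0.2, 0.0], atol=0.1)
```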

Successfully merging this pull request may close these issues: Add wrappers for Nevergrad (#560)