Wrap gradient_free_optimizers (local) #624
base: main
Conversation
Hi @janosg
fix doc fixes add tests
Hi @janosg, Changes:
- Add a new example problem with converter in internal_optimization_problem.
- Added functions and converter dealing with dict input.
- Refactor test_many_algorithms (this is minimal and is just for passing tests on this one).
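For readers unfamiliar with the dict handling mentioned above, here is an illustrative sketch with hypothetical names (not the PR's actual code) of what converting between dict-valued params and a flat internal parameter vector can look like for a sphere test problem:

```python
import numpy as np


def params_to_internal(params: dict[str, float]) -> np.ndarray:
    # Flatten the dict values into the internal parameter vector.
    return np.array(list(params.values()))


def params_from_internal(x: np.ndarray, template: dict[str, float]) -> dict[str, float]:
    # Map the internal vector back onto the keys of the template dict.
    return dict(zip(template, x.tolist()))


def sphere(params: dict[str, float]) -> float:
    # Sphere objective evaluated on dict-valued params.
    return float(sum(v**2 for v in params.values()))


template = {"a": 1.0, "b": 2.0, "c": 3.0}
x = params_to_internal(template)
assert sphere(params_from_internal(x, template)) == sphere(template)
```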
gfo
stopping_funval: float | None = None
"""Stop the optimization if the objective function value is less than this value."""

convergence_iter_noimprove: PositiveInt = 1000  # need to set high
Should we set this to None or a really high value instead? Is there another convergence criterion we could set to a non-None value instead? We don't want all optimizers to just run until max_iter, but we also don't want premature stopping, of course.
I am a bit confused about this.
Most of the time convergence_iter_noimprove behaves like stopping_maxiter. The other convergence criteria don't seem to be respected. Even after more than 100,000 iterations the algorithm does not converge to a good solution. I might be missing something, but I hope to find a solution here.
Thanks for creating the issue over there.
Since convergence_iter_noimprove is set to a large value, it will only look for a change over a large number of iterations, so convergence_ftol_abs and convergence_ftol_rel will have no effect.
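For reference, a minimal sketch of the underlying gradient_free_optimizers stopping options this discussion is about, assuming the library's current search API; the mapping to the optimagic option names shown in the comments reflects this thread, and the concrete values are illustrative only:

```python
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer


def objective(para):
    # GFO maximizes the score, so return the negative sphere value.
    return -(para["x"] ** 2 + para["y"] ** 2)


search_space = {
    "x": np.arange(-5, 5, 0.01),
    "y": np.arange(-5, 5, 0.01),
}

opt = HillClimbingOptimizer(search_space)
opt.search(
    objective,
    n_iter=10_000,  # upper bound on iterations (stopping_maxiter)
    max_score=-1e-8,  # stop once the score is good enough (stopping_funval)
    early_stopping={
        "n_iter_no_change": 1000,  # convergence_iter_noimprove
        "tol_abs": 1e-8,  # convergence_ftol_abs
        "tol_rel": 1e-6,  # convergence_ftol_rel
    },
)
print(opt.best_para, opt.best_score)
```

As noted in the thread, tol_abs and tol_rel are only evaluated over the n_iter_no_change window, so a very large window effectively disables them.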
…st_gfo_optimizers
…okup dict for failing tests, move nag_dfols test to nag_optimizers, move test_pygmo_optimizers
Hi @janosg, the tests are passing. Could you review this as well?
@gauravmanmode can you fix the merge conflicts?
@janosg, I've fixed the merge conflicts.
Very nice. Just minor comments!
Check out this pull request on ReviewNB to see visual diffs & provide feedback on Jupyter Notebooks. Powered by ReviewNB.
Thanks for the changes and sorry for the delay. This is ready to merge from my side. Maybe have a quick look again if it is ready from your side and then click on merge.
Sure, no worries. I need to look once again and will let you know.
PR Description
This PR adds optimizers from gradient_free_optimizers.
The following optimizers are now available:
Other Changes
- Add experimental to AlgoInfo for optimizers that need to skip tests.
- Add SphereExampleInternalOptimizationProblemWithConverter with converter in internal_optimization_problem.py.
- Refactor test_many_algorithms.py, add a PRECISION_LOOKUP dict for algorithm-specific precision.

Helper Functions
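To round off the description, a hedged usage sketch of how one of the newly wrapped optimizers would presumably be selected through optimagic's minimize interface; the algorithm name gfo_hill_climbing is hypothetical and stands in for whatever names the PR actually registers, and finite bounds are assumed because gradient-free optimizers need a bounded search space:

```python
import numpy as np
import optimagic as om


def sphere(x):
    # Simple quadratic test function on a flat numpy params vector.
    return x @ x


res = om.minimize(
    fun=sphere,
    params=np.arange(3, dtype=float),
    # Finite bounds so the wrapper can build a search space for GFO.
    bounds=om.Bounds(lower=np.full(3, -5.0), upper=np.full(3, 5.0)),
    algorithm="gfo_hill_climbing",  # hypothetical name, for illustration only
)
print(res.params)
```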