
Will constrained optimization be supported? #45

Open
zhouxian opened this issue Feb 16, 2024 · 8 comments
Labels: feature (New feature), question (User queries)

Comments

zhouxian commented Feb 16, 2024

Hi, thank you for the amazing library!

I was wondering whether minimization with user-specified bounds, or algorithms like projected gradient descent, are supported?
If not, what best practice would you suggest for solving problems like argmin(norm(Ax+b)) s.t. x >= 0?
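For reference, the projected-gradient idea asked about here can be sketched in a few lines of plain JAX. This is an illustrative sketch, not optimistix API; the function and variable names are made up:

```python
import jax
import jax.numpy as jnp

def loss(x, A, b):
    # Squared norm of the residual, so the gradient is smooth everywhere.
    r = A @ x + b
    return r @ r

@jax.jit
def pgd_step(x, A, b, lr=5e-3):
    # One projected-gradient step: a gradient descent step, followed by
    # projection onto the feasible set {x : x >= 0}.
    g = jax.grad(loss)(x, A, b)
    return jnp.maximum(x - lr * g, 0.0)

A = jax.random.normal(jax.random.PRNGKey(0), (5, 3))
b = jax.random.normal(jax.random.PRNGKey(1), (5,))
x = jnp.ones(3)
for _ in range(1000):
    x = pgd_step(x, A, b)
# x is now an approximate minimiser of ||Ax + b||^2 subject to x >= 0
```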

Thanks in advance!

patrick-kidger (Owner) commented
So right now we do support hypercube constraints in our root-finding methods (e.g. Newton or Bisection). Moreover, it is sort-of possible to cast a minimisation problem as a root-finding problem, by seeking a root of the gradient of the function. (You have to be careful, as this may also converge to any other stationary point, not just a minimum.) So I think the particular example you've given could be tackled today, although I'm not sure how efficient it would be compared to a dedicated solver.

Other than that we don't currently support any kind of constraints in our solves. We may do at some point, but as always developer time is the limiting factor!
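The gradient-as-root trick mentioned above can be illustrated with a hand-rolled Newton iteration on the gradient, in plain JAX (a sketch only; the toy objective is made up and strictly convex, so its unique stationary point is the minimum):

```python
import jax
import jax.numpy as jnp

def f(x):
    # Strictly convex toy objective (per-coordinate Hessian is 2 + 3x^2 > 0).
    return jnp.sum((x - 2.0) ** 2 + 0.25 * x ** 4)

grad_f = jax.grad(f)
hess_f = jax.jacfwd(grad_f)  # Jacobian of the gradient, i.e. the Hessian

x = jnp.zeros(3)
for _ in range(20):
    # Newton's method for a root of grad_f; on a non-convex objective
    # this could converge to a saddle or maximum instead.
    x = x - jnp.linalg.solve(hess_f(x), grad_f(x))
```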


mjo22 commented Mar 6, 2024

Hello! I was also curious about this. In particular, I am wondering whether there are plans to support box constraints in BFGS, or whether this can already be done in the current framework. jaxopt seems to have an implementation of this, but I'm not sure how well maintained it is.

patrick-kidger (Owner) commented

For something simple like box constraints, it should be fairly easy to add that in. We'd be happy to take a PR adding it. (Note that box constraints are already implemented for the root-finders, so you could follow a similar pattern there.)


f0uriest commented Mar 8, 2024

If box constraints are supported, it shouldn't be too much work to add an augmented-Lagrangian-type method for general constraints. It could perhaps be a "meta-solver", where you wrap another solver to handle the unconstrained subproblems. I can hopefully take a look at it once I finish up a few other projects.
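As a rough illustration of the augmented-Lagrangian "meta-solver" idea, here is a sketch in plain JAX, with gradient descent standing in for whatever unconstrained solver gets wrapped. All names and the toy problem (minimise x0 + x1 on the circle x0^2 + x1^2 = 2) are hypothetical:

```python
import jax
import jax.numpy as jnp

def f(x):  # objective
    return x[0] + x[1]

def c(x):  # equality constraint; we want c(x) = 0
    return jnp.array([x[0] ** 2 + x[1] ** 2 - 2.0])

def aug_lagrangian(x, lam, mu):
    # f + lam^T c + (mu/2) ||c||^2
    cx = c(x)
    return f(x) + lam @ cx + 0.5 * mu * (cx @ cx)

grad_L = jax.jit(jax.grad(aug_lagrangian))

def inner_minimize(x, lam, mu, steps=500, lr=1e-2):
    # Stand-in for the wrapped unconstrained sub-solver.
    for _ in range(steps):
        x = x - lr * grad_L(x, lam, mu)
    return x

x, lam, mu = jnp.array([0.5, -0.5]), jnp.zeros(1), 10.0
for _ in range(20):
    x = inner_minimize(x, lam, mu)
    lam = lam + mu * c(x)  # multiplier (dual) update
# x should approach (-1, -1), the constrained minimiser
```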

billtubbs commented

Constrained nonlinear optimization would be great.

I'm not an optimization expert, so this may be a lot to ask, but what would it take to get a solver like IPOPT integrated with JAX? IPOPT is a tried-and-tested solver for large-scale nonlinear optimization (with nonlinear constraints), used widely in engineering fields like chemical engineering, control, and robotics (see for example CasADi, GEKKO, Pyomo). It would be great to see more people in these fields trying out JAX.

From my limited understanding, solving large-scale problems requires the solver to take advantage of the sparsity of the constraint Jacobian and Hessian matrices. Is this a fundamental problem for JAX?

johannahaffner (Contributor) commented

Hi @billtubbs,

Since box constraints (mentioned above) are just a special case of inequality constraints, I have been looking into ways to expose some of the "ingredients" used to implement them across various methods. It is still early days, though!

Thank you for the solver suggestions, I will look into them :)

f0uriest commented

If I remember correctly, IPOPT uses a filter to decide whether to accept steps, which might be difficult to implement in JAX with compile-time memory bounds. It can also switch between different "modes" dynamically, which might make vmap super inefficient.

Something like LANCELOT/GALAHAD or SLSQP would probably be easier.

As for sparsity, that is usually what lets you solve large problems efficiently, though you don't necessarily need actual sparse data structures. It should be easy to just use some of the iterative solvers from lineax on the KKT system, though you may need a good preconditioner. @patrick-kidger is it possible to pass linear solver options like preconditioners through optimistix to lineax?
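As a toy illustration of the iterative-KKT idea (using jax.scipy.sparse.linalg.gmres rather than lineax, with a naive diagonal preconditioner purely for demonstration; the problem data are made up):

```python
import jax.numpy as jnp
from jax.scipy.sparse.linalg import gmres

# Equality-constrained QP: minimise 0.5 x^T H x + g^T x  s.t.  A x = b.
H = jnp.diag(jnp.array([2.0, 3.0, 4.0]))
g = jnp.array([1.0, -1.0, 0.5])
A = jnp.array([[1.0, 1.0, 1.0]])
b = jnp.array([1.0])

# KKT system: [[H, A^T], [A, 0]] @ [x; nu] = [-g; b]
K = jnp.block([[H, A.T], [A, jnp.zeros((1, 1))]])
rhs = jnp.concatenate([-g, b])

# Naive diagonal preconditioner (guarding the zero diagonal entry).
d = jnp.where(jnp.abs(jnp.diag(K)) > 0, jnp.diag(K), 1.0)
precond = lambda v: v / d

# GMRES handles the indefinite KKT matrix, passed here as a matvec.
sol, _ = gmres(lambda v: K @ v, rhs, M=precond)
x_opt, nu = sol[:3], sol[3:]
```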

patrick-kidger (Owner) commented

I think the answer is frequently "no", but I'm happy to consider a PR changing that fact :)
