**OptimizationODE.jl** provides ODE-based optimization methods as a solver plugin for [SciML's Optimization.jl](https://github.com/SciML/Optimization.jl). It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.

## Installation

```julia
using Pkg
Pkg.add("OptimizationODE")
```

## Usage

```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

function f(x, p)
    return sum(abs2, x)
end

function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]
p = []

f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0)

opt = ODEGradientDescent(dt=0.01)
sol = solve(prob_manual, opt; maxiters=50_000)

@show sol.u
@show sol.objective
```
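
Instead of writing `g!` by hand, an automatic-differentiation backend from ADTypes can be passed to `OptimizationFunction`. A minimal sketch of the same problem, assuming ForwardDiff.jl is installed:

```julia
using OptimizationODE, Optimization, ADTypes, ForwardDiff

f(x, p) = sum(abs2, x)

# Let ForwardDiff compute the gradient instead of a hand-written g!.
f_ad = OptimizationFunction(f, ADTypes.AutoForwardDiff())
prob_ad = OptimizationProblem(f_ad, [2.0, -3.0])

sol = solve(prob_ad, ODEGradientDescent(dt=0.01); maxiters=50_000)
```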

## Local Gradient-based Optimizers

All provided optimizers are **gradient-based local optimizers** that solve optimization problems by integrating gradient-based ODEs to convergence:

* `ODEGradientDescent(dt=...)` — performs basic gradient descent using the explicit Euler method. This is a simple and efficient method suitable for small-scale or well-conditioned problems.

* `RKChebyshevDescent()` — uses the ROCK2 solver, a stabilized explicit Runge-Kutta method suitable for stiff problems. It allows larger step sizes while maintaining stability.

* `RKAccelerated()` — leverages the Tsit5 method, a 5th-order Runge-Kutta solver that achieves faster convergence for smooth problems by improving integration accuracy.

* `HighOrderDescent()` — applies Vern7, a high-order (7th-order) explicit Runge-Kutta method for even more accurate integration. This can be beneficial for problems requiring high precision.

You can also define a custom optimizer using the generic `ODEOptimizer(solver; dt=nothing)` constructor by supplying any ODE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/).
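
For example, an explicit Runge-Kutta method exported by OrdinaryDiffEq.jl can be wrapped this way. A sketch, assuming OrdinaryDiffEq.jl is installed (the choice of BS3 here is illustrative):

```julia
using OptimizationODE, Optimization, SciMLBase, OrdinaryDiffEq

f(x, p) = sum(abs2, x)
g!(g, x, p) = (@. g = 2 * x)

prob = OptimizationProblem(OptimizationFunction(f, SciMLBase.NoAD(); grad = g!), [2.0, -3.0])

# Wrap the BS3 (Bogacki–Shampine) solver as a custom ODE-based optimizer.
opt = ODEOptimizer(BS3())
sol = solve(prob, opt; maxiters = 10_000)
```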

## DAE-based Optimizers

!!! warning

    DAE-based optimizers are still experimental and a research project. Use with caution.

In addition to ODE-based optimizers, OptimizationODE.jl provides optimizers for differential-algebraic equation (DAE) constrained problems:

* `DAEMassMatrix()` — uses the Rodas5P solver (from OrdinaryDiffEq.jl) for DAE problems with a mass matrix formulation.

* `DAEOptimizer(IDA())` — uses the IDA solver (from Sundials.jl) for DAE problems with index variable support (requires `using Sundials`).

You can also define a custom optimizer using the generic `ODEOptimizer(solver)` or `DAEOptimizer(solver)` constructor by supplying any ODE or DAE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/) or [Sundials.jl](https://github.com/SciML/Sundials.jl).
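
As a sketch of the constructor usage, assuming Sundials.jl is installed (and keeping in mind that the DAE path is experimental):

```julia
using OptimizationODE, Sundials

# Wrap Sundials' IDA integrator as a DAE-based optimizer (experimental).
opt = DAEOptimizer(IDA())
```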
## Interface Details
All optimizers require gradient information (either via automatic differentiation or manually provided `grad!`). The optimization is performed by integrating the ODE defined by the negative gradient until a steady state is reached.
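
Conceptually, for an objective `f` these optimizers integrate the gradient flow `x'(t) = -∇f(x(t))`, whose steady states are critical points of `f`. A standalone sketch of that idea using OrdinaryDiffEq.jl directly (an illustration of the principle, not the package's internal implementation):

```julia
using OrdinaryDiffEq

# Gradient flow for f(x) = sum(abs2, x): dx/dt = -∇f(x) = -2x.
flow!(dx, x, p, t) = (@. dx = -2 * x)

prob = ODEProblem(flow!, [2.0, -3.0], (0.0, 20.0))
sol = solve(prob, Tsit5())

sol.u[end]  # near the minimizer [0.0, 0.0]
```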