WIP: A total rewrite #40

Open · wants to merge 19 commits into base `master`
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -15,7 +15,7 @@ jobs:
strategy:
fail-fast: false
matrix:
version: ['1.6', '1']
version: ['1.10', '1']
os: [ubuntu-latest, macOS-latest, windows-latest]
arch: [x64]
include:
2 changes: 2 additions & 0 deletions .gitignore
@@ -3,3 +3,5 @@
*.jl.mem
Manifest.toml
*.DS_Store
.DS_Store
.vscode/settings.json
12 changes: 5 additions & 7 deletions Project.toml
@@ -6,18 +6,16 @@ version = "0.4.2"
[deps]
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"

[compat]
Cbc = "1"
JuMP = "0.23, 1"
MathOptInterface = "1"
julia = "1.6"
HiGHS = "1"
JuMP = "1"
julia = "1.10"

[extras]
Cbc = "9961bab8-2fa3-5c5a-9d89-47fab24efd76"
HiGHS = "87dc4568-4c63-4d18-b0c0-bb2238e4078b"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Cbc", "Test"]
test = ["Test", "HiGHS"]
85 changes: 33 additions & 52 deletions README.md
@@ -33,37 +33,21 @@ Pkg.add("PiecewiseLinearOpt")
## Use with JuMP

Current support is limited to modeling the graph of a continuous piecewise
linear function, either univariate or bivariate, with the goal of adding support
for the epigraphs of lower semicontinuous piecewise linear functions.
linear function, with a primary focus on univariate or bivariate functions.
There are also methods for more general multivariate problems.

### Univariate

Consider a piecewise linear function `f`. The function is described a domain `d`,
which is a set of breakpoints between pieces, and the function value `fd` at
those breakpoints:
Consider a piecewise linear function described by a domain `d`, which is a set
of breakpoints between pieces, and a function `f` that gives the value at
those breakpoints:

```julia
julia> f(x) = sin(x)
f (generic function with 1 method)

julia> d = 0:0.5:2pi
0.0:0.5:6.0

julia> fd = f.(d)
13-element Vector{Float64}:
0.0
0.479425538604203
0.8414709848078965
0.9974949866040544
0.9092974268256817
0.5984721441039564
0.1411200080598672
-0.35078322768961984
-0.7568024953079282
-0.977530117665097
-0.9589242746631385
-0.7055403255703919
-0.27941549819892586
julia> f(x) = sin(x)
f (generic function with 1 method)
```

To represent this function in a JuMP model, do:
@@ -72,13 +56,14 @@
using JuMP, PiecewiseLinearOpt
model = Model()
@variable(model, x)
z = PiecewiseLinearOpt.piecewiselinear(model, x, d, fd; method = :CC)
z = PiecewiseLinearOpt.piecewiselinear(model, x, d, f; method = Logarithmic())
@objective(model, Min, z) # minimize f(x)
```
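
As a usage note, the model above can be completed and solved once a
mixed-integer solver is attached. The following is a minimal sketch (not part
of this PR), assuming the HiGHS solver from the test dependencies, finite
bounds on `x`, and the method-object API introduced in this rewrite:

```julia
using JuMP, HiGHS, PiecewiseLinearOpt

f(x) = sin(x)
d = 0:0.5:2pi

model = Model(HiGHS.Optimizer)
set_silent(model)
# Bound x to the breakpoint domain so the model is bounded.
@variable(model, first(d) <= x <= last(d))
z = PiecewiseLinearOpt.piecewiselinear(model, x, d, f; method = Logarithmic())
@objective(model, Min, z)
optimize!(model)
value(x), value(z)  # expect x ≈ 4.5, the breakpoint where sin is smallest on d
```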

### Bivariate

Consider piecewise linear approximation for the function $f(x, y) = exp(x + y)$:
Consider a piecewise linear approximation of the function $f(x, y) = \exp(x + y)$
on a triangular grid with a best-fit pattern:

```julia
using JuMP, PiecewiseLinearOpt
@@ -92,38 +77,34 @@ z = PiecewiseLinearOpt.piecewiselinear(
0:0.1:1,
0:0.1:1,
(u, v) -> exp(u + v);
method = :DisaggLogarithmic,
method = SixStencil(),
pattern = :BestFit
)
@objective(model, Min, z)
```
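
To actually solve the bivariate model, attach a mixed-integer solver. Below is
a minimal end-to-end sketch (not part of this PR), assuming HiGHS as the
solver, bounds matching the `0:0.1:1` grids, and `model`, `x`, `y` declared as
in the collapsed lines above:

```julia
using JuMP, HiGHS, PiecewiseLinearOpt

model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, 0 <= x <= 1)
@variable(model, 0 <= y <= 1)
z = PiecewiseLinearOpt.piecewiselinear(
    model,
    x,
    y,
    0:0.1:1,
    0:0.1:1,
    (u, v) -> exp(u + v);
    method = SixStencil(),
    pattern = :BestFit
)
@objective(model, Min, z)
optimize!(model)
value(x), value(y), value(z)  # minimum at (0, 0) with z ≈ exp(0) = 1
```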

## Methods

The following formulations are available in the package and are specified via
the `method` argument (an example of switching formulations is sketched after
the univariate list below):

Supported multivariate formulations:
* `ConvexCombination()`
* `DisaggregatedLogarithmic()`
* `MultipleChoice()`: limited support, as it currently requires an explicit formulation with hyperplanes

Supported univariate formulations:
* `Incremental()`
* `Logarithmic()`
* `LogarithmicIndependentBranching()`
* `NativeSOS2()`
* `ZigZagBinary()`
* `ZigZagInteger()`
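
Any of these can be passed as the `method` keyword of the univariate call shown
earlier; a hypothetical swap to the integer zig-zag formulation might look like
this:

```julia
using JuMP, PiecewiseLinearOpt

model = Model()
@variable(model, 0 <= x <= 2pi)
# Same function and breakpoints as in the univariate example, with a different
# SOS2 encoding selected via the `method` keyword.
z = PiecewiseLinearOpt.piecewiselinear(model, x, 0:0.5:2pi, sin; method = ZigZagInteger())
```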

The following bivariate formulations are available and can be combined with most univariate
formulations to impose two axis-aligned SOS2 constraints; a combined example is
sketched after this list. See the associated paper for more details.
* `K1(sos2_method)`: requires a K1 grid triangulation
* `UnionJack(sos2_method)`: requires a UnionJack grid triangulation
* `SixStencil(sos2_method)`: requires a grid triangulation
* `NineStencil(sos2_method)`: requires a grid triangulation
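
For instance, a triangulation method can wrap a univariate SOS2 encoding; the
following sketch is hypothetical but follows the constructor pattern documented
above:

```julia
using JuMP, PiecewiseLinearOpt

model = Model()
@variable(model, 0 <= x <= 1)
@variable(model, 0 <= y <= 1)
# Nine-stencil triangle selection combined with a binary zig-zag SOS2 encoding
# along each axis (hypothetical combination).
z = PiecewiseLinearOpt.piecewiselinear(
    model,
    x,
    y,
    0:0.1:1,
    0:0.1:1,
    (u, v) -> exp(u + v);
    method = NineStencil(ZigZagBinary()),
    pattern = :BestFit
)
```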

* Convex combination (`:CC`)
* Multiple choice (`:MC`)
* Native SOS2 branching (`:SOS2`)
* Incremental (`:Incremental`)
* Logarithmic (`:Logarithmic`; default)
* Disaggregated Logarithmic (`:DisaggLogarithmic`)
* Binary zig-zag (`:ZigZag`)
* General integer zig-zag (`:ZigZagInteger`)

Supported bivariate formulations for entire constraint:

* Convex combination (`:CC`)
* Multiple choice (`:MC`)
* Disaggregated Logarithmic (`:DisaggLogarithmic`)

Also, you can use any univariate formulation for bivariate functions as well.
They will be used to impose two axis-aligned SOS2 constraints, along with the
"6-stencil" formulation for the triangle selection portion of the constraint.
See the associated paper for more details. In particular, the following are also
acceptable bivariate formulation choices:

* Native SOS2 branching (`:SOS2`)
* Incremental (`:Incremental`)
* Logarithmic (`:Logarithmic`)
* Binary zig-zag (`:ZigZag`)
* General integer zig-zag (`:ZigZagInteger`)
60 changes: 58 additions & 2 deletions src/PiecewiseLinearOpt.jl
@@ -8,12 +8,68 @@ module PiecewiseLinearOpt
using JuMP

import LinearAlgebra
import MathOptInterface as MOI
import Random

export PWLFunction, UnivariatePWLFunction, BivariatePWLFunction, piecewiselinear

include("types.jl")
include("jump.jl")

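# Per-model bookkeeping, stored in `model.ext[:PWL]`: a counter for the
# piecewise-linear constructs added to the model.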
mutable struct PWLData
counter::Int
PWLData() = new(0)
end

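# Lazily attach a `PWLData` record to the model's extension dictionary; a
# no-op if the model already has one.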
function initPWL!(m::JuMP.Model)
if !haskey(m.ext, :PWL)
m.ext[:PWL] = PWLData()
end
return nothing
end

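# A modeling input may be either a plain JuMP variable or an affine expression.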
const VarOrAff = Union{JuMP.VariableRef,JuMP.AffExpr}

include("methods/util.jl")

export Incremental,
LogarithmicEmbedding,
LogarithmicIndependentBranching,
NativeSOS2,
ZigZagBinary,
ZigZagInteger
include("methods/univariate/incremental.jl")

include("methods/univariate/logarithmic_embedding.jl")
include("methods/univariate/logarithmic_independent_branching.jl")
include("methods/univariate/native_sos2.jl")
include("methods/univariate/zig_zag_binary.jl")
include("methods/univariate/zig_zag_integer.jl")
# ConvexCombination has an SOS2 formulation, so defer this until after the
# multivariate formulations are defined
include("methods/univariate/sos2_formulation_base.jl")

# Consider the colloquial "log" to refer to the embedding formulation
const Logarithmic = LogarithmicEmbedding
export Logarithmic

export K1,
NineStencil,
OptimalIndependentBranching,
OptimalTriangleSelection,
SixStencil,
UnionJack
include("methods/bivariate/k1.jl")
include("methods/bivariate/nine_stencil.jl")
include("methods/bivariate/optimal_independent_branching.jl")
include("methods/bivariate/optimal_triangle_selection.jl")
include("methods/bivariate/six_stencil.jl")
include("methods/bivariate/union_jack.jl")
include("methods/bivariate/common.jl")

export ConvexCombination, DisaggregatedLogarithmic, MultipleChoice
include("methods/multivariate/convex_combination.jl")
include("methods/multivariate/disaggregated_logarithmic.jl")
include("methods/multivariate/multiple_choice.jl")

include("pwlinear.jl")

end # module