In this notebook, we show how to work with the extended formulation of a polyhedron. The convex hull of the union of H-represented polyhedra can be obtained as the projection of an H-representation [Theorem 3.3, B85]. To use the resulting polyhedron as a constraint set in an optimization problem, there is no need to compute the H-representation of this projection explicitly. Moreover, other operations such as intersection are also implemented between extended H-representations. We illustrate this with a simple example, starting by defining the H-representation of a square with JuMP.
[B85] Balas, E., 1985. Disjunctive programming and a hierarchy of relaxations for discrete optimization problems. SIAM Journal on Algebraic Discrete Methods, 6(3), pp.466-486.
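Concretely, for two bounded H-polyhedra (a standard statement of the result; the notation here is ours, not the notebook's), writing $P_i = \{\, x : A_i x \le b_i \,\}$ for $i = 1, 2$, Theorem 3.3 of [B85] gives

$$\operatorname{conv}(P_1 \cup P_2) = \left\{\, x \;\middle|\; \exists\, x^1, x^2, \lambda :\ x = x^1 + x^2,\ A_1 x^1 \le \lambda b_1,\ A_2 x^2 \le (1-\lambda)\, b_2,\ 0 \le \lambda \le 1 \,\right\}.$$

The right-hand side is a polyhedron in the extended space of $(x, x^1, x^2, \lambda)$; projecting out the auxiliary variables $x^1$, $x^2$, $\lambda$ recovers the convex hull in the original space.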
using JuMP
model = Model()
@variable(model, -1 <= x <= 1)
@variable(model, -1 <= y <= 1)
using Polyhedra
square = hrep(model)
H-representation LPHRep{Float64}:
4-element iterator of HalfSpace{Float64,SparseArrays.SparseVector{Float64,Int64}}:
 HalfSpace([1] = -1.0, 1.0)
 HalfSpace([2] = -1.0, 1.0)
 HalfSpace([1] = 1.0, 1.0)
 HalfSpace([2] = 1.0, 1.0)
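Equivalently, the same square can be built directly from Polyhedra.jl half-spaces, without going through JuMP (a sketch; `HalfSpace(a, β)` represents the constraint `a'x ≤ β`, and intersecting half-spaces with `∩` yields an H-representation):

```julia
using Polyhedra

# -x₁ ≤ 1, -x₂ ≤ 1, x₁ ≤ 1, x₂ ≤ 1: the same unit square as above
square_direct = HalfSpace([-1.0, 0.0], 1.0) ∩ HalfSpace([0.0, -1.0], 1.0) ∩
                HalfSpace([1.0, 0.0], 1.0) ∩ HalfSpace([0.0, 1.0], 1.0)
```

This form is convenient when the polyhedron is not naturally expressed as the feasible set of a JuMP model.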
In the following, diagonal and antidiag are projections of extended H-representations of dimension 7: the two original variables, one auxiliary copy of them per square (four variables), and one multiplier. Their intersection diamond is therefore a projection of an extended H-representation of dimension 12, since the two original variables are shared while the five auxiliary variables of each formulation are kept.
diagonal = convexhull(translate(square, [-2, -2]), translate(square, [2, 2]))
antidiag = convexhull(translate(square, [-2, 2]), translate(square, [2, -2]))
diamond = diagonal ∩ antidiag
Polyhedra.Projection{LPHRep{Float64},Base.OneTo{Int64}}(
 HyperPlane([1] = 1.0, [3] = -1.0, [5] = -1.0, 0.0) ∩
 HyperPlane([2] = 1.0, [4] = -1.0, [6] = -1.0, 0.0) ∩
 HyperPlane([1] = 1.0, [8] = -1.0, [10] = -1.0, 0.0) ∩
 HyperPlane([2] = 1.0, [9] = -1.0, [11] = -1.0, 0.0) ∩
 HalfSpace([3] = -1.0, [7] = -3.0, 0.0) ∩
 HalfSpace([4] = -1.0, [7] = -3.0, 0.0) ∩
 HalfSpace([3] = 1.0, [7] = 1.0, 0.0) ∩
 HalfSpace([4] = 1.0, [7] = 1.0, 0.0) ∩
 HalfSpace([5] = -1.0, [7] = -1.0, -1.0) ∩
 HalfSpace([6] = -1.0, [7] = -1.0, -1.0) ∩
 HalfSpace([5] = 1.0, [7] = 3.0, 3.0) ∩
 HalfSpace([6] = 1.0, [7] = 3.0, 3.0) ∩
 HalfSpace([7] = -1.0, 0.0) ∩
 HalfSpace([7] = 1.0, 1.0) ∩
 HalfSpace([8] = -1.0, [12] = -3.0, 0.0) ∩
 HalfSpace([9] = -1.0, [12] = 1.0, 0.0) ∩
 HalfSpace([8] = 1.0, [12] = 1.0, 0.0) ∩
 HalfSpace([9] = 1.0, [12] = -3.0, 0.0) ∩
 HalfSpace([10] = -1.0, [12] = -1.0, -1.0) ∩
 HalfSpace([11] = -1.0, [12] = 3.0, 3.0) ∩
 HalfSpace([10] = 1.0, [12] = 3.0, 3.0) ∩
 HalfSpace([11] = 1.0, [12] = -1.0, -1.0) ∩
 HalfSpace([12] = -1.0, 0.0) ∩
 HalfSpace([12] = 1.0, 1.0),
 Base.OneTo(2))
We don't need to compute the result of the projection to solve an optimization problem over diamond.
import GLPK
model = Model(GLPK.Optimizer)
@variable(model, x[1:2])
@constraint(model, x in diamond)
@objective(model, Max, x[2])
optimize!(model)
value.(x)
2-element Array{Float64,1}:
 0.0
 2.0
In the optimization problem above, the auxiliary variables of the extended formulation are added transparently inside a bridge. To manipulate the auxiliary variables directly, one can use the extended H-representation itself instead of its projection.
import GLPK
model = Model(GLPK.Optimizer)
@variable(model, x[1:12])
@constraint(model, x in diamond.set)
@objective(model, Max, x[2])
optimize!(model)
value.(x)
12-element Array{Float64,1}:
  0.0
  2.0
 -0.75
 -0.25
  0.75
  2.25
  0.25
 -0.75
  2.25
  0.75
 -0.25
  0.75
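Reading the extended H-representation printed earlier, variables 1–2 are the original x, variables 3–6 are the two copies of x and variable 7 the multiplier from diagonal's formulation, and variables 8–11 and 12 play the same roles for antidiag (this interpretation is inferred from the printed constraints; the exact auxiliary values may depend on the solver). A quick sanity check on the values above: each copy pair must sum to the original x.

```julia
# Solver output from the extended model above
xv = [0.0, 2.0, -0.75, -0.25, 0.75, 2.25, 0.25, -0.75, 2.25, 0.75, -0.25, 0.75]

# x equals the sum of its two copies from diagonal's extended formulation...
@assert xv[1:2] ≈ xv[3:4] + xv[5:6]
# ...and from antidiag's extended formulation
@assert xv[1:2] ≈ xv[8:9] + xv[10:11]
# both multipliers lie in [0, 1]
@assert 0.0 <= xv[7] <= 1.0 && 0.0 <= xv[12] <= 1.0
```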
This notebook was generated using Literate.jl.