We will learn about basic and advanced implementations of MILP problems through a running example. The example we select is the knapsack problem, a famous $NP$-hard problem.
Knapsack problem
Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible.
It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items.
The problem often arises in resource allocation where the decision makers have to choose from a set of non-divisible projects or tasks under a fixed budget or time constraint, respectively.
Graphical example: which boxes should be chosen to maximize the amount of money while still keeping the overall weight under or equal to 15 kg?
(Solution: if any number of each box is available, then three yellow boxes and three grey boxes; if only the shown boxes are available, then all boxes except the green one.)
Modeling the problem
We are given a set of $n$ items numbered from 1 up to $n$. We are allowed to select multiple units of the same item.
Each item has a weight $w_{i}$ kg,
Each item has a monetary value of $v_{i}$ dollars,
We have a bag with maximum weight capacity $W$.
Goal: We want to decide how many units of each item to put in the bag so as to maximize the total monetary value of the bag.
An optimization problem that will model the situation above is:
$$ \begin{array}{ll} \underset{x\in\mathbf{R}^{n}}{\mbox{maximize}} & \sum_{i=1}^{n}v_{i}x_{i}\\ \mbox{subject to} & \sum_{i=1}^{n}w_{i}x_{i}\leq W,\\ & x_{i}\in\{0,1,2,\ldots\},\quad i=1,\ldots,n. \end{array} $$This is an integer optimization problem.
Let us first solve the problem using the basic implementation.
# Problem data
v = [9, 3, 2, 6, 10, 7, 6, 6, 4, 5, 4, 1, 1,
6, 3, 2, 3, 2, 10, 7, 9, 8, 3, 10, 8, 5,
3, 8, 3, 6, 5, 7, 8, 6, 9, 7, 5, 5, 1, 5,
9, 5, 4, 5, 5, 3, 4, 8, 8, 10]
w = [4, 10, 9, 8, 8, 4, 7, 3, 1, 10, 5, 4, 2,
3, 9, 9, 9, 5, 8, 8, 4, 2, 6, 10, 5, 7, 7,
8, 4, 7, 7, 8, 4, 4, 10, 3, 9, 2, 9, 10, 3,
7, 3, 5, 5, 7, 10, 10, 5, 8]
W = 50
n = length(v)
50
using JuMP, Gurobi
knapsack_model = Model(Gurobi.Optimizer)
Set parameter Username
Academic license - for non-commercial use only - expires 2022-03-13
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Gurobi
Add the variable $x$ along with the nonnegativity and integrality constraints.
@variable(knapsack_model, x[1:n] >= 0, Int)
50-element Vector{VariableRef}: x[1] x[2] x[3] x[4] x[5] x[6] x[7] x[8] x[9] x[10] x[11] x[12] x[13] ⋮ x[39] x[40] x[41] x[42] x[43] x[44] x[45] x[46] x[47] x[48] x[49] x[50]
Add the objective $v^\top x$.
@objective(knapsack_model, Max, v' * x)
Add the constraint $w^\top x \leq W$.
@constraint(knapsack_model, w' * x <= W)
optimize!(knapsack_model)
Gurobi Optimizer version 9.5.0 build v9.5.0rc5 (win64)
Thread count: 4 physical cores, 8 logical processors, using up to 8 threads
Optimize a model with 1 rows, 50 columns and 50 nonzeros
Model fingerprint: 0x369feda4
Variable types: 0 continuous, 50 integer (0 binary)
Coefficient statistics:
  Matrix range  [1e+00, 1e+01]
  Objective range  [1e+00, 1e+01]
  Bounds range  [0e+00, 0e+00]
  RHS range  [5e+01, 5e+01]
Found heuristic solution: objective 116.0000000
Presolve removed 1 rows and 50 columns
Presolve time: 0.00s
Presolve: All rows and columns removed
Explored 0 nodes (0 simplex iterations) in 0.00 seconds (0.00 work units)
Thread count was 1 (of 8 available processors)
Solution count 2: 200 116
Optimal solution found (tolerance 1.00e-04)
Best objective 2.000000000000e+02, best bound 2.000000000000e+02, gap 0.0000%
User-callback calls 178, time in user-callback 0.00 sec
println(abs.(value.(x)))  # abs. only cleans up any -0.0 in the printed solution
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 50.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
objective_value(knapsack_model)
200.0
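As a sanity check on the solver's answer, the unbounded knapsack optimum can also be computed by dynamic programming. Below is a pure-Julia sketch (no solver needed); the data in the two calls are illustrative: a hypothetical 3-item instance, and a single item with $(v, w) = (4, 1)$, which is the item driving the optimum of 200 found above.

```julia
# Dynamic program for the unbounded knapsack problem.
# dp[c + 1] holds the best value achievable with capacity c.
function unbounded_knapsack(v, w, W)
    dp = zeros(Int, W + 1)
    for c in 1:W
        dp[c + 1] = dp[c]                  # option: leave one unit of capacity unused
        for i in eachindex(v)
            if w[i] <= c                   # item i fits: take one unit and recurse on the rest
                dp[c + 1] = max(dp[c + 1], dp[c - w[i] + 1] + v[i])
            end
        end
    end
    return dp[W + 1]
end

println(unbounded_knapsack([9, 4, 6], [4, 1, 3], 10))   # hypothetical 3-item instance: 40
println(unbounded_knapsack([4], [1], 50))               # 50 units of a (v=4, w=1) item: 200
```

This runs in $O(nW)$ time, so it is practical only for small capacities, but it makes an easy independent check of the MILP result.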
In practice, the MILP problems that we are interested in solving often come with specific structure, and callback functions give us a way to provide the solver with such structure-specific insight.
JuMP provides solver-independent support for three types of callbacks: lazy constraints, user-cuts, and heuristic solutions. Here we focus on lazy constraints.
We see in the solution of the knapsack problem that it has put all the eggs in one basket. This may not be a desirable solution, so we want the optimal solution to satisfy the following condition: the difference between the selected quantities of any two items should be no more than $\alpha$. This requirement leads to $2 {n \choose 2}$ constraints of the form: $$x_i - x_j \leq \alpha \quad \forall i \neq j.$$
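For reference, these constraints could all be added up front. A sketch, assuming the data ($n$, $v$, $w$, $W$) from above and a hypothetical value $\alpha = 3$ (the model name `spread_model` is also made up for illustration):

```julia
using JuMP, Gurobi

# Rebuild the knapsack model and add every pairwise spread constraint a priori.
# α = 3 is a hypothetical value chosen for illustration.
α = 3
spread_model = Model(Gurobi.Optimizer)
@variable(spread_model, x[1:n] >= 0, Int)
@objective(spread_model, Max, v' * x)
@constraint(spread_model, w' * x <= W)
for i in 1:n, j in 1:n
    if i != j
        @constraint(spread_model, x[i] - x[j] <= α)   # n(n-1) = 2·C(n,2) rows in total
    end
end
optimize!(spread_model)
```

For $n = 50$ this already adds 2450 extra rows, most of which will never be binding.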
Instead of enumerating all of them and adding them a priori to the model, we may use a technique known as lazy constraints. Lazy constraints are useful when the full set of constraints is too large to include explicitly in the initial formulation. Every time the MIP solver reaches a new candidate solution, for example with a heuristic or by solving a problem at a node of the branch-and-bound tree, it gives the user the chance to provide constraints that would make the current solution infeasible.
Lazy constraints in JuMP
There are three important steps to providing a lazy constraint callback in JuMP.
Callback function: a function that will analyze the current solution. This function takes as its argument a reference to the callback management code inside JuMP (here called "cb_data"). Currently, the only thing we may query in a callback is the primal value of the variables, using the function "callback_value(cb_data, x)".
Lazy constraint: after analyzing the current solution, we generate a new constraint using the
"con = @build_constraint(...)"
macro and submit it to the model via the MOI interface
"MOI.submit(model, MOI.LazyConstraint(cb_data), con)."
Lazy constraint callback: we again use the MOI interface to tell the solver which function should be used for lazy constraint generation
"MOI.set(model, MOI.LazyConstraintCallback(), my_callback)."
new_knapsack_model = Model(Gurobi.Optimizer)
Set parameter Username
Academic license - for non-commercial use only - expires 2022-03-13
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Gurobi
@variable(new_knapsack_model, x[1:n] >= 0, Int);
@objective(new_knapsack_model, Max, v' * x)
@constraint(new_knapsack_model, w' * x <= W)
function callback_function(cb_data)
    if round(Int, callback_value(cb_data, x[1])) == 1
        con = @build_constraint(x[3] == 1)
        MOI.submit(new_knapsack_model, MOI.LazyConstraint(cb_data), con)
        println("[🙀 ] The lazy constraint is submitted")
        # The submitted constraint forces x[3] == 1 whenever a candidate with x[1] == 1 is found
    end
end
MOI.set(new_knapsack_model, MOI.LazyConstraintCallback(), callback_function)
optimize!(new_knapsack_model)
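To actually enforce the $\alpha$-spread requirement lazily, the callback should inspect the current candidate and cut it off only when the condition is violated. Here is a sketch under the same data; $\alpha = 3$, the 1e-6 tolerance, and the names `lazy_model` and `alpha_callback` are all illustrative choices, not part of the original example.

```julia
using JuMP, Gurobi

α = 3                                            # hypothetical spread limit
lazy_model = Model(Gurobi.Optimizer)
@variable(lazy_model, x[1:n] >= 0, Int)
@objective(lazy_model, Max, v' * x)
@constraint(lazy_model, w' * x <= W)

function alpha_callback(cb_data)
    x_val = callback_value.(Ref(cb_data), x)     # primal values at the candidate solution
    i = argmax(x_val)
    j = argmin(x_val)
    # It suffices to check the most extreme pair: if it satisfies the bound, all pairs do.
    if x_val[i] - x_val[j] > α + 1e-6
        con = @build_constraint(x[i] - x[j] <= α)
        MOI.submit(lazy_model, MOI.LazyConstraint(cb_data), con)
    end
end

MOI.set(lazy_model, MOI.LazyConstraintCallback(), alpha_callback)
optimize!(lazy_model)
```

Each submitted constraint cuts off the violating candidate, and the solver only materializes the (typically few) constraints that are actually needed instead of all $2 {n \choose 2}$ of them.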