# IAP 2022: Advanced Optimization Session (Session 1)

Shuvomoy Das Gupta

## Responses for the advanced optimization session

• Convex programming (32%)
• Integer programming (28%)
• Nonconvex programming (28%)
• Semidefinite programming (5%)
• Other

Based on the responses, I will cover the following topics today.

# Topics

## $\textsf{Session 1: Convex Optimization and Extension}$

• How to model an optimization problem in a tractable manner
• Tractable optimization models
• Convex optimization through Convex.jl and JuMP
• Modeling examples of common convex programs that show up in OR
• What to do when the problem is nonconvex

## $\textsf{Session 2: Discrete Optimization: Basic and Advanced}$

• Mixed Integer Linear Programs (MILP): basics
• How MILPs are solved: the branch-and-bound algorithm
• How we can exploit structure to speed up MILPs
• Solving Mixed Integer Quadratic Programs

## Modeling an optimization problem

• We have our research problem, which usually comes from a practical problem rooted in reality.

• While modeling the problem, we want to stay as close to reality as possible, but without making the model intractable.

• This is a balancing act: safety-critical constraints have to be modeled exactly, whereas soft constraints can be relaxed.

• It is good practice to be deliberate during the modeling phase so that we end up with a model that is practically tractable.

### Some good resources on how to model optimization problems

• Model Building in Mathematical Programming by H. P. Williams (Link)
  • A very practical book for modeling LPs and MILPs
• Optimization Models by G. Calafiore and L. El Ghaoui (Link)
  • A good book for modeling convex optimization problems, especially Chapters 9, 10, and 11
• Applications of Optimization with Xpress-MP by C. Guéret, C. Prins, and M. Sevaux (Link)
  • Excellent for modeling integer optimization models, especially Chapter 3, with lots of shortcuts and hacks

## Tractable optimization models

At present, the following types of problems are practically tractable. Whenever we are modeling an optimization problem, we should try to ensure that our model is one of the following:

1. Linear programs (LP)
2. Second-order cone programs (SOCP)
3. Semidefinite programs (SDP)
4. Mixed-integer linear programs (MILP)
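To illustrate how little code a model in one of these tractable classes needs, here is a toy LP solved with SciPy's `linprog` (a Python sketch of my own with made-up data; the session itself uses Julia tooling such as JuMP and Convex.jl):

```python
from scipy.optimize import linprog

# Toy LP (made-up data, purely illustrative):
#   minimize    -x - 2y
#   subject to   x + y <= 4,   x <= 3,   x >= 0,  y >= 0
res = linprog(c=[-1, -2],
              A_ub=[[1, 1], [1, 0]], b_ub=[4, 3],
              bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, res.fun)  # optimum at (x, y) = (0, 4) with objective -8
```

The same model transliterates almost line for line into JuMP; the point is that once a problem is written as an LP, an off-the-shelf solver handles it reliably.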

## My modeling strategies

• Solve the smaller optimization problem first

Thomson's Rule for First-Time Telescope Makers: "It is faster to make a four-inch mirror then a six-inch mirror than to make a six-inch mirror."

I follow a modified version of Thomson's rule:

It is faster to solve a smaller simpler optimization model and then the larger complicated optimization model than to solve a larger complicated optimization model.

• Do not optimize prematurely

"The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming."

• Donald Knuth, The Art of Computer Programming

## Milestones in optimization

• 1947: G. Dantzig, working for the US Air Force, presents the simplex method for solving LPs
• 1948: J. von Neumann establishes the theory of duality for LPs
• 1951: H. W. Kuhn and A. W. Tucker reinvent Karush's optimality conditions (now known as the KKT conditions)
• 1951: H. Markowitz presents his portfolio optimization theory => (1990 Nobel prize)
• 1954: L. R. Ford's and D. R. Fulkerson's research on network problems => start of combinatorial optimization
• 1960–1970: Many of the early works on first-order optimization algorithms are done (mostly in the Soviet Union)
• 1983: Y. Nesterov comes up with accelerated gradient descent
• 1984: N. Karmarkar's polynomial-time algorithm for LPs begins a boom period for interior point methods
• 1990s: Semidefinite optimization theory develops
• 2010s: First-order methods become very hot again due to machine learning

## General convex optimization problem

The most authoritative reference is Convex Optimization by Boyd and Vandenberghe (pdf).

Convex optimization problems are defined through the notion of a convex set.

Convex set: a set $C\subseteq\mathbf{R}^{d}$ is convex if for any $x,y\in C$ and any $\theta\in[0,1]$ we have $\theta x+(1-\theta)y\in C$, i.e., $C$ contains the line segment between any two of its points.

Convex function: a function $f$ is convex if its domain is a convex set and for all $x,y$ in the domain and $\theta\in[0,1]$ we have $f(\theta x+(1-\theta)y)\leq\theta f(x)+(1-\theta)f(y)$.

Equivalently, a function is convex if and only if the region above its graph (its epigraph) is a convex set.

## Standard form of a convex optimization problem

A general convex optimization problem has the following form: $$\begin{array}{ll} \underset{x\in\mathbf{R}^{d}}{\mbox{minimize}} & f_{0}(x)\\ \mbox{subject to} & a_{i}^{\top}x=b_{i},\quad i=1,\ldots,p,\\ & f_{i}(x)\leq0,\quad i=1,\ldots,m. \end{array}$$ where the equality constraints are linear and the functions $f_{0},f_{1},\ldots,f_{m}$ are convex.
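As a concrete instance of this template (an illustrative example of my own, not one from the session), nonnegative least squares fits the standard form: $$\begin{array}{ll} \underset{x\in\mathbf{R}^{d}}{\mbox{minimize}} & \|Ax-b\|_{2}^{2}\\ \mbox{subject to} & -x_{i}\leq0,\quad i=1,\ldots,d. \end{array}$$ Here $f_{0}(x)=\|Ax-b\|_{2}^{2}$ is a convex quadratic, each $f_{i}(x)=-x_{i}$ is affine (hence convex), and there are no equality constraints, so $p=0$ and $m=d$.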

## Convex optimization

• a subclass of optimization problems that includes LP as a special case
• convex problems can look very difficult (nonlinear, even nondifferentiable), but like LP can be solved very efficiently
• convex problems come up more often than was once thought
• many applications recently discovered in control, combinatorial optimization, signal processing, communications, circuit design, machine learning, statistics, finance, . . .

## General approaches to using convex optimization

• Pretend/assume/hope $f_i$ are convex (around minimum) and proceed $\Rightarrow \textsf{Machine Learning in Practice}$
  • easy on user (problem specifier)
  • but lose many benefits of convex optimization
  • risky to use in safety-critical application domains
  • an important example is Adam: one of the most heavily used solvers for deep learning (also see Link)
• Verify problem is convex (around minimum) before attempting solution $\Rightarrow \textsf{Machine Learning Theory}$
  • but verification for general problem description is hard
• Model the problem in a way that results in a convex formulation $\Rightarrow \textsf{Operations Research}$
  • user needs to follow a restricted set of rules and methods
  • convexity verification is automatic

Each has its advantages, but we focus on the third approach.
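Why can convexity verification be automatic? Tools like Convex.jl track the curvature of every subexpression and combine curvatures with composition rules. Here is a toy sketch of that idea (entirely my own, not a real DCP implementation; all names are hypothetical):

```python
# Toy curvature calculus: each expression carries a curvature tag, and
# combination rules propagate the tags (a miniature of what DCP tools do).
CONVEX, CONCAVE, AFFINE, UNKNOWN = "convex", "concave", "affine", "unknown"

def add(c1, c2):
    """Curvature of f + g given the curvatures of f and g."""
    if AFFINE in (c1, c2):                 # affine + anything keeps the other's curvature
        return c1 if c2 == AFFINE else c2
    return c1 if c1 == c2 else UNKNOWN     # convex + concave is not certified

def negate(c):
    """Curvature of -f: convex and concave swap, affine stays affine."""
    return {CONVEX: CONCAVE, CONCAVE: CONVEX}.get(c, c)

# ||x|| is convex and a^T x is affine, so ||x|| + a^T x is certified convex:
print(add(CONVEX, AFFINE))           # convex
# a convex plus a concave term cannot be certified by the rules alone:
print(add(CONVEX, negate(CONVEX)))   # unknown
```

Note the one-sided nature of the rules: `unknown` means "not certified by the calculus", not "nonconvex"; a real tool simply refuses such models.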

## How can you tell if a problem is convex?

• Need to check convexity of a function

Approaches:

• use basic definition, first or second order conditions, e.g., $\nabla^2 f(x) \succeq 0$
• via convex calculus: construct $f$ using
  • a library of basic examples or atoms that are convex
  • calculus rules or transformations that preserve convexity
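To make the second-order condition concrete, here is a small numerical sketch (my own illustration, with hypothetical helper names): approximate $\nabla^2 f(x)$ by central differences and check that its eigenvalues are nonnegative up to a tolerance. This only probes convexity near one point, not globally.

```python
import numpy as np

def numerical_hessian(f, x, h=1e-4):
    """Central-difference approximation of the Hessian of f at x."""
    d = len(x)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            ei = np.zeros(d)
            ej = np.zeros(d)
            ei[i] = h
            ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

def looks_convex_at(f, x, tol=1e-4):
    """Numerically test the second-order condition: all Hessian eigenvalues >= -tol."""
    return bool(np.linalg.eigvalsh(numerical_hessian(f, x)).min() >= -tol)

# log-sum-exp is convex everywhere; z -> -||z||^2 is concave
logsumexp = lambda z: np.log(np.sum(np.exp(z)))
print(looks_convex_at(logsumexp, np.array([0.3, -0.7, 1.1])))   # True
print(looks_convex_at(lambda z: -z @ z, np.array([0.5, 0.2])))  # False
```

A passing check at sampled points is evidence, not proof; the convex-calculus approach below certifies convexity exactly, which is why modeling tools rely on it.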

## Basic convex functions (convex atoms)

• $x^{p}$ on $\mathbf{R}_{++}$ for $p\geq1$ or $p\leq0$; $-x^{p}$ on $\mathbf{R}_{++}$ for $0\leq p\leq1$