Conic Programming
General Conic Program
A conic program optimizes a linear objective over the intersection of an affine subspace and a convex cone K:
minimize c^T x
subject to Ax = b
x in K
The dual conic program is:
maximize b^T y
subject to A^T y + s = c
s in K*
where K* = { s : s^T x >= 0 for all x in K } is the dual cone. When K is self-dual (K* = K), the primal and dual have symmetric structure.
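As a sanity check, the defining inequality of the dual cone can be tested numerically. The sketch below (pure Python, illustrative names) samples the self-dual cone K = R_+^5 and confirms s^T x >= 0 for every sampled pair:

```python
import random

def dot(s, x):
    return sum(si * xi for si, xi in zip(s, x))

def in_nonneg_orthant(v, tol=1e-12):
    return all(vi >= -tol for vi in v)

random.seed(0)
for _ in range(1000):
    x = [random.random() for _ in range(5)]  # x in K = R_+^5
    s = [random.random() for _ in range(5)]  # s in K* = R_+^5 (self-dual)
    assert in_nonneg_orthant(x) and in_nonneg_orthant(s)
    assert dot(s, x) >= 0  # the defining inequality of K*
```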
LP, SOCP, and SDP form a hierarchy of increasing generality and computational cost:
LP ⊂ SOCP ⊂ SDP ⊂ Copositive Programming
Second-Order Cone Programming (SOCP)
A second-order cone (SOC, Lorentz cone, ice-cream cone):
Q^n = { (x, t) in R^{n+1} : ||x||_2 <= t }
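A membership test for Q^n is a one-liner; the helper below is an illustrative sketch, not from any library:

```python
import math

def in_second_order_cone(x, t, tol=1e-9):
    """Membership test for Q^n: ||x||_2 <= t (with a small tolerance)."""
    return math.sqrt(sum(xi * xi for xi in x)) <= t + tol
```

For example, (3, 4, 5) lies on the boundary of Q^2 since ||(3, 4)||_2 = 5; the tolerance guards against spurious rejections of such boundary points.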
The standard SOCP:
minimize f^T x
subject to ||A_i x + b_i||_2 <= c_i^T x + d_i, i = 1,...,m
Each constraint requires that (A_i x + b_i, c_i^T x + d_i) lies in the second-order cone.
What Reduces to SOCP
| Problem | How |
|---------|-----|
| LP | SOC constraints with A_i = 0 |
| Quadratically constrained QP (convex) | Direct translation of ||.||^2 <= t via SOC |
| Robust LP with ellipsoidal uncertainty | ||u|| <= 1 uncertainty becomes SOC |
| Minimax / Chebyshev approximation | Minimize max ||a_i^T x - b_i|| |
| Norm minimization | minimize ||Ax - b|| |
| Portfolio optimization (mean-variance) | Risk constraint as SOC |
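The quadratic-constraint row of the table can be verified numerically: the convex constraint ||u||^2 <= t is equivalent to the SOC constraint ||(2u, t - 1)||_2 <= t + 1 (a standard rotated-cone identity). A minimal sketch with illustrative names:

```python
import math
import random

def quad_constraint(u, t):
    # ||u||^2 <= t
    return sum(ui * ui for ui in u) <= t

def soc_form(u, t):
    # ||(2u, t - 1)||_2 <= t + 1, an SOC constraint in (u, t)
    lhs = math.sqrt(4 * sum(ui * ui for ui in u) + (t - 1) ** 2)
    return lhs <= t + 1

# the two conditions agree for t >= 0
random.seed(1)
for _ in range(1000):
    u = [random.uniform(-2.0, 2.0) for _ in range(3)]
    t = random.uniform(0.0, 10.0)
    assert quad_constraint(u, t) == soc_form(u, t)
```

The identity follows by squaring: 4||u||^2 + (t - 1)^2 <= (t + 1)^2 simplifies to 4||u||^2 <= 4t.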
SOCP Duality
The second-order cone is self-dual: Q^n* = Q^n. This gives clean duality with zero duality gap under mild feasibility conditions (Slater-type).
Semidefinite Programming (SDP)
The positive semidefinite cone:
S_+^n = { X in R^{n x n} : X = X^T, X >= 0 (psd) }
Standard form SDP:
minimize <C, X> (trace inner product: tr(C^T X))
subject to <A_i, X> = b_i, i = 1,...,m
X >= 0 (X is positive semidefinite)
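A sketch of the two ingredients of the standard form, the trace inner product and PSD membership (here via the 2x2 principal-minor test); names are illustrative:

```python
def inner(C, X):
    """Trace inner product <C, X> = tr(C^T X) = sum_ij C_ij X_ij."""
    n = len(C)
    return sum(C[i][j] * X[i][j] for i in range(n) for j in range(n))

def is_psd_2x2(X, tol=1e-12):
    # A symmetric 2x2 matrix is PSD iff both diagonal entries and the
    # determinant are nonnegative (all principal minors >= 0).
    a, b, d = X[0][0], X[0][1], X[1][1]
    return a >= -tol and d >= -tol and a * d - b * b >= -tol
```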
The dual:
maximize b^T y
subject to sum_i y_i A_i + S = C
S >= 0
The PSD cone is self-dual, so S_+^n* = S_+^n.
What Reduces to SDP
| Problem | Formulation |
|---------|-------------|
| LP | Diagonal X |
| SOCP | Each SOC constraint as an arrow-matrix LMI |
| Eigenvalue optimization | lambda_max(A(x)) via SDP |
| Lovász theta (graph theory) | SDP relaxation |
| Max-cut relaxation | Goemans-Williamson SDP |
| Sum-of-squares (SOS) polynomials | SDP feasibility |
| Quantum information | Density matrix constraints |
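The SOCP-to-SDP reduction can be checked in the scalar case: |x| <= t holds iff the arrow matrix [[t, x], [x, t]] is PSD, since its eigenvalues are t - x and t + x. A minimal numerical check (illustrative names):

```python
def soc_member(x, t):
    # scalar second-order cone: |x| <= t
    return abs(x) <= t

def arrow_psd(x, t):
    # [[t, x], [x, t]] is PSD iff both eigenvalues t - x and t + x are >= 0
    return t - x >= 0 and t + x >= 0

# the two conditions agree everywhere
for x in [-2.0, -1.0, 0.0, 0.5, 1.0, 3.0]:
    for t in [0.0, 0.5, 1.0, 2.0]:
        assert soc_member(x, t) == arrow_psd(x, t)
```

The general embedding replaces the scalar x with a vector and t with t*I, using a Schur complement argument.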
SDP Relaxations
Many NP-hard combinatorial problems admit SDP relaxations with provable approximation guarantees:
- Max-Cut: Goemans-Williamson achieves 0.878 approximation ratio via SDP + randomized rounding.
- Graph coloring, partition problems: SDP bounds often tighter than LP bounds.
- Polynomial optimization: Lasserre hierarchy of SDP relaxations converges to global optimum.
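The rounding half of Goemans-Williamson is easy to sketch once unit vectors v_i from the SDP solution are in hand (the SDP solve itself is omitted; names are illustrative):

```python
import random

def round_cut(vectors, rng):
    """Random-hyperplane rounding: label vertex i by the sign of <g, v_i>
    for a random Gaussian normal g."""
    d = len(vectors[0])
    g = [rng.gauss(0, 1) for _ in range(d)]
    return [1 if sum(gi * vi for gi, vi in zip(g, v)) >= 0 else -1
            for v in vectors]

def cut_value(signs, edges):
    """Number of edges crossing the cut defined by the +/-1 labels."""
    return sum(1 for i, j in edges if signs[i] != signs[j])
```

Antipodal vectors are separated by almost every hyperplane; the 0.878 ratio comes from comparing the separation probability angle/pi against the SDP edge contribution (1 - cos(angle))/2.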
Copositive Programming
A matrix M is copositive if x^T M x >= 0 for all componentwise nonnegative x. The copositive cone is:
COP = { M = M^T : x^T M x >= 0 for all x >= 0 }
Its dual is the completely positive cone: matrices expressible as sum of x x^T with x >= 0.
Many NP-hard problems (stable set, standard QP over the simplex) can be formulated as copositive programs. The cone itself is NP-hard to optimize over, but inner/outer approximations using SDP hierarchies provide practical bounds.
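Because exact copositivity checking is co-NP-hard, even a grid search is only a one-sided test: it can certify "not copositive" via a violating sample, but gives only heuristic evidence otherwise. A minimal sketch (illustrative names):

```python
import itertools

def quad_form(M, x):
    n = len(M)
    return sum(M[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def copositive_on_grid(M, steps=20):
    """Test x^T M x >= 0 on a grid over the nonnegative orthant.
    By scaling invariance, grid points in [0, 1]^n suffice."""
    n = len(M)
    for c in itertools.product(range(steps + 1), repeat=n):
        if sum(c) == 0:
            continue
        x = [ci / steps for ci in c]
        if quad_form(M, x) < -1e-12:
            return False  # certified: M is not copositive
    return True  # heuristic evidence only
```

For example, [[1, -1], [-1, 1]] is copositive (x^T M x = (x1 - x2)^2) even though it has a negative entry, while [[0, -1], [-1, 0]] is not.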
Conic Duality
Weak Duality
For conic programs, weak duality always holds: for any primal feasible x and dual feasible (y, s), c^T x - b^T y = s^T x >= 0, since x in K and s in K*. The primal (minimization) objective therefore always upper-bounds the dual objective.
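A tiny numerical check on an LP viewed as a conic program over K = R_+^2 (all data made up for illustration):

```python
A = [[1.0, 1.0]]
b = [1.0]
c = [1.0, 2.0]

x = [0.5, 0.5]   # primal feasible: Ax = b, x in R_+^2
y = [0.5]
s = [0.5, 1.5]   # dual feasible: A^T y + s = c, s in R_+^2

# verify feasibility of both points
assert abs(x[0] + x[1] - b[0]) < 1e-12
assert all(abs(A[0][j] * y[0] + s[j] - c[j]) < 1e-12 for j in range(2))

primal_obj = c[0] * x[0] + c[1] * x[1]  # 1.5
dual_obj = b[0] * y[0]                  # 0.5
assert primal_obj >= dual_obj           # weak duality; gap = s^T x = 1.0
```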
Strong Duality
Strong duality for conic programs requires care. Unlike LP:
- Strong duality can fail even for feasible problems (no analogue of LP's always-strong-duality).
- Slater's condition for cones: there exists a strictly feasible point, i.e. x in the interior of K with Ax = b. This guarantees a zero duality gap and that the dual optimum is attained.
- For SDP, the interior of S_+^n is the set of positive definite matrices. So Slater requires a strictly feasible (positive definite) solution.
- Attainment of the dual optimum is not guaranteed even with zero gap.
Duality Gap Pathologies
SDPs can exhibit:
- Positive duality gap with both primal and dual feasible.
- Zero gap but dual optimum not attained.
- Infeasible primal with unbounded dual, or vice versa, without the expected complementary relationship that LP enjoys.
These pathologies arise when Slater's condition fails (facial reduction theory addresses this).
Interior Point Methods for Conic Programs
Self-Concordant Barriers
For each cone K, a self-concordant barrier function F(x) satisfies:
|D^3 F(x)[h,h,h]| <= 2 * (D^2 F(x)[h,h])^{3/2}
The barrier parameter nu determines the iteration complexity: O(sqrt(nu) * ln(1/epsilon)) iterations.
| Cone | Barrier | Parameter nu |
|------|---------|--------------|
| R_+^n (LP) | -sum ln(x_i) | n |
| Q^n (SOC) | -ln(t^2 - ||x||^2) | 2 |
| S_+^n (SDP) | -ln det(X) | n |
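For logarithmically homogeneous barriers, the identity grad F(x)^T x = -nu recovers the barrier parameter. The sketch below checks it for the LP barrier (pure Python, illustrative):

```python
import random

random.seed(2)
n = 7
x = [random.uniform(0.1, 5.0) for _ in range(n)]   # any x > 0
grad = [-1.0 / xi for xi in x]                     # gradient of -sum ln x_i
nu = -sum(g * xi for g, xi in zip(grad, x))
assert abs(nu - n) < 1e-9                          # barrier parameter = n
```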
Primal-Dual IPM
Solve the KKT system with a barrier term, tracing the central path as the barrier parameter decreases. Each iteration involves:
- For LP: solving an m x m linear system.
- For SOCP: solving block-structured systems (one block per cone).
- For SDP: solving the Schur complement equation, O(m^2 n^2 + m n^3) per iteration.
For dense n x n matrix variables, each interior point iteration must assemble and factor a dense Schur complement system, so the per-iteration cost grows at least like n^3; in practice this limits dense SDPs to roughly n ~ 1000.
First-Order Methods for Large SDPs
When interior point methods are too expensive:
- ADMM-based: SCS (Splitting Conic Solver) scales to large problems with moderate accuracy.
- Augmented Lagrangian: SDPNAL+ for large-scale SDPs.
- Burer-Monteiro: low-rank factorization X = R R^T converts SDP to nonlinear program. Effective when optimal X has low rank.
Applications
Robust Optimization
Uncertain LP: min c^T x s.t. a_i^T x <= b_i for all a_i in U_i. For ellipsoidal uncertainty U_i = { a_i_bar + P_i u : ||u|| <= 1 }, the robust counterpart is SOCP.
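The reduction works because the worst case of a^T x over the ellipsoid is available in closed form: max over ||u|| <= 1 of (a_bar + P u)^T x = a_bar^T x + ||P^T x||_2, which is exactly an SOC constraint when bounded by b_i. The sketch below compares the closed form against sampled u (all data and names are illustrative):

```python
import math
import random

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def worst_case_closed_form(a_bar, P, x):
    # a_bar^T x + ||P^T x||_2
    cols = len(P[0])
    Ptx = [sum(P[i][k] * x[i] for i in range(len(x))) for k in range(cols)]
    return dot(a_bar, x) + math.sqrt(dot(Ptx, Ptx))

def worst_case_sampled(a_bar, P, x, trials, rng):
    # max of (a_bar + P u)^T x over sampled u on the unit sphere
    best = -float("inf")
    for _ in range(trials):
        g = [rng.gauss(0, 1) for _ in range(len(P[0]))]
        norm = math.sqrt(dot(g, g))
        u = [gi / norm for gi in g]
        Pu = [dot(row, u) for row in P]
        best = max(best, dot(a_bar, x) + dot(Pu, x))
    return best
```

The sampled value never exceeds the closed form (every sample is a feasible u) and approaches it as the number of trials grows.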
Control Theory
- H-infinity control: SDP via Riccati inequalities.
- Lyapunov stability: finding P > 0 with A^T P + PA < 0 is an SDP feasibility problem.
- LMI (Linear Matrix Inequality) regions for pole placement.
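The Lyapunov feasibility test is easy to check by hand for a 2x2 example; the sketch below verifies A^T P + P A < 0 for an illustrative stable A with the candidate P = I:

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def is_neg_def_2x2(S):
    # symmetric 2x2 S is negative definite iff S[0][0] < 0 and det(S) > 0
    return S[0][0] < 0 and S[0][0] * S[1][1] - S[0][1] * S[1][0] > 0

A = [[-1.0, 0.5], [0.0, -2.0]]   # stable: eigenvalues -1, -2
P = [[1.0, 0.0], [0.0, 1.0]]     # candidate Lyapunov certificate
S = add(matmul(transpose(A), P), matmul(P, A))
assert is_neg_def_2x2(S)         # certifies asymptotic stability
```

Finding such a P in general (rather than guessing it) is the SDP feasibility problem referenced above.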
Matrix Completion
Given partial observations of a low-rank matrix, minimize the nuclear norm (the standard convex surrogate for rank) subject to matching the observations. This is an SDP: ||X||_* <= t admits the semidefinite representation that there exist W_1, W_2 with [[W_1, X], [X^T, W_2]] >= 0 and (tr W_1 + tr W_2)/2 <= t.
Signal Processing
- Beamforming: SDP relaxation of quadratic optimization with unit-modulus constraints.
- Sensor network localization: SDP relaxation of distance constraints.
Connections
- LP as a special case (→ 01-linear-programming.md).
- Duality theory generalizations (→ 04-duality-and-kkt.md).
- Convex optimization algorithms applicable to conic problems (→ 03-convex-optimization.md).
- Solver support for conic programs (→ 09-optimization-solvers.md).