Introduction
Mathematical optimization problem
quadratic programming
Standard and inequality form semidefinite programs
Semidefinite programs
Convex function
- The sum of two convex functions is also convex.
- The difference of two convex functions may not be convex.
The second fact holds because one inequality cannot simply be subtracted from another, so the convexity inequality does not carry over. For example, take $f(x) = x^4$ and $g(x) = 2x^2$, both convex; their difference $f(x) - g(x) = x^4 - 2x^2$ has two minimum points (at $x = -1$ and $x = 1$), so it is not convex.
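This counterexample can be checked numerically; the functions below are my own choice for illustration, testing the midpoint convexity inequality:

```python
# f(x) = x**4 and g(x) = 2*x**2 are both convex, but their difference
# h(x) = x**4 - 2*x**2 has two minima (at x = -1 and x = +1), so it is not convex.
def h(x):
    return x**4 - 2 * x**2

a, b = -1.0, 1.0
midpoint_value = h((a + b) / 2)      # h(0) = 0
chord_value = (h(a) + h(b)) / 2      # (-1 + -1) / 2 = -1
# Convexity would require h((a+b)/2) <= (h(a)+h(b))/2, which fails here:
assert midpoint_value > chord_value
```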
Duality
Lagrange function
We consider an optimization problem in the standard form:
$$\begin{aligned} \text{minimize} \quad & f_0(x) \\ \text{subject to} \quad & f_i(x) \le 0, \quad i = 1, \dots, m \\ & h_i(x) = 0, \quad i = 1, \dots, p \end{aligned}$$
Lagrange function
$$L(x, \lambda, \nu) = f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x) + \sum_{i=1}^{p} \nu_i h_i(x)$$
Lagrange dual function
$$g(\lambda, \nu) = \inf_{x} L(x, \lambda, \nu)$$
Lagrange dual problem
$$\begin{aligned} \text{maximize} \quad & g(\lambda, \nu) \\ \text{subject to} \quad & \lambda \succeq 0 \end{aligned}$$
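As a sanity check, the dual function can be evaluated by brute force and compared with its closed form. The toy problem below (minimize $x^2$ subject to $x \ge 1$, i.e. $f_1(x) = 1 - x \le 0$) and the grid are my own assumptions:

```python
import numpy as np

# Toy problem: minimize x**2  s.t.  1 - x <= 0.
# Lagrangian: L(x, lam) = x**2 + lam * (1 - x)
# Closed-form dual function: g(lam) = lam - lam**2 / 4  (minimizer x = lam / 2).
xs = np.linspace(-5, 5, 100001)

def g_numeric(lam):
    # brute-force inf over x of the Lagrangian, on a grid
    return np.min(xs**2 + lam * (1 - xs))

def g_closed(lam):
    return lam - lam**2 / 4

for lam in [0.0, 1.0, 2.0, 3.0]:
    assert abs(g_numeric(lam) - g_closed(lam)) < 1e-6
```

The dual maximum is at $\lambda = 2$ with $g(2) = 1$, which equals the primal optimum $p^* = 1$ at $x = 1$.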
Convex conjugate
Let $X$ be a real topological vector space and let $X^*$ be the dual space to $X$. Denote by
$$\langle \cdot, \cdot \rangle : X^* \times X \to \mathbb{R}$$
the canonical dual pairing, which is defined by $\langle x^*, x \rangle = x^*(x)$.
For a function
$$f : X \to \mathbb{R} \cup \{-\infty, +\infty\}$$
taking values on the extended real number line, its convex conjugate is the function
$$f^* : X^* \to \mathbb{R} \cup \{-\infty, +\infty\}$$
whose value at $x^* \in X^*$
is defined to be the supremum:
$$f^*(x^*) = \sup \{ \langle x^*, x \rangle - f(x) : x \in X \}$$
or, equivalently, in terms of the infimum:
$$f^*(x^*) = -\inf \{ f(x) - \langle x^*, x \rangle : x \in X \}$$
This definition can be interpreted as an encoding of the convex hull of the function's epigraph in terms of its supporting hyperplanes.
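The definition is easy to verify numerically on a simple case. For $f(x) = x^2$ the supremum of $x^* x - x^2$ is attained at $x = x^*/2$, giving $f^*(x^*) = (x^*)^2 / 4$; the grid below is my own assumption:

```python
import numpy as np

# Numerical check of the convex conjugate of f(x) = x**2:
# f*(y) = sup_x (y*x - x**2) = y**2 / 4, attained at x = y / 2.
xs = np.linspace(-10, 10, 200001)

def conjugate(y):
    return np.max(y * xs - xs**2)

for y in [-3.0, 0.0, 1.0, 4.0]:
    assert abs(conjugate(y) - y**2 / 4) < 1e-6
```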
Weak duality
The optimal value of the Lagrange dual problem, which we denote $d^*$, is, by definition, the best lower bound on $p^*$ that can be obtained from the Lagrange dual function. In particular, we have the simple but important inequality
$$d^* \le p^*,$$
which holds even if the original problem is not convex. This property is called weak duality.
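The inequality $d^* \le p^*$ can be demonstrated by brute force on a small instance; the problem and grids below are my own illustration:

```python
import numpy as np

# Toy instance (my own choice): minimize x**2 + 1  s.t.  (x - 2)*(x - 4) <= 0.
# The primal optimum is p* = 5 at x = 2.
xs = np.arange(-10.0, 10.0, 2**-7)   # binary step, so x = 2.0 lies exactly on the grid
feasible = (xs - 2) * (xs - 4) <= 0
p_star = np.min(xs[feasible] ** 2 + 1)

# Dual function g(lam) = min_x [ x**2 + 1 + lam*(x - 2)*(x - 4) ], on a grid of lam.
lams = np.linspace(0.0, 10.0, 201)
g = np.array([np.min(xs**2 + 1 + lam * (xs - 2) * (xs - 4)) for lam in lams])
d_star = np.max(g)

# Weak duality: the best dual bound never exceeds the primal optimum.
assert d_star <= p_star + 1e-6
```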
(As the Chinese idiom goes, "even a starved camel is bigger than a horse": no matter how good the dual bound $d^*$ gets, it never exceeds $p^*$.)
Strong duality
Slater’s condition
Strong duality ($d^* = p^*$) holds if the problem is convex and there exists a strictly feasible point, i.e. some $x$ with $f_i(x) < 0$ for $i = 1, \dots, m$ and $h_i(x) = 0$ for $i = 1, \dots, p$.
Saddle-point interpretation
The pair $(x^*, \lambda^*)$ is a saddle point of the Lagrange function $L(x, \lambda)$ if and only if $x^*$ is primal optimal, $\lambda^*$ is dual optimal, and strong duality holds; in other words,
$$\sup_{\lambda \succeq 0} \inf_{x} L(x, \lambda) = \inf_{x} \sup_{\lambda \succeq 0} L(x, \lambda).$$
Complementary slackness
An important conclusion is that
$$\lambda_i^* f_i(x^*) = 0, \quad i = 1, \dots, m.$$
We can express the complementary slackness condition as
$$\lambda_i^* > 0 \implies f_i(x^*) = 0 \qquad \text{and} \qquad f_i(x^*) < 0 \implies \lambda_i^* = 0.$$
KKT optimality conditions
$$\begin{aligned} f_i(x^*) &\le 0, \quad i = 1, \dots, m \\ h_i(x^*) &= 0, \quad i = 1, \dots, p \\ \lambda_i^* &\ge 0, \quad i = 1, \dots, m \\ \lambda_i^* f_i(x^*) &= 0, \quad i = 1, \dots, m \\ \nabla f_0(x^*) + \sum_{i=1}^{m} \lambda_i^* \nabla f_i(x^*) + \sum_{i=1}^{p} \nu_i^* \nabla h_i(x^*) &= 0 \end{aligned}$$
which are called the Karush-Kuhn-Tucker (KKT) conditions.
For any optimization problem with differentiable objective and constraint functions for which strong duality obtains, any pair of primal and dual optimal points must satisfy the KKT conditions (necessary condition).
Necessary condition: if $x^*$ and $(\lambda^*, \nu^*)$ are primal and dual optimal points for a problem with differentiable objective and constraint functions in which strong duality holds, then they must satisfy the KKT conditions.
When the primal problem is convex, the KKT conditions are also sufficient for the points to be primal and dual optimal.
Sufficient condition: if $\tilde{x}$ and $(\tilde{\lambda}, \tilde{\nu})$ satisfy the KKT conditions and the primal problem is convex, then they are primal and dual optimal points.
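For an equality-constrained convex QP the KKT conditions reduce to one linear system, so the optimal primal and dual points can be found in a single solve. A minimal sketch (the particular $P$, $q$, $A$, $b$ are my own toy data):

```python
import numpy as np

# For the equality-constrained convex QP
#   minimize (1/2) x^T P x + q^T x   s.t.  A x = b,
# the KKT conditions are the linear system
#   [ P  A^T ] [ x  ]   [ -q ]
#   [ A   0  ] [ nu ] = [  b ]
P = np.array([[4.0, 1.0], [1.0, 2.0]])   # positive definite
q = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

n, m = P.shape[0], A.shape[0]
kkt = np.block([[P, A.T], [A, np.zeros((m, m))]])
sol = np.linalg.solve(kkt, np.concatenate([-q, b]))
x_star, nu_star = sol[:n], sol[n:]

# Verify stationarity and primal feasibility:
assert np.allclose(P @ x_star + q + A.T @ nu_star, 0)
assert np.allclose(A @ x_star, b)
```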
Perturbation and sensitivity analysis
Perturbed version
We define $p^*(u, v)$ as the optimal value of the perturbed problem
$$\begin{aligned} \text{minimize} \quad & f_0(x) \\ \text{subject to} \quad & f_i(x) \le u_i, \quad i = 1, \dots, m \\ & h_i(x) = v_i, \quad i = 1, \dots, p \end{aligned}$$
When the original problem is convex, the function $p^*(u, v)$ is a convex function of $u$ and $v$.
Now we assume that strong duality holds, and that the dual optimum is attained. This is the case if the original problem is convex and Slater’s condition is satisfied. Let $(\lambda^*, \nu^*)$ be optimal for the dual of the unperturbed problem. Then for all $u$ and $v$ we have
$$p^*(u, v) \ge p^*(0, 0) - {\lambda^*}^T u - {\nu^*}^T v.$$
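The sensitivity bound can be checked on a one-constraint toy problem (an assumption for illustration): minimize $x^2$ subject to $1 - x \le u$, with $p^*(0) = 1$ at $x = 1$ and optimal dual variable $\lambda^* = 2$:

```python
import numpy as np

# Perturbed toy problem: minimize x**2  s.t.  1 - x <= u.
# Closed form: the constraint is active for u < 1, inactive afterwards,
# so p*(u) = (1 - u)**2 for u < 1 and 0 otherwise; lam* = 2 at u = 0.
lam_star = 2.0

def p_star(u):
    return (1 - u)**2 if u < 1 else 0.0

# Global sensitivity inequality: p*(u) >= p*(0) - lam_star * u.
for u in np.linspace(-2, 2, 41):
    assert p_star(u) >= 1.0 - lam_star * u - 1e-12
```

Here the slack is exactly $u^2 \ge 0$ for $u < 1$, so the lower bound is tight at $u = 0$.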
Algorithm
Steepest descent method
First, we make an equivalent substitution. Taylor expansion (first order, for a small step $v$):
$$f(x + v) \approx f(x) + \nabla f(x)^T v.$$
Then we get the normalized steepest descent direction:
$$\Delta x_{\text{nsd}} = \operatorname*{argmin}_v \{ \nabla f(x)^T v : \|v\| = 1 \}.$$
Dual norm
$$\|z\|_* = \sup \{ z^T x : \|x\| \le 1 \}.$$
We only need the direction; the step length is then chosen separately (e.g. by line search).
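A minimal steepest descent sketch in the Euclidean norm, where the direction is simply the negative gradient; the quadratic test function, exact line search rule, and starting point are my own choices:

```python
import numpy as np

# Gradient (steepest) descent on f(x) = (1/2) x^T Q x with exact line search.
Q = np.array([[3.0, 0.0], [0.0, 1.0]])   # positive definite

x = np.array([4.0, 4.0])
for _ in range(200):
    g = Q @ x                        # gradient of f at x
    if np.linalg.norm(g) < 1e-10:
        break
    t = (g @ g) / (g @ Q @ g)        # exact line search step for a quadratic
    x = x - t * g                    # move along the negative gradient

# The unique minimizer of f is the origin.
assert np.linalg.norm(x) < 1e-8
```

For a general $f$, the exact line search would be replaced by backtracking, but the direction logic is the same.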
Newton's method
The second-order Taylor approximation $\hat{f}$ of the real $f$ at the point $x$ is
$$\hat{f}(x + v) = f(x) + \nabla f(x)^T v + \frac{1}{2} v^T \nabla^2 f(x) v.$$
$\hat{f}$ is a convex quadratic function of $v$.
The necessary and sufficient optimality condition:
$$\nabla f(x) + \nabla^2 f(x) v = 0.$$
The Hessian matrix $\nabla^2 f(x)$ is invertible because it is positive definite.
The vector
$$\Delta x_{\text{nt}} = -\nabla^2 f(x)^{-1} \nabla f(x)$$
is called the Newton step.
We can find the exact minimizer of $\hat{f}$, but $\hat{f}$ is only an approximation of the real $f$.
If the real $f$ is quadratic, then $x + \Delta x_{\text{nt}}$ is the exact minimizer of $f$. If $f$ is nearly quadratic, intuition suggests that $x + \Delta x_{\text{nt}}$ should be a very good estimate of the minimizer of $f$.
Since the real $f$ is twice differentiable, the quadratic model of $f$ will be very accurate when $x + v$ is near $x$. It follows that when $x$ is near the optimum $x^*$, the point $x + \Delta x_{\text{nt}}$ should be a very good estimate of $x^*$.
Also, $\nabla f(x) + \nabla^2 f(x) v$ is the linear approximation of the gradient $\nabla f(x + v)$ at the point $x$; setting it to zero and solving for $v$ yields the Newton step again. Even when the quadratic model is not exact, the direction $\Delta x_{\text{nt}}$ approximately points toward the true optimum.
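A pure Newton iteration can be sketched on a smooth strongly convex function; the test function and starting point below are my own toy example:

```python
import numpy as np

# Newton's method on f(x1, x2) = exp(x1 + x2) + x1**2 + x2**2.
def grad(x):
    e = np.exp(x[0] + x[1])
    return np.array([e + 2 * x[0], e + 2 * x[1]])

def hess(x):
    e = np.exp(x[0] + x[1])
    return np.array([[e + 2.0, e], [e, e + 2.0]])

x = np.array([0.0, 0.0])
for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-12:
        break
    # Newton step: solve hess(x) @ v = -grad(x), then take the full step.
    v = np.linalg.solve(hess(x), -g)
    x = x + v

assert np.linalg.norm(grad(x)) < 1e-10
```

Near the optimum the gradient norm shrinks quadratically from one iteration to the next, which is the behavior the intuition above predicts.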
Appendices
The minimization over $x$:
$$\inf_{x} L(x, \lambda, \nu) = g(\lambda, \nu).$$
The maximization over $(\lambda, \nu)$:
$$\sup_{\lambda \succeq 0,\ \nu} L(x, \lambda, \nu) = \begin{cases} f_0(x) & f_i(x) \le 0,\ i = 1, \dots, m,\ \text{and}\ h_i(x) = 0,\ i = 1, \dots, p \\ +\infty & \text{otherwise.} \end{cases}$$
Matrix split
Dual norm