Weak duality revisited

Let \((P)\) and \((D)\) denote a primal-dual pair of linear programming problems in generic form as defined previously. As discussed before, strong duality and weak duality both hold. We now show the latter directly. That is, if \(\mathbf{x}^*\) is a feasible solution to \((P)\) and \(\mathbf{y}^*\) is a feasible solution to \((D)\), then \(\mathbf{c}^\mathsf{T}\mathbf{x}^* \geq \mathbf{y^*}^\mathsf{T}\mathbf{b}\).

Note that if \(x^*_j\) is constrained to be nonnegative, its corresponding dual constraint is \(\mathbf{y^*}^\mathsf{T} \mathbf{A}_j \leq c_j\). Hence, \((c_j - \mathbf{y^*}^\mathsf{T} \mathbf{A}_j)x^*_j \geq 0\) with equality if and only if \(x^*_j = 0\) or \(\mathbf{y^*}^\mathsf{T} \mathbf{A}_j = c_j\) (or both).

If \(x^*_j\) is constrained to be nonpositive, its corresponding dual constraint is \(\mathbf{y^*}^\mathsf{T} \mathbf{A}_j \geq c_j\). Hence, \((c_j - \mathbf{y^*}^\mathsf{T} \mathbf{A}_j)x^*_j \geq 0\) with equality if and only if \(x^*_j = 0\) or \(\mathbf{y^*}^\mathsf{T} \mathbf{A}_j = c_j\) (or both).

If \(x^*_j\) is free, its corresponding dual constraint is \(\mathbf{y^*}^\mathsf{T} \mathbf{A}_j = c_j\). Hence, \((c_j - \mathbf{y^*}^\mathsf{T} \mathbf{A}_j)x^*_j = 0\).

We can combine these three cases and obtain that \((\mathbf{c}^\mathsf{T} - \mathbf{y^*}^\mathsf{T} \mathbf{A})\mathbf{x^*} = \displaystyle\sum_{j = 1}^n (c_j - \mathbf{y^*}^\mathsf{T} \mathbf{A}_j) x^*_j \geq 0\) with equality if and only if for each \(j = 1,\ldots, n\), \[x^*_j = 0 \text{ or } \mathbf{y^*}^\mathsf{T} \mathbf{A}_j = c_j.\] (Here, the usage of “or” is not exclusive.)

Similarly, a case analysis on the constraints of \((P)\) and the corresponding sign restrictions on the dual variables gives \(\mathbf{y^*}^\mathsf{T}(\mathbf{A}\mathbf{x^*} - \mathbf{b}) = \displaystyle\sum_{i = 1}^m y^*_i({\mathbf{a}^i}^\mathsf{T} \mathbf{x^*} - b_i) \geq 0\) with equality if and only if for each \(i = 1,\ldots, m\), \[y^*_i = 0 \text{ or } {\mathbf{a}^i}^\mathsf{T} \mathbf{x^*} = b_i.\] (Again, the usage of “or” is not exclusive.)

Adding the inequalities \((\mathbf{c}^\mathsf{T} - \mathbf{y^*}^\mathsf{T} \mathbf{A})\mathbf{x^*} \geq 0\) and \(\mathbf{y^*}^\mathsf{T}(\mathbf{A}\mathbf{x^*} - \mathbf{b}) \geq 0\), and noting that the terms \(\mathbf{y^*}^\mathsf{T}\mathbf{A}\mathbf{x^*}\) cancel, we obtain \(\mathbf{c}^\mathsf{T}\mathbf{x}^* - \mathbf{y^*}^\mathsf{T}\mathbf{b} \geq 0\) with equality if and only if the following conditions hold: \begin{eqnarray*} x^*_j = 0 \text{ or } \mathbf{y^*}^\mathsf{T} \mathbf{A}_j = c_j~~ \text{ for } j = 1,\ldots, n \\ y^*_i = 0 \text{ or } {\mathbf{a}^i}^\mathsf{T} \mathbf{x^*} = b_i~~ \text{ for } i = 1,\ldots, m \end{eqnarray*} Hence, we have proved weak duality for the generic form and, in addition, obtained a characterization of when the primal and dual objective values coincide.
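To see the decomposition concretely, the following is a minimal numerical sketch in Python with NumPy. The tiny LP instance and the chosen feasible pair are illustrative assumptions, not part of the development above; the sketch simply evaluates the two nonnegative terms and confirms that they sum to the gap \(\mathbf{c}^\mathsf{T}\mathbf{x}^* - \mathbf{y^*}^\mathsf{T}\mathbf{b}\).

```python
import numpy as np

# Illustrative instance (an assumption for this sketch):
#   min  x1 + x2   s.t.  x1 + 2 x2 >= 3,  x1, x2 >= 0,
# whose dual is  max 3 y  s.t.  y <= 1,  2 y <= 1,  y >= 0.
A = np.array([[1.0, 2.0]])
b = np.array([3.0])
c = np.array([1.0, 1.0])

x_star = np.array([3.0, 0.0])  # feasible for the primal
y_star = np.array([0.5])       # feasible for the dual

term1 = (c - A.T @ y_star) @ x_star  # (c^T - y*^T A) x*  >= 0
term2 = y_star @ (A @ x_star - b)    # y*^T (A x* - b)    >= 0
gap = c @ x_star - y_star @ b        # c^T x* - y*^T b

print(term1, term2, gap)  # 1.5 0.0 1.5 -- the two terms sum to the gap
```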

By strong duality, \(\mathbf{x}^*\) feasible for \((P)\) and \(\mathbf{y}^*\) feasible for \((D)\) are optimal solutions if and only if \(\mathbf{c}^\mathsf{T} \mathbf{x}^* = \mathbf{y^*}^\mathsf{T}\mathbf{b}\). Hence, we obtain the following result:

Theorem 3.3. (Complementary Slackness Theorem) Let \((P)\) and \((D)\) denote a primal-dual pair of linear programming problems in generic form. Let \(\mathbf{x}^*\) be a feasible solution to \((P)\). Let \(\mathbf{y}^*\) be a feasible solution to \((D)\). Then \(\mathbf{x}^*\) and \(\mathbf{y}^*\) are optimal solutions for the corresponding problems if and only if the following conditions, called the complementary slackness conditions, hold: \begin{eqnarray*} x^*_j = 0 \text{ or } \mathbf{y^*}^\mathsf{T} \mathbf{A}_j = c_j~~ \text{ for } j = 1,\ldots, n \\ y^*_i = 0 \text{ or } {\mathbf{a}^i}^\mathsf{T} \mathbf{x^*} = b_i~~ \text{ for } i = 1,\ldots, m \end{eqnarray*}
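Since the theorem reduces optimality to finitely many conditions, the check can be carried out mechanically. Below is a minimal sketch of such a check in Python with NumPy; the function name and tolerance are our own choices, and the sketch assumes \(\mathbf{x}\) and \(\mathbf{y}\) have already been verified to be feasible.

```python
import numpy as np

def check_complementary_slackness(A, b, c, x, y, tol=1e-9):
    """Check the complementary slackness conditions for a feasible
    primal-dual pair (x, y): for each j, x_j = 0 or y^T A_j = c_j,
    and for each i, y_i = 0 or a^i x = b_i, up to tolerance tol."""
    dual_slack = c - A.T @ y    # c_j - y^T A_j for each column j
    primal_slack = A @ x - b    # a^i x - b_i for each row i
    cols_ok = np.all((np.abs(x) <= tol) | (np.abs(dual_slack) <= tol))
    rows_ok = np.all((np.abs(y) <= tol) | (np.abs(primal_slack) <= tol))
    return bool(cols_ok and rows_ok)
```

By the theorem, a result of True for a feasible pair certifies that both solutions are optimal.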

The complementary slackness conditions give a characterization of optimality that can be useful in solving problems such as the one in the following example.

Example. Let \((P)\) denote the following linear programming problem: \[\begin{array}{rrcrcrlll} \mbox{min} & 2 x_1 & +& 4x_2 & + & 2x_3 \\ \mbox{s.t.} & x_1 & + & x_2 & + & 3x_3 & \leq & 1 \\ & -x_1 & + & 2x_2 & + & x_3 & \geq & 1 \\ & & & 3x_2 & - & 6x_3 & = & 0 \\ & x_1 & , & & & x_3 & \geq & 0 \\ & & & x_2 & & & & \mbox{free.} \end{array}\] Is \(\mathbf{x}^* =\left[\begin{array}{c} 0\\\frac{2}{5}\\ \frac{1}{5} \end{array}\right]\) an optimal solution to \((P)\)?

One could answer this question by solving \((P)\) and then checking whether the objective function value of \(\mathbf{x}^*\), assuming that its feasibility has already been verified, equals the optimal value. However, there is a way to use the given information to save some work.

The dual problem of \((P)\) is: \[\begin{array}{rrcrcrlll} \max & y_1 & +& y_2 & & \\ (D)~~~~~~~\mbox{s.t.} & y_1 & - & y_2 & & & \leq & 2 \\ & y_1 & + & 2y_2 & + & 3y_3 & = & 4 \\ & 3y_1 & + & y_2 & - & 6y_3 & \leq & 2 \\ & y_1 & & & & & \leq & 0 \\ & & & y_2 & & & \geq & 0 \\ & & & & & y_3 & & \mbox{free.} \end{array}\] One can check that \(\mathbf{x}^*\) is a feasible solution to \((P)\). If \(\mathbf{x}^*\) is optimal, then there must exist a feasible solution \(\mathbf{y}^*\) to \((D)\) that satisfies, together with \(\mathbf{x}^*\), the complementary slackness conditions: \begin{eqnarray*} y_1^* = 0 & \mbox{ or } & ~~x_1^* + ~~x_2^* + 3x_3^* = 1 \\ y_2^* = 0 & \mbox{ or } & -x_1^* + 2x_2^* +~~x_3^* = 1 \\ y_3^* = 0 & \mbox{ or } & ~~~~~~~~~ 3x_2^* - 6x_3^* = 0 \\ x_1^* = 0 & \mbox{ or } & ~~y_1^* - ~~y_2^* ~~~~~~~~ = 2 \\ x_2^* = 0 & \mbox{ or } & ~~y_1^* + 2y_2^* + 3y_3^* = 4 \\ x_3^* = 0 & \mbox{ or } & ~3y_1^* + ~~y_2^* - 6y_3^* = 2. \\ \end{eqnarray*} As \(x_2^*, x_3^* \gt 0\), satisfying the above conditions requires that \begin{eqnarray*} y_1^* + 2y_2^* + 3y_3^* & = & 4 \\ 3y_1^* + y_2^* - 6y_3^* & = & 2. \end{eqnarray*} Solving for \(y_2^*\) and \(y_3^*\) in terms of \(y_1^*\) gives \(y_2^* = 2 - y_1^*\) and \(y_3^* = \frac{1}{3}y_1^*\). To make \(\mathbf{y}^*\) feasible to \((D)\), we can set \(y_1^* = 0\), obtaining the feasible solution \(y_1^* = 0\), \(y_2^* = 2\), \(y_3^* = 0\). One can check that this \(\mathbf{y}^*\) satisfies the complementary slackness conditions with \(\mathbf{x}^*\). Hence, \(\mathbf{x}^*\) is an optimal solution to \((P)\) by the Complementary Slackness Theorem.
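The computation above can also be reproduced numerically. The following sketch in Python with NumPy (variable names are our own) verifies the feasibility of \(\mathbf{x}^*\), recovers \(\mathbf{y}^* = (0, 2, 0)^\mathsf{T}\) by solving the two tight dual constraints with \(y_1^* = 0\), and confirms that the primal and dual objective values agree.

```python
import numpy as np

A = np.array([[ 1.0, 1.0,  3.0],   # <= 1  (dual variable y1 <= 0)
              [-1.0, 2.0,  1.0],   # >= 1  (dual variable y2 >= 0)
              [ 0.0, 3.0, -6.0]])  # =  0  (dual variable y3 free)
b = np.array([1.0, 1.0, 0.0])
c = np.array([2.0, 4.0, 2.0])
x_star = np.array([0.0, 2/5, 1/5])

# Primal feasibility of x*.
Ax = A @ x_star
assert Ax[0] <= b[0] + 1e-9 and Ax[1] >= b[1] - 1e-9 and abs(Ax[2]) <= 1e-9
assert x_star[0] >= 0 and x_star[2] >= 0

# x2*, x3* > 0 force the dual constraints for columns 2 and 3 to be
# tight; with the choice y1* = 0 this is a 2-by-2 linear system.
M = np.array([[2.0,  3.0],    # 2 y2 + 3 y3 = 4  (from y1 + 2 y2 + 3 y3 = 4)
              [1.0, -6.0]])   #   y2 - 6 y3 = 2  (from 3 y1 + y2 - 6 y3 = 2)
y2, y3 = np.linalg.solve(M, np.array([4.0, 2.0]))
y_star = np.array([0.0, y2, y3])  # = (0, 2, 0)

# Dual feasibility and equal objective values certify optimality.
yA = y_star @ A
assert yA[0] <= c[0] + 1e-9 and abs(yA[1] - c[1]) <= 1e-9 and yA[2] <= c[2] + 1e-9
assert y_star[0] <= 0 and y_star[1] >= 0
print(c @ x_star, y_star @ b)  # both equal 2 (up to rounding)
```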

Worked examples

  1. Let \((P)\) and \((D)\) denote a primal-dual pair of linear programming problems. Prove that if \((P)\) is feasible and \((D)\) is infeasible, then \((P)\) is unbounded.

  2. Let \((P)\) denote the following linear programming problem: \[\begin{array}{rrcrcrlll} \mbox{min} & & & 4x_2 & + & 2x_3 \\ \mbox{s.t.} & x_1 & + & x_2 & + & 3x_3 & \leq & 1 \\ & x_1 & - & 2x_2 & + & x_3 & \geq & 1 \\ & x_1 & + & 3x_2 & - & 6x_3 & = & 0 \\ & x_1 & , & & & x_3 & \geq & 0 \\ & & & x_2 & & & & \mbox{free.} \end{array}\] Determine if \(\begin{bmatrix} x_1\\x_2\\x_3 \end{bmatrix} =\begin{bmatrix} \frac{3}{5} \\ -\frac{1}{5}\\ 0 \end{bmatrix}\) is an optimal solution to \((P)\).

  3. Let \((P)\) denote the following linear programming problem: \[\begin{array}{rrcrcrlll} \mbox{min} & x_1 & + & 2x_2 & - & 3x_3 \\ \mbox{s.t.} & x_1 & + & 2 x_2 & + & 2x_3 & = & 2 \\ & -x_1 & + & x_2 & + & x_3 & = & 1 \\ & -x_1 & + & x_2 & - & x_3 & \geq & 0 \\ & x_1 & , & x_2 & , & x_3 & \geq & 0 \end{array}\] Determine if \(\begin{bmatrix} x_1\\x_2\\x_3 \end{bmatrix} =\begin{bmatrix} 0 \\ 1\\ 0 \end{bmatrix}\) is an optimal solution to \((P)\).
