
The dual problem

Consider the following problem: \[\begin{array}{rlcrl} \min & \mathbf{c}^\mathsf{T}\mathbf{x} \\ (P)~~~~~\mbox{s.t.} & \mathbf{A}\mathbf{x} \geq \mathbf{b}. \end{array}\] Previously, we showed that if \((P)\) has an optimal solution, then there exists \(\mathbf{y}^*\in\mathbb{R}^m\) such that \(\mathbf{y}^* \geq 0\), \({\mathbf{y}^*}^\mathsf{T}\mathbf{A} = \mathbf{c}^\mathsf{T}\), and \({\mathbf{y}^*}^\mathsf{T}\mathbf{b} = \gamma\) where \(\gamma\) denotes the optimal value of \((P)\).

Take any \(\mathbf{y}\in\mathbb{R}^m\) satisfying \(\mathbf{y} \geq 0\) and \(\mathbf{y}^\mathsf{T}\mathbf{A} = \mathbf{c}^\mathsf{T}\). Then we can infer from \(\mathbf{A}\mathbf{x}\geq \mathbf{b}\) the inequality \(\mathbf{y}^\mathsf{T}\mathbf{A}\mathbf{x} \geq \mathbf{y}^\mathsf{T} \mathbf{b}\), or more simply, \(\mathbf{c}^\mathsf{T}\mathbf{x} \geq \mathbf{y}^\mathsf{T} \mathbf{b}\). Thus, for any such \(\mathbf{y}\), \(\mathbf{y}^\mathsf{T} \mathbf{b}\) gives a lower bound for the objective function value of any feasible solution to \((P)\). Since \(\gamma\) is the optimal value of \((P)\), we must have \(\gamma \geq \mathbf{y}^\mathsf{T}\mathbf{b}\).

As \({\mathbf{y}^*}^\mathsf{T}\mathbf{b} = \gamma\), we see that \(\gamma\) is the optimal value of \[\begin{array}{rl} \max & \mathbf{y}^\mathsf{T}\mathbf{b} \\ (D)~~~~~\mbox{s.t.} & \mathbf{y}^\mathsf{T}\mathbf{A} = \mathbf{c}^\mathsf{T} \\ &~~~~\mathbf{y} \geq \mathbf{0}. \end{array}\] Note that \((D)\) is a linear programming problem! We call \((D)\) the dual problem of the primal problem \((P)\). We say that the dual variable \(y_i\) is associated with the constraint \({\mathbf{a}^i}^\mathsf{T} \mathbf{x} \geq b_i\) where \({\mathbf{a}^i}^\mathsf{T}\) denotes the \(i\)th row of \(\mathbf{A}\).

In other words, we define the dual problem of \((P)\) to be the linear programming problem \((D)\). In the discussion above, we saw that if \((P)\) has an optimal solution, then so does \((D)\) and the optimal values of the two problems are equal. We also saw that if \(\mathbf{x}^*\) is a feasible solution to \((P)\) and \(\mathbf{y}^*\) is a feasible solution to \((D)\), then \(\mathbf{c}^\mathsf{T}\mathbf{x}^* \geq {\mathbf{y}^*}^\mathsf{T}\mathbf{b}\).

Summarizing, we have

Theorem 3.1. (Duality Theorem for Linear Programming — a.k.a. Strong Duality) If \((P)\) has an optimal solution, then so does \((D)\) and the optimal values of the two problems are equal.

Theorem 3.2. (Weak Duality Theorem) If \(\mathbf{x}^*\) is a feasible solution to \((P)\) and \(\mathbf{y}^*\) is a feasible solution to \((D)\), then \(\mathbf{c}^\mathsf{T}\mathbf{x}^* \geq {\mathbf{y}^*}^\mathsf{T}\mathbf{b}\).
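As a sanity check of Theorems 3.1 and 3.2, the following sketch solves a small instance of \((P)\) and its dual \((D)\) with SciPy's `linprog` and compares the optimal values. The data is an illustrative instance made up for this note, not one appearing elsewhere in the text.

```python
import numpy as np
from scipy.optimize import linprog

# Primal (P): min c^T x  subject to  A x >= b  (x free).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 3.0])
c = np.array([1.0, 1.0])

# linprog minimizes c^T x subject to A_ub x <= b_ub, so A x >= b is written as -A x <= -b.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2, method="highs")

# Dual (D): max y^T b  subject to  y^T A = c^T, y >= 0.
# Maximizing b^T y is the same as minimizing -b^T y; the equality constraints are A^T y = c.
dual = linprog(-b, A_eq=A.T, b_eq=c, bounds=[(0, None)] * 3, method="highs")

print(primal.fun)   # optimal value of (P)
print(-dual.fun)    # optimal value of (D); equal to the above, as Theorem 3.1 asserts
```

Any feasible \(\mathbf{y}\) for \((D)\), optimal or not, gives \(\mathbf{y}^\mathsf{T}\mathbf{b}\) as a lower bound on the primal objective value, which is exactly the content of Theorem 3.2.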

At first glance, requiring all the constraints to be \(\geq\)-inequalities as in \((P)\) before forming the dual problem seems a bit restrictive. We now see how the dual problem of a primal problem in general form can be defined. We first make two observations that motivate the definition.

Observation 1

Suppose that our primal problem contains a mixture of all types of linear constraints: \[\begin{array}{rlcrl} \min & \mathbf{c}^\mathsf{T}\mathbf{x} \\ (P')~~~~~\mbox{s.t.} & \mathbf{A}\mathbf{x} \geq \mathbf{b} \\ & \mathbf{A'}\mathbf{x} \leq \mathbf{b'} \\ & \mathbf{A''}\mathbf{x} = \mathbf{b''} \end{array}\] where \(\mathbf{A} \in \mathbb{R}^{m\times n}\), \(\mathbf{b} \in \mathbb{R}^{m}\), \(\mathbf{A}' \in \mathbb{R}^{m'\times n}\), \(\mathbf{b}' \in \mathbb{R}^{m'}\), \(\mathbf{A}'' \in \mathbb{R}^{m''\times n}\), and \(\mathbf{b}'' \in \mathbb{R}^{m''}\).

We can of course convert this into an equivalent problem in the form of \((P)\) and form its dual. However, take the point of view that the utility of the dual is to infer from the constraints of \((P')\) an inequality of the form \(\mathbf{c}^\mathsf{T} \mathbf{x} \geq \gamma\), with \(\gamma\) as large as possible, by taking an appropriate linear combination of the constraints. Then we are effectively looking for \(\mathbf{y} \in \mathbb{R}^{m}\), \(\mathbf{y} \geq \mathbf{0}\), \(\mathbf{y}' \in \mathbb{R}^{m'}\), \(\mathbf{y}' \leq \mathbf{0}\), and \(\mathbf{y}'' \in \mathbb{R}^{m''}\) such that \[\mathbf{y}^\mathsf{T}\mathbf{A} +\mathbf{y'}^\mathsf{T}\mathbf{A'} +\mathbf{y''}^\mathsf{T}\mathbf{A''} = \mathbf{c}^\mathsf{T},\] with \(\mathbf{y}^\mathsf{T}\mathbf{b} +\mathbf{y'}^\mathsf{T}\mathbf{b'} +\mathbf{y''}^\mathsf{T}\mathbf{b''}\) to be maximized.

(The reason we need \(\mathbf{y'} \leq \mathbf{0}\) is that inferring a \(\geq\)-inequality from \(\mathbf{A}'\mathbf{x} \leq \mathbf{b}'\) requires nonpositive multipliers. There is no sign restriction on \(\mathbf{y''}\) because the constraints \(\mathbf{A}''\mathbf{x} = \mathbf{b}''\) are equalities, which remain valid under multipliers of either sign.)

This leads to the dual problem: \[\begin{array}{rl} \max & \mathbf{y}^\mathsf{T}\mathbf{b} +\mathbf{y'}^\mathsf{T}\mathbf{b'} +\mathbf{y''}^\mathsf{T}\mathbf{b''} \\ (D')~~~~~\mbox{s.t.} & \mathbf{y}^\mathsf{T}\mathbf{A} +\mathbf{y'}^\mathsf{T}\mathbf{A'} +\mathbf{y''}^\mathsf{T}\mathbf{A''} = \mathbf{c}^\mathsf{T} \\ &~~~~\mathbf{y} \geq \mathbf{0} \\ &~~~~\mathbf{y'} \leq \mathbf{0}. \end{array}\] In fact, we could have derived this dual by applying the definition of the dual problem to \[\begin{array}{rl} \min & \mathbf{c}^\mathsf{T}\mathbf{x} \\ \mbox{s.t.} & \mathbf{A}\mathbf{x} \geq \mathbf{b} \\ & -\mathbf{A'}\mathbf{x} \geq -\mathbf{b'} \\ & \mathbf{A''}\mathbf{x} \geq \mathbf{b''} \\ & -\mathbf{A''}\mathbf{x} \geq -\mathbf{b''}, \end{array}\] which is equivalent to \((P')\). The details are left as an exercise.
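The same kind of numerical check works for the pair \((P')\), \((D')\). The sketch below (again with made-up data, one constraint of each type) solves both problems with SciPy and confirms that the optimal values agree; the sign conventions \(\mathbf{y} \geq \mathbf{0}\), \(\mathbf{y'} \leq \mathbf{0}\), \(\mathbf{y''}\) free enter through the variable bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Primal (P'): min x1 + 2*x2  s.t.  x1 + x2 >= 2,   x1 <= 3,   x2 = 1   (x free).
c = np.array([1.0, 2.0])
primal = linprog(
    c,
    A_ub=np.array([[-1.0, -1.0],   # x1 + x2 >= 2, written as -x1 - x2 <= -2
                   [ 1.0,  0.0]]), # x1 <= 3
    b_ub=np.array([-2.0, 3.0]),
    A_eq=np.array([[0.0, 1.0]]),   # x2 = 1
    b_eq=np.array([1.0]),
    bounds=[(None, None)] * 2,
    method="highs",
)

# Dual (D'): max 2*y + 3*yp + 1*ypp  s.t.  y + yp = 1,  y + ypp = 2,
#            with y >= 0 (from the >=-constraint), yp <= 0 (<=), ypp free (=).
dual = linprog(
    np.array([-2.0, -3.0, -1.0]),      # maximize by minimizing the negation
    A_eq=np.array([[1.0, 1.0, 0.0],    # coefficients of x1 in the three constraints
                   [1.0, 0.0, 1.0]]),  # coefficients of x2
    b_eq=c,
    bounds=[(0, None), (None, 0), (None, None)],
    method="highs",
)

print(primal.fun)   # 3.0
print(-dual.fun)    # 3.0, matching the primal optimal value
```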

Observation 2

Consider the primal problem of the following form: \[\begin{array}{rlcrl} \min & \mathbf{c}^\mathsf{T}\mathbf{x} \\ (P'')~~~~~\mbox{s.t.} & \mathbf{A}\mathbf{x} \geq \mathbf{b} \\ & x_i \geq 0 & i \in P \\ & x_i \leq 0 & i \in N \end{array}\] where \(P\) and \(N\) are disjoint subsets of \(\{1,\ldots,n\}\). In other words, constraints of the form \(x_i \geq 0\) or \(x_i \leq 0\) are separated out from the rest of the inequalities.

Forming the dual of \((P'')\) as defined under Observation 1, with \(p_i\) denoting the dual variable associated with \(x_i \geq 0\) and \(q_i\) the dual variable associated with \(x_i \leq 0\), we obtain the dual problem \[\begin{array}{rll} \max & \mathbf{y}^\mathsf{T}\mathbf{b} \\ (\tilde{D})~~~~~\mbox{s.t.} & \mathbf{y}^\mathsf{T}\mathbf{A}_i = c_i & i \in \{1,\ldots,n\}\backslash (P\cup N) \\ &\mathbf{y}^\mathsf{T}\mathbf{A}_i + p_i = c_i & i \in P \\ & \mathbf{y}^\mathsf{T}\mathbf{A}_i + q_i = c_i & i \in N \\ &~~~~p_i \geq 0 & i \in P \\ &~~~~q_i \leq 0 & i \in N \\ \end{array}\] where \(\mathbf{y} = \begin{bmatrix} y_1\\ \vdots \\ y_m\end{bmatrix}\) and \(\mathbf{A}_i\) denotes the \(i\)th column of \(\mathbf{A}\). Note that this problem is equivalent to the following problem, obtained by eliminating the variables \(p_i\), \(i \in P\), and \(q_i\), \(i \in N\): \[\begin{array}{rll} \max & \mathbf{y}^\mathsf{T}\mathbf{b} \\ (D'')~~~~~\mbox{s.t.} & \mathbf{y}^\mathsf{T}\mathbf{A}_i = c_i & i \in \{1,\ldots,n\}\backslash (P\cup N) \\ &\mathbf{y}^\mathsf{T}\mathbf{A}_i \leq c_i & i \in P \\ & \mathbf{y}^\mathsf{T}\mathbf{A}_i \geq c_i & i \in N \\ \end{array}\] We define \((D'')\), rather than \((\tilde{D})\), to be the dual problem of \((P'')\). Note that \((D'')\) has fewer variables than \((\tilde{D})\).

Hence, the dual problem of \[\begin{array}{rlcrl} \min & \mathbf{c}^\mathsf{T}\mathbf{x} \\ \mbox{s.t.} & \mathbf{A}\mathbf{x} \geq \mathbf{b} \\ & \mathbf{x} \geq \mathbf{0} \end{array}\] is simply \[\begin{array}{rlcrl} \max & \mathbf{y}^\mathsf{T}\mathbf{b} \\ \mbox{s.t.} & \mathbf{y}^\mathsf{T}\mathbf{A} \leq \mathbf{c}^\mathsf{T} \\ & \mathbf{y} \geq \mathbf{0}. \end{array}\]

What we see from the pair \((P'')\), \((D'')\) is that there is no need to associate dual variables with constraints of the form \(x_i \geq 0\) or \(x_i \leq 0\), provided the corresponding dual constraints are given the appropriate types.

Forming the dual problem of a generic LP problem

Combining the above observations leads to the following definition of the dual problem for a primal problem in general form.

Let \(\mathbf{A} \in \mathbb{R}^{m\times n}\), \(\mathbf{b} \in \mathbb{R}^m\), \(\mathbf{c} \in \mathbb{R}^n\). Let \({\mathbf{a}^i}^\mathsf{T}\) denote the \(i\)th row of \(\mathbf{A}\). Let \(\mathbf{A}_j\) denote the \(j\)th column of \(\mathbf{A}\).

Let \((P)\) denote the minimization problem with variables in the tuple \(\mathbf{x} = \begin{bmatrix} x_1\\ \vdots \\ x_n\end{bmatrix}\) given as follows: \[\begin{array}{rll} \min & \mathbf{c}^\mathsf{T}\mathbf{x} \\ \mbox{s.t.} & {\mathbf{a}^i}^\mathsf{T}\mathbf{x} \geq b_i & i \in G \\ & {\mathbf{a}^i}^\mathsf{T}\mathbf{x} \leq b_i & i \in L \\ & {\mathbf{a}^i}^\mathsf{T}\mathbf{x} = b_i & i \in E \\ & x_j \geq 0 & j \in P \\ & x_j \leq 0 & j \in N \\ & x_j~\mbox{free} & j \in F \end{array}\] where \(G\), \(L\), and \(E\) partition \(\{1,\ldots,m\}\) and \(P\), \(N\), and \(F\) partition \(\{1,\ldots,n\}\).

Then the dual problem is defined to be the maximization problem with variables in the tuple \(\mathbf{y} = \begin{bmatrix} y_1\\ \vdots\\ y_m\end{bmatrix}\) given as follows: \[\begin{array}{rll} \max & \mathbf{y}^\mathsf{T}\mathbf{b} \\ \mbox{s.t.} & \mathbf{y}^\mathsf{T}\mathbf{A}_j \leq c_j & j \in P \\ & \mathbf{y}^\mathsf{T}\mathbf{A}_j \geq c_j & j \in N \\ & \mathbf{y}^\mathsf{T}\mathbf{A}_j = c_j & j \in F \\ & y_i \geq 0 & i \in G \\ & y_i \leq 0 & i \in L \\ & y_i~\mbox{free} & i \in E. \end{array}\]

The following table can help remember the above.

Primal (min) Dual (max)
\(\geq\) constraint \(\geq 0\) variable
\(\leq\) constraint \(\leq 0\) variable
\(=\) constraint \(\text{free}\) variable
\(\geq 0\) variable \(\leq\) constraint
\(\leq 0\) variable \(\geq\) constraint
\(\text{free}\) variable \(=\) constraint

Below is an example of a primal-dual pair of problems based on the above definition:

Consider the primal problem: \[\begin{array}{rrcrcrcl} \mbox{min} & x_1 & - & 2x_2 & + & 3x_3 & \\ \mbox{s.t.} & -x_1 & & & + & 4x_3 & = &5 \\ & 2x_1 & + & 3x_2 & - & 5x_3 & \geq & 6 \\ & & & 7x_2 & & & \leq & 8 \\ & x_1 & & & & & \geq & 0 \\ & & & x_2 & & & & \mbox{free} \\ & & & & & x_3 & \leq & 0.\\ \end{array}\] Here, \(\mathbf{A}= \begin{bmatrix} -1 & 0 & 4 \\ 2 & 3 & -5 \\ 0 & 7 & 0 \end{bmatrix}\), \(\mathbf{b} = \begin{bmatrix}5 \\6\\8\end{bmatrix}\), and \(\mathbf{c} = \begin{bmatrix}1 \\-2\\3\end{bmatrix}\). The primal problem has three constraints. So the dual problem has three variables. As the first constraint in the primal is an equation, the corresponding variable in the dual is free. As the second constraint in the primal is a \(\geq\)-inequality, the corresponding variable in the dual is nonnegative. As the third constraint in the primal is a \(\leq\)-inequality, the corresponding variable in the dual is nonpositive. Now, the primal problem has three variables. So the dual problem has three constraints. As the first variable in the primal is nonnegative, the corresponding constraint in the dual is a \(\leq\)-inequality. As the second variable in the primal is free, the corresponding constraint in the dual is an equation. As the third variable in the primal is nonpositive, the corresponding constraint in the dual is a \(\geq\)-inequality. Hence, the dual problem is: \[\begin{array}{rrcrcrcl} \mbox{max} & 5y_1 & + & 6y_2 & + & 8y_3 & \\ \mbox{s.t.} & -y_1 & + & 2y_2 & & & \leq & 1 \\ & & & 3y_2 & + & 7y_3 & = & -2 \\ & 4y_1 & - & 5y_2 & & & \geq & 3 \\ & y_1 & & & & & & \mbox{free} \\ & & & y_2 & & & \geq & 0 \\ & & & & & y_3 & \leq & 0.\\ \end{array}\]
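The table above is mechanical enough to put into code. The following sketch (a hypothetical helper written for this note, not a library routine) builds the dual data of a minimization primal and reproduces the dual shown in the example: constraint senses map to dual variable signs, and variable signs map to dual constraint senses, exactly as in the table.

```python
# Hypothetical helper: form the dual of  min c^T x  with mixed constraints and signed variables.
def dual_of(c, A, b, con_senses, var_signs):
    """con_senses[i] in {'>=', '<=', '='}; var_signs[j] in {'>=0', '<=0', 'free'}."""
    m, n = len(b), len(c)
    # Each primal constraint yields a dual variable; its sign follows the table.
    y_signs = [{'>=': '>=0', '<=': '<=0', '=': 'free'}[s] for s in con_senses]
    # Each primal variable yields a dual constraint  y^T A_j (sense) c_j; the sense follows the table.
    dual_senses = [{'>=0': '<=', '<=0': '>=', 'free': '='}[s] for s in var_signs]
    dual_rows = [[A[i][j] for i in range(m)] for j in range(n)]  # row j of the dual is column j of A
    # Dual objective: maximize b^T y.
    return b, y_signs, dual_rows, dual_senses, c

# The example above.
c = [1, -2, 3]
A = [[-1, 0,  4],
     [ 2, 3, -5],
     [ 0, 7,  0]]
b = [5, 6, 8]
obj, y_signs, rows, senses, rhs = dual_of(c, A, b, ['=', '>=', '<='], ['>=0', 'free', '<=0'])

print(obj, y_signs)      # [5, 6, 8] ['free', '>=0', '<=0']  -> max 5y1 + 6y2 + 8y3
for row, s, r in zip(rows, senses, rhs):
    print(row, s, r)     # [-1, 2, 0] <= 1 ;  [0, 3, 7] = -2 ;  [4, -5, 0] >= 3
```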

Remarks. In some books, the primal problem is always a maximization problem. In that case, what we call the primal problem is their dual problem, and what we call the dual problem is their primal problem.

Worked examples

  1. Write down the dual problem of the following: \[\begin{array}{rrcrclll} \min & 4x_1 & - & 2x_2 \\ \text{s.t.} & x_1 & + & 2x_2 & \geq & 3\\ & 3x_1 & - & 4x_2 & = & 0 \\ & & & x_2 & \geq & 0. \end{array}\]

  2. Write down the dual problem of the following: \[ \begin{array}{rrcrcll} \min & & &3x_2 & + & x_3 \\ \mbox{s.t.} & x_1 & + & x_2 & + & 2x_3 & = & 1 \\ & x_1 & & & - & 3x_3 & \leq & 0 \\ & x_1 & , & x_2 & , & x_3 & \geq & 0. \end{array} \]

  3. Write down the dual problem of the following: \[ \begin{array}{rrcrcll} \min & x_1 & & & - & 9x_3 \\ \mbox{s.t.} & x_1 & - & 3x_2 & + & 2x_3 & = & 1 \\ & x_1 & & & & & \leq & 0 \\ & & & x_2 & & & & \mbox{free} \\ & & & & & x_3 & \geq & 0. \end{array}\]

  4. Write down the dual problem of the following: \[ \begin{array}{rrll} \min & 3x_1 \\ \mbox{s.t.} & x_1 & = & 1 \\ & x_1 & \geq & 2 \\ & x_1 & & \mbox{free} \\ \end{array}\]

  5. Prove that if the primal problem and the dual problem are both feasible, then they both have optimal solutions.