Auxiliary problem

The simplex method requires an initial basic feasible solution, but so far we have not discussed how to obtain one. We now show how the simplex method itself can be used to find one.

Let \(\mathbf{A}\in\mathbb{R}^{m\times n}\) with \(\operatorname{rank}(\mathbf{A}) = m\). Let \(\mathbf{b}\in \mathbb{R}^m\). Assume that \(\mathbf{b} \geq \mathbf{0}\). Let \((P')\) denote the problem \[ \begin{array}{rl} \min & x_{n+1} + \cdots + x_{n+m} \\ \text{s.t.} & \begin{bmatrix} \mathbf{A} & \mathbf{I} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \\ x_{n+1} \\ \vdots\\ x_{n+m}\end{bmatrix} = \mathbf{b} \\ & x_1,\ldots, x_{n+m} \geq 0 \end{array} \] where \(\mathbf{I}\) denotes the \(m\times m\) identity matrix.

The problem \((P')\) is called the auxiliary problem associated with \(\mathbf{A}\mathbf{x} =\mathbf{b},~\mathbf{x} \geq \mathbf{0}\).
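To make the construction concrete, here is a small NumPy sketch (the matrix \(\mathbf{A}\) and vector \(\mathbf{b}\) are made-up illustrative data) that builds the constraint matrix \(\begin{bmatrix}\mathbf{A} & \mathbf{I}\end{bmatrix}\) and objective of \((P')\), and verifies the obvious feasible point \(x_1 = \cdots = x_n = 0\), \(x_{n+i} = b_i\):

```python
import numpy as np

# Illustrative data: any A (m x n) with b >= 0 would do.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([1.0, 2.0])
m, n = A.shape

# Constraint matrix [A I] and objective (0, ..., 0, 1, ..., 1) of (P').
A_aux = np.hstack([A, np.eye(m)])
c_aux = np.concatenate([np.zeros(n), np.ones(m)])

# The point x_1 = ... = x_n = 0, x_{n+i} = b_i is feasible for (P')
# whenever b >= 0: [A I] x0 = I b = b.
x0 = np.concatenate([np.zeros(n), b])
assert np.allclose(A_aux @ x0, b) and np.all(x0 >= 0)

# Its objective value is b_1 + ... + b_m.
print(c_aux @ x0)  # 3.0 here
```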

Proposition 5.2.

The system \(\mathbf{A}\mathbf{x} = \mathbf{b},~\mathbf{x} \geq \mathbf{0}\) has a solution if and only if the optimal value of \((P')\) is \(0\).

Proof. Notice that \((P')\) is not unbounded since the objective function value is never negative. It also has a feasible solution since \(x_1 = \cdots = x_n = 0\), \(x_{n+i} = b_i\) for \(i = 1,\ldots, m\) satisfies all the constraints of \((P')\). Hence, by the Fundamental Theorem of Linear Programming, \((P')\) has an optimal solution.

Observe that if the optimal value of \((P')\) is 0 and if \(\mathbf{x'}\) is an optimal solution, then we must have \(x'_{n+i} = 0\) for \(i = 1,\ldots, m\). Hence, \(\mathbf{x}\) such that \(x_i = x'_i\) for \(i = 1,\ldots,n\) is a solution to \(\mathbf{A}\mathbf{x} = \mathbf{b},~\mathbf{x} \geq \mathbf{0}\).

Conversely, if \(\mathbf{x}\) satisfies \(\mathbf{A}\mathbf{x} = \mathbf{b},~\mathbf{x} \geq \mathbf{0}\), then \(\mathbf{x'}\) with \(x'_i = x_i\) for \(i = 1,\ldots, n\) and \(x'_{n+1}=\cdots =x'_{n+m} = 0\) is a feasible solution to \((P')\) with objective function value \(0\) and therefore is optimal.

Proposition 5.3.

Suppose that the optimal value of \((P')\) is 0. If the simplex method with an anti-cycling rule is applied to \((P')\) starting with the basis \(B=\{n+1,\ldots,n+m\}\), then it terminates with an optimal solution whose components \(n+1,\ldots,n+m\) are all zero; dropping these components yields a basic feasible solution to \(\mathbf{A}\mathbf{x} = \mathbf{b},~\mathbf{x} \geq \mathbf{0}\).

Proof. Note that the columns of the coefficient matrix of the equations in \((P')\) indexed by elements of \(B=\{n+1,\ldots,n+m\}\) are linearly independent. Hence, \(B\) is a basis. The basic solution \(\mathbf{x'}\) determined by \(B\) is given by \(x'_1 = \cdots = x'_n = 0\) and \(x'_{n+i} = b_i\) for \(i = 1,\ldots,m\). Since \(\mathbf{b}\geq \mathbf{0}\) by assumption, \(\mathbf{x'}\) is a basic feasible solution.

Hence, the simplex method (with Bland's anti-cycling rule) can be applied to \((P')\) starting with the basis \(B\), and it will terminate with an optimal solution, say \(\mathbf{x'}\). As we are given that the optimal value is \(0\), we have \(x'_i = 0\) for \(i = n+1,\ldots,n+m\). Let \(\mathbf{x}\in \mathbb{R}^n\) be given by \(x_i = x'_i\) for \(i = 1,\ldots,n\). Note that the columns of \(\mathbf{A}\) corresponding to the nonzero components of \(\mathbf{x}\) are linearly independent: they are among the columns of \(\begin{bmatrix}\mathbf{A} & \mathbf{I}\end{bmatrix}\) indexed by the final basis, and those columns are linearly independent. By the result in a previous worked example, we see that \(\mathbf{x}\) is a basic feasible solution to \(\mathbf{A}\mathbf{x} = \mathbf{b},~\mathbf{x} \geq \mathbf{0}\).

Two-phase method

The following is an outline of the two-phase method for solving \[\begin{array}{rl} \min & \mathbf{c}^\mathsf{T} \mathbf{x} \\ \text{s.t.} & \mathbf{A} \mathbf{x} = \mathbf{b} \\ & \mathbf{x} \geq \mathbf{0} \end{array}\] where \(\mathbf{A} \in \mathbb{R}^{m\times n}\) is of rank \(m\), \(\mathbf{b} \in \mathbb{R}^m\), and \(\mathbf{c} \in \mathbb{R}^n\):

  1. Solve the auxiliary problem associated with \(\mathbf{A} \mathbf{x} = \mathbf{b}, \mathbf{x} \geq \mathbf{0}\) to obtain a basic feasible solution \(\mathbf{x}^*\) to \(\mathbf{A} \mathbf{x} = \mathbf{b}, \mathbf{x} \geq \mathbf{0}\).

  2. Obtain a basis \(B\) determining \(\mathbf{x}^*\) and apply the simplex method starting with the basis \(B\).

With the two-phase method, we can now solve any linear programming problem in standard form.
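As a rough illustration, the outline above can be turned into code. The sketch below (NumPy assumed; `simplex` and `two_phase` are hypothetical names) implements the simplex method with Bland's rule and runs the two phases; it glosses over the degenerate case where an artificial variable remains in the phase-1 basis at value zero, which a full implementation must pivot out:

```python
import numpy as np

def simplex(A, b, c, basis, tol=1e-9):
    """Simplex method with Bland's rule, starting from a feasible basis.
    Returns an optimal (x, basis); raises if the problem is unbounded."""
    m, n = A.shape
    basis = list(basis)
    while True:
        B = A[:, basis]
        x_B = np.linalg.solve(B, b)            # values of the basic variables
        y = np.linalg.solve(B.T, c[basis])     # simplex multipliers
        reduced = c - A.T @ y                  # reduced costs
        # Bland's rule: entering variable = smallest eligible index.
        enter = next((j for j in range(n)
                      if j not in basis and reduced[j] < -tol), None)
        if enter is None:                      # no improving column: optimal
            x = np.zeros(n)
            x[basis] = x_B
            return x, basis
        d = np.linalg.solve(B, A[:, enter])    # change in the basic variables
        ratios = [(x_B[i] / d[i], i) for i in range(m) if d[i] > tol]
        if not ratios:
            raise ValueError("problem is unbounded")
        t = min(r for r, _ in ratios)
        # Bland's rule: among ties, the basic variable of smallest index leaves.
        leave = min((i for r, i in ratios if r <= t + tol),
                    key=lambda i: basis[i])
        basis[leave] = enter

def two_phase(A, b, c):
    """Phase 1 solves the auxiliary problem from the basis {n+1,...,n+m};
    phase 2 restarts the simplex method from the basis phase 1 produces.
    Sketch only: assumes no artificial variable is left in the basis."""
    m, n = A.shape
    sign = np.where(b < 0, -1.0, 1.0)          # flip rows so that b >= 0
    A, b = A * sign[:, None], b * sign
    A_aux = np.hstack([A, np.eye(m)])
    c_aux = np.concatenate([np.zeros(n), np.ones(m)])
    x_aux, basis = simplex(A_aux, b, c_aux, range(n, n + m))
    if c_aux @ x_aux > 1e-7:                   # Proposition 5.2
        raise ValueError("problem is infeasible")
    assert all(j < n for j in basis), "artificial variable stuck in basis"
    return simplex(A, b, np.asarray(c, float), basis)
```

For instance, applied to the system in the worked example below with objective \(x_1 + x_2 + x_3\), `two_phase` returns the vertex \((2, 0, 1)\).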

We end with a few words on the efficiency of the simplex method. Computational experiments over the years suggest that the simplex method is quite efficient in practice: the number of iterations is often linear in the number of constraints. Unfortunately, in the worst case, the simplex method with any of the popular anti-cycling rules requires a number of iterations exponential in the number of variables. At the moment, there is no known variant of the simplex method that requires a polynomial number of iterations in the worst case. Finding a choice rule for the entering and leaving variables that gives a polynomial-time simplex method is still an open problem.

Worked examples

  1. Obtain a basic feasible solution to \[\begin{array}{rrcrcrcl} & x_1 & - & 2x_2 & - & x_3 & = & 1 \\ & x_1 & - & x_2 & + & x_3 & = & 3 \\ & x_1 & , & x_2 & , & x_3 & \geq & 0 \end{array}\] by solving its associated auxiliary problem.
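One way to carry out this computation (assuming SciPy's `linprog` is available; any LP solver would do, or pen-and-paper simplex iterations) is to hand the auxiliary problem to the solver and read off the first three components of the optimal solution:

```python
import numpy as np
from scipy.optimize import linprog

# Auxiliary problem: min x4 + x5  s.t.  [A I](x1,...,x5)^T = b,  x >= 0.
A = np.array([[1.0, -2.0, -1.0],
              [1.0, -1.0,  1.0]])
b = np.array([1.0, 3.0])
A_aux = np.hstack([A, np.eye(2)])
c_aux = np.array([0.0, 0.0, 0.0, 1.0, 1.0])

res = linprog(c_aux, A_eq=A_aux, b_eq=b, bounds=(0, None), method="highs")

# Optimal value 0, so the system is feasible (Proposition 5.2), and the
# first three components form a basic feasible solution (Proposition 5.3).
x = res.x[:3]
print(res.fun, x)
```

The solver returns a vertex of the feasible region; which basic feasible solution you get (e.g. \((2,0,1)\) or \((5,2,0)\)) depends on the pivoting choices made along the way.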