Representing point in full-rank polyhedron

Dependencies:

  1. Polyhedral set and polyhedral cone
  2. Basic feasible solutions
  3. Direction of a convex set and Recession cone
  4. Extreme directions of a convex set
  5. Rank of a matrix
  6. Condition for existence of BFS in a polyhedron
  7. Recession cone of a polyhedron
  8. Condition for polyhedral cone to be pointed
  9. Representing point in pointed polyhedral cone
  10. Point in polytope is convex combination of BFS
  11. Transitivity of convexity

Let $P = \{x \in \mathbb{R}^n: (a_i^Tx \ge b_i, \forall i \in I) \textrm{ and } (a_i^Tx = b_i, \forall i \in E)\}$ be a non-empty polyhedron. Let $A$ be the matrix whose rows are $\{a_i: i \in I \cup E\}$. Suppose $\rank(A) = n$.

Let $\{x^{(1)}, x^{(2)}, \ldots, x^{(p)}\}$ be the set of basic feasible solutions of $P$. Let $\{d^{(1)}, d^{(2)}, \ldots, d^{(q)}\}$ be the set of extreme directions of $P$. Let $\xhat \in \mathbb{R}^n$ and let $B$ be the matrix whose rows are $\{a_i^T: a_i^T\xhat = b_i\}$.

Then $\xhat \in P$ iff $\exists \lambda \in \mathbb{R}^p_{\ge 0}$ and $\exists \mu \in \mathbb{R}^q_{\ge 0}$ such that $\xhat = \sum_{i=1}^p \lambda_ix^{(i)} + \sum_{i=1}^q \mu_id^{(i)}$, $\sum_{i=1}^p \lambda_i = 1$, and $|\support(\lambda)| + |\support(\mu)| \le n + 1 - \rank(B)$. (For any vector $v \in \mathbb{R}^n$, $|\support(v)|$ is the number of non-zero entries in $v$).
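
To make the statement concrete, here is a small numerical check (not part of the original argument). The polyhedron, its BFSs, its extreme directions, the point $\xhat$, and the chosen weights below are assumptions picked by hand for illustration; the sketch only verifies the claimed representation and the support bound with numpy.

```python
import numpy as np

# Hypothetical example: P = {x in R^2 : x1 >= 0, x2 >= 0, x1 + x2 >= 1}, all inequalities.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # rows a_i^T
b = np.array([0.0, 0.0, 1.0])
n = A.shape[1]                                        # rank(A) = 2 = n

bfs = np.array([[1.0, 0.0], [0.0, 1.0]])              # the BFSs x^(i) of P
ext_dirs = np.array([[1.0, 0.0], [0.0, 1.0]])         # the extreme directions d^(i)

x_hat = np.array([2.0, 3.0])                          # a point of P with no tight constraints

B = A[np.isclose(A @ x_hat, b)]                       # rows tight at x_hat (empty here)
rank_B = np.linalg.matrix_rank(B) if B.size else 0    # rank(B) = 0

# One valid representation: x_hat = 1*(1,0) + 1*(1,0) + 3*(0,1).
lam = np.array([1.0, 0.0])                            # convex weights over the BFSs
mu = np.array([1.0, 3.0])                             # non-negative weights over extreme dirs

assert np.allclose(lam @ bfs + mu @ ext_dirs, x_hat)
assert np.isclose(lam.sum(), 1.0)
support = np.count_nonzero(lam) + np.count_nonzero(mu)
assert support <= n + 1 - rank_B                      # 3 <= 3
```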

Proof

A point $\xhat$ that can be represented this way lies in $P$ by the definition of direction and by the convexity of $P$. We will now assume $\xhat \in P$ and show the existence of $\lambda$ and $\mu$.

We will prove the main claim using reverse induction on $\rank(B)$. If $\rank(B) = n$, then $\xhat$ is a BFS, so we can take $\lambda$ to be the indicator vector of $\xhat$ and $\mu = 0$; then $|\support(\lambda)| + |\support(\mu)| = 1 = n + 1 - \rank(B)$, and we are done. Now assume that the result is true whenever $\rank(B) \ge r+1$, where $0 \le r \le n-1$. We will prove it when $\rank(B) = r$. Since $\rank(B) = r \le n-1$, $\xhat$ is not a BFS.

Let $T \defeq \{i: a_i^T\xhat = b_i\}$. Since $\xhat \in P$, every equality constraint is tight at $\xhat$, so $E \subseteq T$. Let $P' \defeq \{x: (a_i^Tx = b_i, \forall i \in T) \textrm{ and } (a_i^Tx \ge b_i, \forall i \in (I \cup E) - T)\}$. Then $P' \subseteq P$ and $\xhat \in P'$. Since $P'$ is non-empty and $\rank(A) = n$, $P'$ contains at least one BFS $y$. Since $y \in P' \subseteq P$ and $y$ is tight at $n$ linearly independent constraints, $y$ is also a BFS of $P$.
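
As a sketch of this step (under the simplifying assumption that all original constraints are inequalities, i.e., $E = \varnothing$), a BFS of $P'$ can be found by brute force: pick $n$ rows, solve them as equalities, and keep a solution that is feasible and tight on $T$. The data and the helper name `find_bfs` are hypothetical; this only makes the existence claim tangible and is not an efficient method.

```python
import itertools
import numpy as np

def find_bfs(A, b, eq_rows, tol=1e-9):
    """Brute-force a BFS of P': try every set of n rows, solve them as equalities,
    and return a solution that is feasible and tight on eq_rows (the set T)."""
    m, n = A.shape
    for rows in itertools.combinations(range(m), n):
        sub = A[list(rows)]
        if np.linalg.matrix_rank(sub) < n:
            continue                                   # rows not linearly independent
        x = np.linalg.solve(sub, b[list(rows)])
        if np.all(A @ x >= b - tol) and np.allclose(A[eq_rows] @ x, b[eq_rows]):
            return x                                   # tight at n lin. indep. rows => BFS
    return None

# Hypothetical example: x1 >= 0, x2 >= 0, x1 + x2 >= 1, with x_hat = (0.5, 0.5).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
x_hat = np.array([0.5, 0.5])

T = np.where(np.isclose(A @ x_hat, b))[0]              # constraints tight at x_hat
y = find_bfs(A, b, T)
print(T, y)                                            # [2], y = (0, 1): a BFS of P' and of P
```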

Let $d = \xhat - y$. Since $y$ is a BFS of $P$ and $\xhat$ is not, $\xhat \neq y$, i.e., $d \neq 0$. For each $i \in T$, we have $a_i^Ty = b_i$ and $a_i^T\xhat = b_i$, so $a_i^Td = 0$. Since the rows of $B$ are exactly $\{a_i^T: i \in T\}$, we get $Bd = 0$ and $d \neq 0$.
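
Continuing the same hypothetical example (with $\xhat$ and $y$ chosen by hand), a quick numerical check of this paragraph:

```python
import numpy as np

# Same hypothetical polyhedron as in the earlier sketches.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
x_hat = np.array([0.5, 0.5])
y = np.array([0.0, 1.0])                   # BFS of P' (tight at x1 >= 0 and x1 + x2 >= 1)

T = np.where(np.isclose(A @ x_hat, b))[0]  # T = [2]
B = A[T]                                   # rows {a_i^T : i in T} = [[1, 1]]
d = x_hat - y                              # (0.5, -0.5)

assert not np.allclose(d, 0)               # x_hat is not a BFS, so x_hat != y
assert np.allclose(B @ d, 0)               # every row tight at x_hat is orthogonal to d
```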

Case 1: $y + \alpha d \in P'$ for all $\alpha \ge 0$.

Hence, $d$ is a direction of $P'$, i.e., $d$ lies in the recession cone of $P'$, which is $\{x: (a_i^Tx = 0, \forall i \in T) \textrm{ and } (a_i^Tx \ge 0, \forall i \in (I \cup E) - T)\}$. Let $D$ be the recession cone of $P$. Then $D = \{x: (a_i^Tx = 0, \forall i \in E) \textrm{ and } (a_i^Tx \ge 0, \forall i \in I)\}$. Since $E \subseteq T$, we get $d \in D$.

Since $\rank(A) = n$, $D$ is a pointed cone. Since $Bd = 0$, $d$ is tight at $r$ linearly independent constraints of $D$. Hence, $d$ can be represented as a non-negative combination of at most $n - r$ extreme directions of $D$ (which are the same as the extreme directions of $P$).

Hence, $\xhat = y + d$, where $y$ is a BFS of $P$, so we can take $|\support(\lambda)| = 1$ and $|\support(\mu)| \le n - r$. Thus $|\support(\lambda)| + |\support(\mu)| \le n + 1 - r = n + 1 - \rank(B)$, which completes case 1.
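
Here is a small Case 1 instance on the same hypothetical polyhedron (all data chosen by hand): the sketch checks that $d$ lies in the recession cone, is tight at the rows of $B$, and decomposes into at most $n - r$ extreme directions.

```python
import numpy as np

# Case 1 on the hypothetical polyhedron x1 >= 0, x2 >= 0, x1 + x2 >= 1, with x_hat = (0, 2).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
n = A.shape[1]

x_hat = np.array([0.0, 2.0])
y = np.array([0.0, 1.0])                   # BFS of P' (tight at x1 = 0 and x1 + x2 = 1)
d = x_hat - y                              # (0, 1), a direction of P'

B = A[np.isclose(A @ x_hat, b)]            # tight rows at x_hat: [[1, 0]]
r = np.linalg.matrix_rank(B)               # r = 1

assert np.all(A @ d >= -1e-9)              # d lies in the recession cone D = {x : Ax >= 0}
assert np.allclose(B @ d, 0)               # d is tight at the rows of B

# d as a non-negative combination of at most n - r extreme directions: d = 1 * (0, 1).
ext_dirs = np.array([[1.0, 0.0], [0.0, 1.0]])
mu = np.array([0.0, 1.0])
assert np.allclose(mu @ ext_dirs, d)
assert 1 + np.count_nonzero(mu) <= n + 1 - r   # |supp(lambda)| = 1 (just y), total 2 <= 2
```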

Case 2: $\sup\,\{\alpha \ge 0: y + \alpha d \in P'\}$ is finite.

When $i \in T$, $a_i^Td = 0$ (since $Bd = 0$) and $a_i^Ty = b_i$ (since $y \in P'$), so $a_i^T(y + \alpha d) = a_i^Ty = b_i$ for every $\alpha$. If $a_i^Td \ge 0$ for all $i$, then for all $\alpha \ge 0$ we get $a_i^T(y + \alpha d) = a_i^Ty + \alpha a_i^Td \ge a_i^Ty \ge b_i$, so $y + \alpha d \in P'$ for all $\alpha \ge 0$. This is impossible, since we are in case 2. Hence, $\exists k$ such that $a_k^Td < 0$. Choose $k \in \argmin_{i: a_i^Td < 0} (b_i - a_i^Ty)/(a_i^Td)$ and let $\alpha^* = (b_k - a_k^Ty)/(a_k^Td)$. Note that $\alpha^* \ge 0$, since $y \in P'$ gives $b_k - a_k^Ty \le 0$ and $a_k^Td < 0$. Let $\xstar = y + \alpha^* d$. We will show that $\xstar \in P'$.
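The choice of $k$ and $\alpha^*$ is the usual ratio test. Below is a minimal sketch of it, under the assumption that all rows are inequalities (rows in $T$ have $a_i^Td = 0$ and so never enter the test anyway); the data and the helper name `ratio_test` are hypothetical, and the assertions anticipate the verification in the next paragraph.

```python
import numpy as np

def ratio_test(A, b, y, d):
    """Largest alpha >= 0 with A(y + alpha*d) >= b, plus a row k that becomes tight.
    Assumes A y >= b and that some row has a_i^T d < 0 (Case 2)."""
    Ad = A @ d
    slack = b - A @ y                       # <= 0 since y is feasible
    blocking = np.where(Ad < 0)[0]          # rows that are eventually violated
    ratios = slack[blocking] / Ad[blocking]
    j = np.argmin(ratios)
    return ratios[j], blocking[j]           # (alpha_star, k)

# Hypothetical Case 2 data from the running example: y = (0, 1), d = x_hat - y = (0.5, -0.5).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
y = np.array([0.0, 1.0])
d = np.array([0.5, -0.5])

alpha_star, k = ratio_test(A, b, y, d)     # alpha_star = 2, k = 1 (the constraint x2 >= 0)
x_star = y + alpha_star * d                # (1, 0)
assert np.all(A @ x_star >= b - 1e-9)      # x_star is feasible (shown in the next paragraph)
assert np.isclose(A[k] @ x_star, b[k])     # and tight at the new constraint k
```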

When $i \in T$, we have $a_i^Td = 0$. When $a_i^Td = 0$, we get $a_i^T\xstar = a_i^Ty$, so the constraint is satisfied (with equality if $i \in T$). When $a_i^Td > 0$, we get $a_i^T\xstar = a_i^Ty + \alpha^* a_i^Td \ge a_i^Ty \ge b_i$. When $a_i^Td < 0$, the choice of $k$ gives $\alpha^* \le (b_i - a_i^Ty)/(a_i^Td)$, which implies $a_i^T\xstar = a_i^Ty + \alpha^* a_i^Td \ge b_i$. Therefore, $\xstar \in P'$. Furthermore, $a_k^T\xstar = a_k^Ty + \alpha^* a_k^Td = b_k$.

$a_k$ is not a linear combination of the rows of $B$, since $a_i^Td = 0$ for all $i \in T$ but $a_k^Td < 0$. Let $C$ be the matrix whose rows are $\{a_i^T: a_i^T\xstar = b_i\}$. The rows of $C$ include the rows of $B$ (for each $i \in T$, $a_i^T\xstar = a_i^Ty = b_i$) as well as $a_k^T$. Hence, $\rank(C) > \rank(B) = r$.
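
A one-line numerical check of the rank increase on the same hypothetical data ($\xstar = (1, 0)$ from the ratio-test sketch, and $B$ the row tight at $\xhat$):

```python
import numpy as np

# Hypothetical data: B is the row tight at x_hat = (0.5, 0.5); x_star = (1, 0).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
B = np.array([[1.0, 1.0]])
x_star = np.array([1.0, 0.0])

C = A[np.isclose(A @ x_star, b)]           # rows tight at x_star: x2 >= 0 and x1 + x2 >= 1
assert np.linalg.matrix_rank(C) > np.linalg.matrix_rank(B)   # rank(C) = 2 > 1 = rank(B)
```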

Since $\rank(C) \ge r + 1$, the inductive hypothesis applies to $\xstar$: $\exists \lambda^* \in \mathbb{R}^p_{\ge 0}$ and $\exists \mu^* \in \mathbb{R}^q_{\ge 0}$ such that $\xstar = \sum_{i=1}^p \lambda^*_ix^{(i)} + \sum_{i=1}^q \mu^*_id^{(i)}$, $\sum_{i=1}^p \lambda^*_i = 1$, and $|\support(\lambda^*)| + |\support(\mu^*)| \le n + 1 - \rank(C)$. Since $\xhat = y + d \in P'$, for each $i$ with $a_i^Td < 0$ we have $a_i^Ty + a_i^Td \ge b_i$, i.e., $(b_i - a_i^Ty)/(a_i^Td) \ge 1$; hence $\alpha^* \ge 1$. Therefore, \[ \xstar = y + \alpha^*(\xhat - y) \implies \xhat = \frac{1}{\alpha^*}\xstar + \left(1 - \frac{1}{\alpha^*}\right)y. \] Since $\xhat$ is a convex combination of $\xstar$ and the BFS $y$, by the transitivity of convexity, $\xhat$ can be represented as a convex combination of BFSes of $P$ plus a non-negative combination of extreme directions of $P$: take $\lambda = \frac{1}{\alpha^*}\lambda^*$ with $1 - \frac{1}{\alpha^*}$ added to $y$'s entry, and $\mu = \frac{1}{\alpha^*}\mu^*$. Furthermore, $|\support(\lambda)| + |\support(\mu)| \le |\support(\lambda^*)| + |\support(\mu^*)| + 1 \le n + 2 - \rank(C) \le n + 1 - \rank(B)$.
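
Finally, a sketch of how the representation of $\xstar$ is turned into one for $\xhat$ under the parametrization above. The helper `combine`, the index bookkeeping, and the data (which continue the hypothetical Case 2 example, where $\xstar = (1,0)$ happens to be a BFS) are assumptions for illustration.

```python
import numpy as np

def combine(lam_star, mu_star, alpha_star, y_index):
    """From x_star = sum lam*_i x^(i) + sum mu*_i d^(i), build the representation of
    x_hat = (1/alpha_star) x_star + (1 - 1/alpha_star) y, where y is the BFS at y_index."""
    theta = 1.0 / alpha_star                # alpha_star >= 1, so theta lies in (0, 1]
    lam = theta * lam_star
    lam[y_index] += 1.0 - theta             # y absorbs the remaining convex weight
    mu = theta * mu_star                    # extreme-direction weights are just rescaled
    return lam, mu

# Hypothetical Case 2 data: x_star = (1, 0) is the BFS at index 0, y = (0, 1) is at index 1,
# and alpha_star = 2 from the ratio test; x_hat = (0.5, 0.5).
bfs = np.array([[1.0, 0.0], [0.0, 1.0]])
ext_dirs = np.array([[1.0, 0.0], [0.0, 1.0]])
lam_star = np.array([1.0, 0.0])
mu_star = np.array([0.0, 0.0])

lam, mu = combine(lam_star, mu_star, alpha_star=2.0, y_index=1)
assert np.allclose(lam @ bfs + mu @ ext_dirs, [0.5, 0.5])    # recovers x_hat
assert np.isclose(lam.sum(), 1.0)
assert np.count_nonzero(lam) + np.count_nonzero(mu) <= 2     # n + 1 - rank(B) = 2
```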

Dependency for:

  1. LP is optimized at BFS

Info:

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /complex-numbers/conjugation-is-homomorphic
  4. /sets-and-relations/equivalence-relation
  5. Group
  6. Ring
  7. Polynomial
  8. Integral Domain
  9. Comparing coefficients of a polynomial with disjoint variables
  10. Field
  11. Vector Space
  12. Linear independence
  13. Span
  14. Inner product space
  15. Inner product is anti-linear in second argument
  16. Semiring
  17. Matrix
  18. Stacking
  19. System of linear equations
  20. Product of stacked matrices
  21. Matrix multiplication is associative
  22. Reduced Row Echelon Form (RREF)
  23. Matrices over a field form a vector space
  24. Row space
  25. Elementary row operation
  26. Every elementary row operation has a unique inverse
  27. Row equivalence of matrices
  28. Row equivalent matrices have the same row space
  29. RREF is unique
  30. Identity matrix
  31. Inverse of a matrix
  32. Inverse of product
  33. Elementary row operation is matrix pre-multiplication
  34. Row equivalence matrix
  35. Equations with row equivalent matrices have the same solution set
  36. Rank of a matrix
  37. Basis of a vector space
  38. Linearly independent set is not bigger than a span
  39. Homogeneous linear equations with more variables than equations
  40. Rank of a homogenous system of linear equations
  41. Cone
  42. Convex combination and convex hull
  43. Transitivity of convexity
  44. Convex set
  45. Polyhedral set and polyhedral cone
  46. Basic feasible solutions
  47. Condition for existence of BFS in a polyhedron
  48. Point in polytope is convex combination of BFS
  49. Convex hull of a finite number of points in Euclidean space is bounded
  50. Extreme point of a convex set
  51. Extreme point iff BFS
  52. Condition for a point to be extreme
  53. Condition for polyhedral cone to be pointed
  54. Direction of a convex set and Recession cone
  55. Recession cone of a polyhedron
  56. Polyhedron is unbounded iff it has direction
  57. Bounded section of pointed cone
  58. Extreme directions of a convex set
  59. Extreme direction of convex cone as extreme point of intersection with hyperplane
  60. Representing point in pointed polyhedral cone