In Linear Programming: A Modern Integrated Analysis, both boundary (simplex) and interior point methods are derived from the complementary slackness theorem and, unlike in most books, the duality theorem is derived from Farkas's Lemma, which is in turn proved as a convex separation theorem. The tedious simplex-based proof of duality is thus avoided.
A new inductive proof of Kantorovich's Theorem on the convergence of Newton's method is offered. Of the boundary methods, the book presents the (revised) primal and the dual simplex methods. An extensive discussion is given of the primal, dual, and primal-dual affine scaling methods. In addition, a single chapter covers a proof of convergence under degeneracy, a bounded-variable variant, and a superlinearly convergent variant of the primal affine scaling method. Polynomial barrier (path-following homotopy) methods and the projective transformation method are also covered in the interior point chapter. Besides the popular sparse Cholesky factorization and the conjugate gradient method, new methods based on LQ factorization and iterative techniques are presented in a separate chapter on implementation.
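To give a flavor of the affine scaling family mentioned above, here is a minimal sketch of the primal affine scaling iteration in Python. It is not drawn from the book; the function name, the step fraction gamma, the stopping tolerance, and the tiny example problem are all illustrative assumptions.

import numpy as np

def primal_affine_scaling(A, b, c, x0, gamma=0.6, tol=1e-8, max_iter=500):
    # Minimize c^T x subject to A x = b, x >= 0,
    # starting from a strictly feasible interior point x0 (A x0 = b, x0 > 0).
    x = np.asarray(x0, dtype=float).copy()
    assert np.allclose(A @ x, b) and np.all(x > 0), "x0 must be strictly feasible"
    w = None
    for _ in range(max_iter):
        X2 = np.diag(x ** 2)                            # scaling matrix X^2
        w = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)   # dual estimate
        r = c - A.T @ w                                 # reduced costs
        if np.all(r >= -tol) and x @ r < tol:           # approximate optimality: gap ~ x^T r
            break
        d = -X2 @ r                                     # affine scaling search direction
        if np.all(d >= 0):
            raise ValueError("objective appears unbounded below")
        neg = d < 0
        alpha = gamma * np.min(-x[neg] / d[neg])        # step fraction keeps x strictly positive
        x = x + alpha * d
    return x, w

# Tiny illustrative problem: minimize -x1 - 2*x2 subject to x1 + x2 + s = 4, all variables >= 0.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([4.0])
c = np.array([-1.0, -2.0, 0.0])
x0 = np.array([1.0, 1.0, 2.0])      # strictly feasible interior starting point
x_opt, w_opt = primal_affine_scaling(A, b, c, x0)
print(np.round(x_opt, 4), c @ x_opt)   # expected: roughly [0, 4, 0] with objective -8

The sketch forms the scaled normal equations A X^2 A^T w = A X^2 c directly with a dense solve; the book's implementation chapter discusses the sparse Cholesky, conjugate gradient, and LQ-based alternatives for this step.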