MATH 535 Optimization I

**Instructor:** Hemanshu Kaul

**Office:** 125C, Engineering 1

**Phone:** (312) 567-3128

**E-mail:** kaul [at] iit.edu

**Time:** 11:25am, Monday and Wednesday.

**Place:** 121, Engg. 1 Bldg.

**Office Hours:** 1pm-2pm Monday and Wednesday; walk-ins; and by appointment. Emailed questions are also encouraged.

**TA Office Hours:** Chris Mitillos, cmitillo [at] hawk.iit.edu, 12noon-3pm Friday, 129, E1 bldg.

|Course Information| |Advice| |Announcements| |Examinations| |Homework| |Class Log| |Books| |MATLAB/ Mathematica| |Useful Links|


The official course syllabi: MATH 435, MATH 535.

Excellent advice by Doug West on how to write homework solutions for proof-based problems.

Why do we have to learn proofs?

Understanding Mathematics - a study guide

On a more abstract note, here is a discussion of Language and Grammar of Mathematics - which is what you are starting to learn in a course like this.

Excellent advice for math majors and graduate students, by Terry Tao, 2006 Fields medallist. Required reading.

*Wednesday, 2/26*: HW #6 has been uploaded. It is due a week after Mid-term Exam #1.

*Monday, 2/24*: The date for Mid-term Exam #2 has been moved by two days. Look below in the appropriate section.

*Wednesday, 2/12*: Dates for Mid-term Exams #1 and #2 and the Final Exam have been announced. Look below in the appropriate section.

*Tuesday, 1/14*: Check this webpage regularly for homework assignments, announcements, etc.

*Exam #1*: Wednesday, 2/26. Syllabus: based on topics covered up to and including 2/12 [total 9 lectures].

*Exam #2*: Monday, 4/14. Syllabus: based on topics covered from 2/17 up to 4/2, including both dates [total 11 lectures].

*Final Exam*: Tuesday, May 6, 8am-10am. Syllabus: all topics covered during the semester.

__Homework #0__

*Linear Algebra Background*: Read and understand the Linear Algebra background (MATH 332 or its equivalent is a pre-requisite for this course) from Section 1.5 within the first 10 days of the semester.

You need to be familiar with topics like: System of linear equations, Row operations for solving systems of linear equations, **R**^n as a vector space, Subspaces, Span, Linear Independence, Basis, Dimension, Nullspace of a matrix, Matrix rank, Affine subspaces.

You can pick up any standard undergraduate linear algebra textbook from the IIT library, like Anton, Elementary Linear Algebra, 9th/10th ed., or Lay, Linear Algebra and its Applications, 3rd ed., and follow along with the topics covered in my IIT course, MATH 332.

Another possibility is to use the online Linear Algebra lecture notes. Focus on the chapters on Systems of Equations and Matrices, and on Vector Spaces. You can practice simple problems there.

The third option is to use the MIT linear algebra site and study at least Lectures 1-11 listed in the syllabus. You can also use their video lectures (focus on the first 11 lectures).
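If you want a quick self-check on some of these background topics (row operations, rank, nullity, nullspace), here is a small illustrative sketch, not part of the course materials: it row-reduces a made-up matrix with exact rational arithmetic and reads off the rank and a nullspace basis from the free variables.

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of rows) to reduced row-echelon form.
    Returns the reduced matrix and the list of pivot column indices."""
    A = [[Fraction(x) for x in row] for row in rows]
    m, n = len(A), len(A[0])
    pivots = []
    r = 0
    for c in range(n):
        # Find a pivot in column c at or below row r.
        piv = next((i for i in range(r, m) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        # Eliminate column c from every other row (above and below).
        for i in range(m):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * p for a, p in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

def rank_and_nullspace(rows):
    """Rank of A and a basis for its nullspace (one vector per free variable)."""
    R, pivots = rref(rows)
    n = len(rows[0])
    free = [c for c in range(n) if c not in pivots]
    basis = []
    for f in free:
        v = [Fraction(0)] * n
        v[f] = Fraction(1)
        for r, p in enumerate(pivots):
            v[p] = -R[r][f]
        basis.append(v)
    return len(pivots), basis

# A made-up 2x3 matrix: rank 2, so nullity = 3 - 2 = 1.
A = [[1, 2, 3],
     [2, 4, 7]]
rank, null_basis = rank_and_nullspace(A)
print(rank)        # → 2
print(null_basis)  # → one vector, [-2, 1, 0], spanning the nullspace
```

Note how the rank-nullity theorem shows up directly: the number of pivot columns plus the number of free columns equals n.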

Of course, feel free to consult with me on any of these topics that you have difficulty with. Stop by my office during office hours.

*Monday, 1/13*: Read and understand the Examples from Section 1.2.

**Homework #1 [pdf].** Due Wednesday, 1/22. Solutions distributed on Monday, 1/27.

__Homework #2__: SUBMIT BT 1.4, BT 1.5ab, BT 1.6. Due Wednesday, 1/29. Solutions distributed on Monday, 2/3.

(Comment: In 1.5b, proving the following two statements is enough to finish the proof; you need to submit the proof of only ONE of them. Below, P0 refers to the given optimization problem, P1 to the first reformulation, and P2 to the second reformulation. (i) Given a feasible solution of P0, find a feasible solution of P1 with cost at most the cost in P0; and, given a feasible solution of P1, find a feasible solution of P0 with cost at most the cost in P1. (ii) Given a feasible solution of P0, find a feasible solution of P2 with cost at most the cost in P0; and, given a feasible solution of P2, find a feasible solution of P0 with cost at most the cost in P2.)

__Homework #3__: SUBMIT BT 2.4 (and construct a non-empty polyhedron with no extreme points), BT 2.6a, BT 2.10bcde, BT 2.16. Due Wednesday, 2/5. Solutions distributed on Monday, 2/10.

(Comment: For all these problems, assume that we have already proved that every non-empty polyhedron in standard form has at least one extreme point.)

*Wednesday, 2/5*: Read and understand the definition and example of degenerate basic solutions from Section 2.4 on page 59.

__Homework #4__: SUBMIT BT 2.3 (only state, do not prove, an analog of Theorem 2.4), BT 2.9, BT 2.17. Due Wednesday, 2/12. Solutions distributed on Monday, 2/17.

**Homework #5 [pdf].** You can use calculators, Mathematica, etc. for calculations if needed (please specify whenever used), but you still have to show **all** the intermediate steps. Due Wednesday, 2/19. Solutions distributed on Wednesday, 2/19.

*Wednesday, 2/19*: Read and understand Examples 3.5 and 3.6. Read and understand Example 3.8 from Section 3.5, and the examples for the two-phase method in the class log below.

__Homework #6__: SUBMIT one of (BT 3.12 or BT 3.21a), BT 3.17, BT 3.19 (in part (a), they are really asking for multiple optimal bases, not multiple optimal solutions), BT 3.20abc. Also think about (but don't submit) BT 3.24. Due Wednesday, 3/5. Solutions distributed on Wednesday, 3/12.

*Thursday, 3/5*: Read and understand Example 4.6 from Section 4.3.

__Homework #7__: SUBMIT [MATH 435 students submit 5 out of 6; MATH 535 students submit all 6 problems] BT 4.1, BT 4.2, BT 4.4, BT 4.5abc, BT 4.7, BT 4.8. Due Thursday, 3/12. Solutions distributed on Monday, 3/24.

*Wednesday, 3/12*: Read the details of Examples 4.8 and 4.9 from Section 4.5.

__Homework #8__: SUBMIT BT 4.10, BT 4.11, BT 4.25, and one of {BT 4.21, BT 4.26}. Due Wednesday, 3/26. Solutions distributed on Monday, 3/31.

*Thursday, 3/8*: Read the details of Examples 5.1, 5.2, and 5.3 from Section 5.1. Read the short discussions on changes in c and in a nonbasic column of A.

__Homework #9__: SUBMIT BT 4.27, BT 5.1, BT 5.4 [Comment: B(delta) is B + delta.E; see BT 5.2 for the definition. Hint: calculate B(delta)^-1 explicitly], BT 5.5abde. Due Wednesday, 4/2. Solutions distributed on Monday, 4/7.

*Wednesday, 4/2*: For your knowledge, read the statements of Theorem 4.15 and Corollary 4.4, and the solution of Example 4.10, from Section 4.9.

__Homework #10__: SUBMIT BT 5.9, BT 5.13 (review page 159, Section 4.5, with regard to uniqueness), BT 5.14abc, BT 6.1. Due Wednesday, 4/9. Solutions distributed on Wednesday, 4/9.

*Wednesday, 4/9*: Read Example 6.2 from Section 6.4, if you missed the class.

*Wednesday, 4/16*: Read Example 6.5 from Section 6.5.

__Homework #11__: SUBMIT BT 6.2, BT 6.4, BT 6.6, BT 6.8, BT 6.9. Due Wednesday, 4/23. Solutions to be distributed on Monday, 4/28.

*Wednesday, 4/23*: Read and understand Example 10.3 from Section 10.1.

__Homework #12__: SUBMIT BT 6.11, BT 10.2, BT 10.5, BT 11.1 [Detailed instructions. Part (a): use a calculator if you like. Part (b): list all the points in X and then show the convex hull in the figure. Part (c): show the final simplex tableau of the LP relaxation before you pick the first Gomory cut, then show the final tableau for the LP that includes this cut, and say what you would do next. Part (d): it is enough to draw the enumeration tree the way I drew it in class, and to use a calculator to solve the intermediate LPs. Part (e): use the statement of Theorem 11.4 to write the LP that finds Z_D, but before that write the convex hull of the respective set X algebraically. Part (f): optional.] Due Friday, 5/2, at 1pm in the mailbox in 210, E1, or to the TA in 129, E1. Pick up the solutions from the TA in 129, E1, when you hand in the HW.

*Monday, 4/28*: Read Examples 11.4 and 11.5 from Section 11.2 as further illustrations of Branch and Bound.
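Homework #12 part (d) asks you to run branch and bound by hand, pruning the enumeration tree with LP-relaxation bounds. Purely as an illustration of that pruning logic (this is not course code, and the tiny 0/1 knapsack instance below is made up), here is a sketch where the bound at each node comes from the LP relaxation, which for a knapsack is just greedy filling by value/weight ratio with one fractional item:

```python
from fractions import Fraction

# Made-up 0/1 knapsack: maximize sum(v[i]*x[i]) s.t. sum(w[i]*x[i]) <= W, x binary.
values  = [10, 13, 7]
weights = [4, 6, 3]
W = 9

def lp_bound(fixed):
    """Upper bound from the LP relaxation at a node where the variables in
    `fixed` are pinned to 0/1: fill the remaining capacity greedily by
    value/weight ratio, allowing one fractional item."""
    cap = W - sum(weights[i] for i, x in fixed.items() if x == 1)
    if cap < 0:
        return None  # node is infeasible
    val = Fraction(sum(values[i] for i, x in fixed.items() if x == 1))
    free = sorted((i for i in range(len(values)) if i not in fixed),
                  key=lambda i: Fraction(values[i], weights[i]), reverse=True)
    for i in free:
        take = min(Fraction(1), Fraction(cap) / weights[i])
        val += take * values[i]
        cap -= take * weights[i]
    return val

best_val, best_x = 0, None

def branch(fixed):
    """Depth-first branch and bound: prune when the LP bound cannot beat
    the incumbent (best integral solution found so far)."""
    global best_val, best_x
    bound = lp_bound(fixed)
    if bound is None or bound <= best_val:
        return  # prune: infeasible, or bound no better than incumbent
    if len(fixed) == len(values):  # all variables fixed: integral leaf
        best_val = sum(values[i] for i, x in fixed.items() if x == 1)
        best_x = [fixed[i] for i in range(len(values))]
        return
    i = len(fixed)           # branch on the next undecided variable
    branch({**fixed, i: 1})  # x_i = 1 subtree
    branch({**fixed, i: 0})  # x_i = 0 subtree

branch({})
print(best_val, best_x)  # → 20 [0, 1, 1]
```

On this instance the node with x_0 = 0, x_1 = 0 is pruned (its bound, 7, is below the incumbent 20), which is exactly the kind of pruning you should show in your enumeration tree.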

*Monday, 1/13*: Linear Program and its various forms, Basic terminology, Matrix and vector forms for setting up an LP, Linear and Affine Functions, Manufacturing example to set up as an LP, Setting up the Transportation problem as an LP. (From Section 1.1 and elsewhere)

*Wednesday, 1/15*: Discussion of the Transportation Problem, Standard form of an LP, How to convert any LP into standard form, When two LPs (or optimization problems) are equivalent; Why standard form? Its relation to the general solution of a system of m linear equations in n variables, and the related concepts of null space of A, nullity, rank, etc.; Geometric description of the feasible set of an LP, Geometric method of solution using a family of parallel lines perpendicular to c, the cost vector, Different possible solutions for an LP. (From Sections 1.1, 1.4, 1.5, and elsewhere)

*Monday, 1/20*: MLK Jr. Day

*Wednesday, 1/22*: Discussion of course organization, syllabus, and policies. Converting an optimization problem with a piecewise linear objective function into an LP, Removing the absolute value function from an optimization problem (two methods), Linear-fractional programming problem, Geometric definitions: Polyhedron, Polytope, Hyperplane, Half-space, A polyhedron is the intersection of half-spaces, Convexity of a set. (From Sections 1.3, 2.1)

*Monday, 1/27*: Convex combination and convex hull of a finite set of vectors, A polyhedron and a convex hull are convex sets, Definition and geometric intuition of extreme point, vertex, and basic feasible solution of a polyhedron, Active constraints, Linear independence of constraints. (From Section 2.2)

*Wednesday, 1/29*: Properties of n active constraints, Basic solutions (and their possible infeasibility), Proof of the equivalence of extreme point, vertex, and BFS. (From Sections 2.2 and 2.3)

*Monday, 2/3*: Finiteness of the number of basic solutions, Adjacent basic solutions (and the corresponding geometric intuition), Polyhedron in standard form, Characterization of basic solutions of a standard form polyhedron and how it can be used to construct basic solutions for a standard form LP, Example and definitions of basic variables, basic columns, and the basis matrix, Using the basis matrix to solve for the basic variables and construct a basic solution, More examples of constructing basic solutions, Relation between bases and basic solutions, Adjacent basic solutions with examples. (From Sections 2.2 and 2.3)

*Wednesday, 2/5*: Theorem that shows why we can assume all rows of A are linearly independent, Degenerate and non-degenerate basic solutions with examples and geometric intuition, Degenerate basic solutions in a standard form polyhedron, Proof of the "Characterization of basic solutions of a standard form polyhedron", A non-empty polyhedron has an extreme point iff it does not contain a line (without proof) and its consequences for standard form and bounded polyhedra, If a standard form LP has an optimal solution then it has an optimal BFS, A local search algorithm for solving an optimization problem, Feasible directions. (From Sections 2.3 and 2.4)

*Monday, 2/10*: Feasible directions, Analysis of how to choose the direction d and step length theta to move from the current BFS to a new one: the jth basic direction from the equality constraints, Change in the value of the cost function and the corresponding reduced cost, Reduced cost of a basic variable, For a given feasible direction, how to choose theta to maximize the step length. (From Sections 3.1 and 3.2)

*Wednesday, 2/12*: How the choice of the step length (and feasible direction) implies that a basic variable becomes zero and leaves the basis while a non-basic variable becomes non-zero and enters the basis, Change of basis and the associated BFS after an iteration of the simplex algorithm, Optimality conditions for a basis, The five-step iteration of the simplex algorithm, The simplex algorithm under degeneracy, The difference between moving from BFS to BFS and moving from basis to basis, Cycling and how it can be avoided, Pivoting rules, Bland's rule, Lexicographic rule. (From Sections 3.2 and 3.4)

*Monday, 2/17*: How to minimize the computations associated with running the simplex algorithm, particularly calculating the inverse of the updated basis matrix, Full tableau form of the simplex algorithm, Interpretation of the tableau entries and how they are updated using elementary row operations, Two detailed examples of the full tableau simplex algorithm. (From Sections 3.2 and 3.3 and elsewhere)

*Wednesday, 2/19*: Lexicographic order and the lexicographic pivoting rule, Why the lexicographic pivoting rule gives a unique pivot row, How to set up a tableau such that all the rows (except the zeroth) are lex positive, Proof that no cycling occurs when using the lexicographic pivoting rule, Revised simplex algorithm and how it differs from the full tableau simplex, Difference in their number of operations and number of entries to maintain, How to find an initial BFS for a standard form LP, Two-phase method using artificial variables. (From Sections 3.3, 3.4, 3.5, and elsewhere)

*Monday, 2/24*: The three possible conclusions of Phase 1: positive optimal cost and why that implies an infeasible original LP, zero optimal cost with no artificial variable in the basis, zero optimal cost with some artificial variables in the basis, How to remove artificial variables from the optimal basis of Phase 1 (linear independence of the old basis columns from A and the new incoming basis column from A), Several examples illustrating the two-phase method, How to check for optimality (motivation), Using Lagrangean relaxation (the Lagrange multiplier method) to define a dual LP. (From Sections 3.5, 4.1, and elsewhere)

*Wednesday, 2/26*: Mid-term Exam #1.

*Monday, 3/3*: Using Lagrangean relaxation (the Lagrange multiplier method) to define a dual LP that maximizes a lower bound on the optimal cost of the primal LP, General rules for formulating the dual of a minimization problem: the relation between the constraints and variables/rows and columns of these two LPs, Dual(Dual(Primal)) = Primal, Duals of two equivalent LPs are also equivalent, Weak Duality Theorem: its proof, and how it allows one to check for optimality using a feasible solution of the dual. (From Sections 4.2 and 4.3)

*Wednesday, 3/5*: Strong Duality Theorem and how the optimal solution of the dual is defined using the optimal solution of the primal in standard form, Nine possibilities for the solutions of a primal and its dual, Complementary slackness theorem and how it can be used to check whether a feasible solution of the primal is optimal, Detailed example of the application of complementary slackness. Discussion of Mid-term Exam #1. (From Section 4.3)

*Monday, 3/10*: Continuation of the discussion of Mid-term Exam #1. The underlying principles of the primal simplex and dual simplex algorithms. (From Section 4.5)

*Wednesday, 3/12*: Basic (but not necessarily feasible) solutions for the primal and dual using a basis matrix, Dual feasibility iff reduced costs are non-negative, Pivoting rule and termination conditions for the dual simplex, Applying the dual simplex when a dual BFS is easily available, for example when the constraint vector of the primal is changed, An example of running the dual simplex, The underlying properties of the dual LP: basis to dual basic solution, feasibility of the dual basic solution, active dual constraints, degeneracy of the dual basic solution; Example illustrating the geometry of the dual simplex, Cone of vectors, Geometric interpretation of Farkas' lemma using the idea of positive projections, Farkas' Lemma. (From Sections 4.5 and 4.6)

*Monday, 3/24*: Farkas' Lemma and its proof using strong duality, A Farkas-Lemma-type theorem proved using strong and weak duality, Discussion of changes in feasibility and optimality of the current optimal basis and BFS when a new variable is added. (From Sections 4.6 and 5.1)

*Wednesday, 3/26*: Discussion of changes in feasibility and optimality of the current optimal basis and BFS, and how to handle such changes, when a new inequality constraint is added, when a new equality constraint is added (two approaches), and when the requirement vector or the cost vector is changed. (From Section 5.1)

*Monday, 3/31*: Discussion of changes in feasibility and optimality of the current optimal basis and BFS, and how to handle such changes, when an entry of A is changed or the cost vector is changed, Applying the simplex algorithm to solve a parametric LP, Global dependence of the optimal cost on the requirement vector b. (From Sections 5.1, 5.5, and 5.2)

*Wednesday, 4/2*: Global dependence of the optimal cost on the requirement vector b (convexity) and on the cost vector c (concavity); Example illustrating the application of the dual simplex algorithm to solve a parametric LP with the parameter in the requirement vector; The principle of delayed column generation, The cutting stock problem: setup, relaxation to an LP, solution using the revised simplex and delayed column generation, initial BFS, iteration, the (Knapsack Problem) subproblem that calculates the reduced cost and generates the corresponding constraint column. (From Sections 5.4, 5.5, 6.1, 6.2)

*Monday, 4/7*: Further details of the solution of the cutting stock problem using the revised simplex and delayed column generation; Delayed column generation with retained columns and how an optimal solution for a restricted LP can be optimal for the full LP, Cutting plane methods as delayed constraint generation: the geometric intuition and its relation to delayed column generation for the dual; Dantzig-Wolfe decomposition: the standard form of an LP suitable for decomposition, decomposition into a master problem using extreme points of the polyhedra associated with the disjoint sets of variables. (From Sections 6.1, 6.2, 6.3, 6.4)

*Wednesday, 4/9*: Dantzig-Wolfe decomposition: why we don't need to find the extreme points of the polyhedra to solve the master problem, Using the revised simplex and delayed column generation by defining a subproblem for each underlying polyhedron, which automatically finds a negative reduced cost and generates the appropriate extreme point needed for the corresponding constraint column in the master LP, Detailed exposition of Example 6.2 for a computational application of the D-W decomposition method. (From Section 6.4)

*Monday, 4/14*: Mid-term Exam #2.

*Wednesday, 4/16*: How to find the initial BFS for the master LP, Generalization to t subproblems, Computational experience, Storage comparison between the straightforward simplex and the D-W decomposition method, Bounds for the optimal cost of the Dantzig-Wolfe master problem in terms of the optimal costs of the subproblems; Two-stage stochastic LP: an elaborate example, its mathematical description, and the block structure of the problem, The underlying idea of Benders decomposition as delayed constraint generation for such problems, Detailed derivation of Benders decomposition: the standard form of an LP suitable for decomposition, decomposition into a master problem using extreme points of the fixed dual polyhedra. (From Sections 6.4 and 6.5, and elsewhere)

*Monday, 4/21*: Wrap-up of the discussion of Benders decomposition and how the violated constraint is found; Mixed integer programs and their formulations: using binary variables to model various scenarios, examples. (From Sections 10.1 and 10.2)

*Wednesday, 4/23*: The distance between the feasible integral solution closest to the optimal solution of the LP relaxation and the optimal solution of the IP can be very large. Discussion of Mid-term Exam #2. (From Sections 10.1 and 10.2)

*Monday, 4/28*: Detailed example of the formulation of an MIP for a sequencing problem with setup times, LP relaxation, its relation to the IP and its optimal solution, Cutting plane methods, Gomory cutting planes and their proof of correctness. (From Sections 10.2 and 11.2)

*Wednesday, 4/30*: A detailed IP example illustrating the branch and bound method for solving IPs, including the use of bounds to prune the enumeration tree, General description of B&B, Duality theory for IPs: Lagrangean relaxations, the associated optimal costs, and their relation to the optimal cost of the IP, The Lagrangean dual and its optimal cost as a better lower bound for the optimal cost of the IP (weak duality for IPs), Rewriting the Lagrangean dual as a linear program, with proof, Understanding how to apply this theorem to solve the Lagrangean dual using properties of the convex hull, Describing the convex hull of integer points in terms of linear constraints, Relationship between the optimal costs of the IP, the Lagrangean dual, and the LP relaxation. (From Section 11.4)
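The full tableau simplex method covered in the 2/17 lecture can be sketched in a few dozen lines. This is only an illustrative sketch, not course code: it assumes you are handed a starting feasible basis (as you are when slack variables form one), it uses Bland's smallest-index rule for the entering variable so cycling cannot occur, and the example LP at the bottom is made up.

```python
from fractions import Fraction

def simplex(c, A, b, basis):
    """Full-tableau simplex for min c.x s.t. Ax = b, x >= 0, starting from a
    given feasible basis (list of column indices). Returns (optimal cost,
    optimal x) or raises ValueError on an unbounded LP."""
    m, n = len(A), len(c)
    F = Fraction
    # Build the tableau rows [B^-1 b | B^-1 A] by row-reducing on the basis columns.
    T = [[F(b[i])] + [F(x) for x in A[i]] for i in range(m)]
    for r, j in enumerate(basis):
        T[r] = [x / T[r][1 + j] for x in T[r]]
        for i in range(m):
            if i != r and T[i][1 + j] != 0:
                T[i] = [a - T[i][1 + j] * p for a, p in zip(T[i], T[r])]
    while True:
        # Reduced costs c_j - c_B . (B^-1 A_j); the basis is optimal when all >= 0.
        cb = [F(c[j]) for j in basis]
        red = [F(c[j]) - sum(cb[i] * T[i][1 + j] for i in range(m))
               for j in range(n)]
        enter = next((j for j in range(n) if red[j] < 0), None)  # Bland's rule
        if enter is None:
            x = [F(0)] * n
            for i, j in enumerate(basis):
                x[j] = T[i][0]
            return sum(F(c[j]) * x[j] for j in range(n)), x
        # Min-ratio test picks the leaving row; no positive entry => unbounded.
        ratios = [(T[i][0] / T[i][1 + enter], i)
                  for i in range(m) if T[i][1 + enter] > 0]
        if not ratios:
            raise ValueError("LP is unbounded")
        _, r = min(ratios)
        # Pivot: make the entering column the unit vector e_r.
        T[r] = [x / T[r][1 + enter] for x in T[r]]
        for i in range(m):
            if i != r and T[i][1 + enter] != 0:
                T[i] = [a - T[i][1 + enter] * p for a, p in zip(T[i], T[r])]
        basis[r] = enter

# Made-up example: min -x1 - x2 s.t. x1 + 2*x2 + s1 = 4, 3*x1 + x2 + s2 = 6,
# all variables >= 0. The slack columns (indices 2, 3) give an obvious
# starting basis, so no two-phase step is needed here.
cost, x = simplex(c=[-1, -1, 0, 0],
                  A=[[1, 2, 1, 0], [3, 1, 0, 1]],
                  b=[4, 6],
                  basis=[2, 3])
print(cost, x)  # optimal cost -14/5 at x = (8/5, 6/5, 0, 0)
```

Using exact `Fraction` arithmetic keeps the tableau entries identical to what you would compute by hand, which makes the sketch handy for checking tableau homework such as HW #5.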

For an alternate point-of-view and for additional applications, refer to the following book:

LP book by Vanderbei

MATLAB - getting started at IIT, by Greg Fasshauer

A crash course

MATLAB Tutorial

Older Guide I

Older Guide II

Mathematica Tutorials.

A Beginner's guide to Mathematica.

Introduction to Mathematica.

Linear Programming at Wikipedia

OR Models

NEOS Optimization Guide

Mathematical Programming Glossary

Mathworld-Optimization

Myths and Counterexamples