MATH 435 Linear Optimization
MATH 535 Optimization I
Instructor: Hemanshu Kaul
Office: 234B, Engineering 1
Phone: (312) 567-3128
E-mail: kaul [at] math.iit.edu
Time: 1:50pm, Tuesday and Thursday.
Place: 225, Stuart Bldg.
Office Hours: Tuesday 3:30-4:30pm and Wednesday 2:30-3:30pm, walk-ins, and by appointment. Emailed questions are also encouraged.
|Course Information| |Announcements| |Examinations| |Homework| |Class Log| |Books| |MATLAB| |Links|
Course Information:
The Course Information Handout has an extensive description of the course - topics, textbook, student evaluation policy, as well as other relevant information. Read it carefully!
What is this course really about? Required reading.
The official course syllabi: MATH 435, MATH 535.
Class Announcements:
- Thursday, 5/8 : Final Exam is coming next week. I will be available for consultation, picking up HWs and HW solutions, etc. tomorrow - Friday, 5/9, between 2pm and 4pm. You can email with questions or to schedule an appointment for next week.
- Thursday, 4/24 : HW # 11 has been uploaded. It is due in a week.
- Friday, 4/18 : EXAM # 2 on Thursday. Check the appropriate section below for details (also the email sent today).
- Friday, 4/11 : HW # 10 has been uploaded. This is the last HW before Exam#2 (less than two weeks away).
- Friday, 4/4 : HW # 9 has been uploaded. The problems are not difficult but you will need to be very clear in your concepts to solve them easily. So, start working on them right away.
- Saturday, 3/29 : HW # 8 has been uploaded.
- Wednesday, 3/5 : HW # 6 has been uploaded.
- Tuesday, 3/4 : EXAM # 1 on Thursday. Check the appropriate section below for details (also the email sent last week).
- Thursday, 2/21 : Deadline for Math 535 project is only a week away. Exam #1 is two weeks away.
- Tuesday, 2/12 : Shortened version of HW#3 has been uploaded.
- Friday, 2/1 : HW#1 solutions were emailed. Future HW solutions will also be distributed in class or through email.
- Wednesday, 1/30 : The webpage has been updated drastically.
- Wednesday, 1/30 : Make sure your IIT email is working. I sent some comments on HW#1 through email. I will send such emails regularly.
- Tuesday, 1/29 : The corrected version of HW#1 has been uploaded.
- Tuesday, 1/22 : Check this webpage regularly for homework assignments, announcements, etc.
Examinations:
- Exam # 1 : Thursday, March 6th @ 1:30pm. Topics: Based on topics covered up to and including Thursday, February 21st [total 10 lectures].
- Exam # 2 : Thursday, April 24th @ 1:30pm. Topics: Based on topics covered from and including February 26th and up to and including Thursday, April 10th [total 11 lectures].
- Final Exam : Friday, May 16th, 8:00am to 10:00am. Topics: Everything done during the semester (except the last lecture), with slightly more emphasis on topics covered after April 10th.
Homework Assignments:
- Linear Algebra Background : Read and understand the Linear Algebra background (MATH 332 or its equivalent is a prerequisite for this course) from Section 1.5 within the first 10 days of the semester.
You need to be familiar with topics like - System of linear equations, Row operations for solving systems of linear equations, R^n as a vector space, Subspaces, Span, Linear Independence, Basis, Dimension, Nullspace of a matrix, Matrix rank, Affine subspaces.
I would suggest looking at MIT-Linear Algebra and at the list of lectures therein. You need to study and understand lectures 1-11. You can use the related video lectures to help you study.
If you have your Math 332 textbook, you can study from it as well. Here is a link to my Math 332 webpage from Fall 2006.
Of course, feel free to consult with me on any of these topics that you have difficulty with. Stop by my office during office hours.
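Since this background review leans on row operations, rank, and nullspace, here is a minimal Python sketch (a hypothetical helper, not part of the course materials) that row-reduces a matrix with exact rational arithmetic; the rank is the number of pivots, and nullity follows from the rank-nullity theorem.

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of lists) to reduced row echelon form
    using exact rational arithmetic; returns (rref, pivot_columns)."""
    A = [[Fraction(x) for x in row] for row in rows]
    m, n = len(A), len(A[0])
    pivots = []
    r = 0
    for c in range(n):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, m) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]        # scale pivot row
        for i in range(m):
            if i != r and A[i][c] != 0:           # eliminate column c elsewhere
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

# a made-up example: the second row is twice the first, so rank is 2
A = [[1, 2, 3],
     [2, 4, 6],
     [1, 0, 1]]
R, piv = rref(A)
rank = len(piv)
print(rank, 3 - rank)   # rank 2, nullity 1 (rank-nullity theorem)
```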
- Tuesday, 1/22 : Read and understand Examples from Section 1.2.
- Homework #1. Due Thursday, 1/31. HW#1 solutions emailed on 2/1, Friday.
- Tuesday, 1/29 : Read and understand proof of Theorem 2.1 from Section 2.1.
- Homework #2. In BT 1.5b, you have to submit the proof of either (i) or (ii) (as defined in the HW). Due Thursday, 2/7. HW#2 solutions distributed on 2/21, Thursday.
- Thursday, 2/7 : Read and understand the statement of Theorem 2.2 from Section 2.2.
- Homework #3 : SUBMIT BT 2.9 and BT 2.17. Also, think about BT 2.3 and BT 2.10. Due Thursday, 2/21. HW#3 solutions distributed on 2/26, Tuesday.
- Homework #4. You can use MAPLE/MATLAB etc. for calculations if needed (please specify whenever used). Due Thursday, 2/21. HW#4 solutions distributed on 2/26, Tuesday.
- Tuesday, 2/19 : Read and understand proof of Theorem 3.2 from Section 3.2. Also make sure you understand the statement and the importance of Theorems 3.1 and 3.3 (as discussed in class).
- Thursday, 2/21 : Read and understand Examples 3.5 and 3.6.
- Homework #5. You can use Calculator/MAPLE/MATLAB etc. for calculations if needed (please specify whenever used). Due Thursday, 2/28. HW#5 solutions distributed on 2/28, Thursday.
- Tuesday, 2/26 : Read and understand Examples 3.8 and 3.9 from section 3.5.
- Homework #6 : SUBMIT BT 3.17, BT 4.1, BT 4.2, BT 4.4, BT 4.5, BT 4.8. Due Thursday, 3/13. HW#6 solutions distributed on 3/13, Thursday.
- Tuesday, 3/4 : Read and understand Example 4.6 from section 4.3, and compare it to the example done in class.
- Homework #7 : SUBMIT BT 4.7, BT 4.10, one of {BT 4.21, BT 4.26}, BT 4.27. Due Thursday, 3/27. HW#7 solutions distributed through email on 4/11, Friday.
- Thursday, 3/13 : Read the details of Examples 4.8 and 4.9 from section 4.5.
- Thursday, 3/27 : Read the details of Examples 5.2 and 5.3 from section 5.1.
- Homework #8 : SUBMIT BT 5.1, BT 5.4 (B(delta) is B+delta.E, see BT 5.2), BT 5.5abde. Due Friday, 4/4, 2pm. HW#8 solutions distributed through email on 4/11, Friday.
- Homework #9 : SUBMIT BT 5.9, BT 5.13, BT 5.14abc (think about part d), BT 6.1. Due Thursday, 4/10. HW#9 solutions distributed on 4/15, Tuesday.
- Thursday, 4/10 : Read Example 6.2 from section 6.4 if you missed the class.
- Homework #10 : SUBMIT BT 6.2, one of {BT 6.4, BT 6.5}, BT 6.6, BT 6.8. Due Thursday, 4/17. HW#10 solutions distributed through email on 4/18, Friday.
- Tuesday, 4/22 : Read and understand Example 10.3 from section 10.1.
- Homework #11 : SUBMIT BT 6.9, BT 6.11, BT 10.2, BT 10.3, BT 10.5. Due Thursday, 5/1. HW#11 solutions distributed in class on 5/8, Thursday.
- Thursday, 5/1 : Read Examples 11.4 and 11.5 from section 11.2 as further illustrations of Branch and Bound.
- Homework #12 : SUBMIT BT 11.1 [Detailed Instructions - part a: use a calculator if you like; part b: list all the points in X and then show the CH in the figure; part c: show the final simplex tableau of the LP relaxation before you pick the first Gomory cut, then show the final tableau for the LP which includes this cut, and say what you would do next; part d: It's enough to draw the enumeration tree the way I drew in class and use a calculator to solve the intermediate LPs; parts e and f: Use the statement of Theorem 11.4 to write the LP that finds Z_D, but before that write the convex hull of the respective set X algebraically], BT 11.13 (attempt after Tuesday, you will need to use Theorem 11.4 for this). Due Thursday, 5/8. HW#12 solutions distributed in class on 5/8, Thursday.
- Tuesday, 5/6 : Read Example 11.9 from section 11.6.
Class Log:
- Tuesday, 1/22 : Linear Program and its various forms, Basic terminology, Matrix and vector forms for setting up an LP, Linear and Affine Functions, Dummy example to set up as an LP, Setting up the Transportation problem as an LP. (From Section 1.1 and elsewhere)
- Thursday, 1/24 : Standard form of LP, How to convert any LP into standard form, When are two LPs (or optimization problems) equivalent; Why standard form? Its relation to the general solution of a system of m linear equations in n variables, and the related concepts of Null space of A, nullity, rank, etc.; Geometric description of the feasibility set of an LP, Geometric method of solution using a family of parallel lines perpendicular to c, the cost vector, Different possible solutions for an LP, Linear-fractional programming problem as an LP. (From Sections 1.1, 1.4, 1.5, and elsewhere)
- Tuesday, 1/29 : Converting an optimization problem with a piecewise linear objective function into an LP, Removing absolute value function from an optimization problem, Geometric definitions - Polyhedron, Polytope, Hyperplane, Half-space, A polyhedron is the intersection of half-spaces, Convexity of a set, Convex combination and Convex Hull of a finite set of vectors, Basic properties of convexity - Polyhedron and Convex Hull are convex, etc., Extreme point of a polyhedron. (From Sections 1.3, 2.1, 2.2)
- Thursday, 1/31 : Definition and geometric intuition of extreme point, vertex, and Basic feasible solution of a polyhedron, and the proof of their equivalence, active constraints, linear independence of constraints, properties of n active constraints, basic solutions (and their possible infeasibility). (From Section 2.2)
- Tuesday, 2/5 : Completion of the proof of equivalence of extreme point, vertex, and BFS; finiteness of the number of basic solutions, Adjacent basic solutions (and corresponding geometric intuition), Polyhedron in standard form, Characterization of basic solutions of a standard form Polyhedron and how it can be used to construct basic solutions for a standard form LP (we used facts about row and column ranks of a matrix, and completion of a basis from Linear Algebra), Example and definitions of basic variables, basic columns, basis matrix, and using the basis matrix to solve for the basic variables and constructing a basic solution. (From Sections 2.2 and 2.3)
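The construction described in this lecture - choose m linearly independent columns as a basis matrix B, solve B x_B = b, and set the remaining variables to zero - can be sketched in Python. This is a minimal illustration (the helper names and the 2x4 system are made up, not from the textbook); it enumerates every basis and keeps the nonnegative basic solutions, i.e. the BFS.

```python
from fractions import Fraction
from itertools import combinations

def solve(B, b):
    """Solve B x = b exactly by Gaussian elimination;
    return None if B is singular."""
    m = len(B)
    M = [[Fraction(B[i][j]) for j in range(m)] + [Fraction(b[i])] for i in range(m)]
    for c in range(m):
        piv = next((r for r in range(c, m) if M[r][c] != 0), None)
        if piv is None:
            return None
        M[c], M[piv] = M[piv], M[c]
        M[c] = [x / M[c][c] for x in M[c]]
        for r in range(m):
            if r != c and M[r][c] != 0:
                f = M[r][c]
                M[r] = [a - f * t for a, t in zip(M[r], M[c])]
    return [M[i][m] for i in range(m)]

def basic_feasible_solutions(A, b):
    """All BFS of {x >= 0 : Ax = b}: pick m columns as a candidate basis,
    solve B x_B = b, and keep the nonnegative solutions."""
    m, n = len(A), len(A[0])
    bfs = set()
    for basis in combinations(range(n), m):
        B = [[A[i][j] for j in basis] for i in range(m)]
        xB = solve(B, b)
        if xB is not None and all(v >= 0 for v in xB):
            x = [Fraction(0)] * n
            for j, v in zip(basis, xB):
                x[j] = v
            bfs.add(tuple(x))   # a set, since distinct bases can give the same
    return sorted(bfs)          # (degenerate) basic solution

# a made-up 2x4 standard-form example
A = [[1, 1, 1, 0],
     [1, -1, 0, 1]]
b = [2, 0]
for x in basic_feasible_solutions(A, b):
    print([str(v) for v in x])
```

On this example, three of the six possible bases give distinct BFS; one basic solution is infeasible, and one BFS is degenerate (shared by three bases).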
- Thursday, 2/7 : More examples for constructing Basic solutions, Relation between Bases and Basic solutions, Adjacent basic solutions with examples, Theorem that shows why we can assume all rows of A are linearly independent, Degenerate and non-degenerate Basic solutions with examples and geometric intuition, Degenerate Basic solutions in standard form polyhedron. (From Sections 2.3 and 2.4)
- Tuesday, 2/12 : Non-empty polyhedron has an extreme point iff it does not contain a line (without proof), If a standard LP has an optimal solution then it has an optimal BFS, A local search Algo for solving an Optimization problem, Feasible direction, Analysis of how to choose direction d and step length theta to move from a current BFS to a new one - jth basic direction from equality constraints, choice of theta from a nondegenerate BFS, non-existence of theta for a degenerate BFS, Change in the value of the cost function and the corresponding reduced cost, Optimality conditions. (From Sections 2.5, 2.6, and 3.1)
- Thursday, 2/14 : Reduced cost of a basic variable, For a given feasible direction how to choose theta to maximize the step length, How the choice of the step length (and feasible direction) implies that a basic variable becomes zero and leaves the basis while a non-basic variable becomes non-zero and enters the basis, Change of basis and the associated BFS after an iteration of the Simplex Algo, An example, The five step iteration of the simplex algorithm and its termination after a finite number of steps and the two possible ways in which it can terminate, Discussion of how many steps the Simplex algorithm can take to terminate - worst-case and average-case analysis. (From Sections 3.1, 3.2, and 3.7)
- Tuesday, 2/19 : Simplex Algorithm under degeneracy, the difference between moving from BFS to BFS and moving from Basis to Basis, Cycling and how it can be avoided, pivoting rules, Bland's rule, How to minimize the computations associated with running the simplex - particularly calculating the inverse of updated basis matrix, Full tableau form of Simplex Algo, Interpretation of the tableau entries and how they are updated using elementary row operations, A few iterations of the Full tableau form of Simplex Algo on Example 3.5. (From Sections 3.2, and 3.3)
- Thursday, 2/21 : Two Detailed Examples of the Full Tableau Simplex Algorithm, Revised simplex algo and how it differs from full tableau simplex, Difference between their number of operations and number of entries to maintain, Lexicographic order and Lex. pivoting rule, Why the lex pivoting rule gives a unique pivot row, How to set up a tableau such that all the rows (except the zeroth) are lex positive, Proof of why no cycling occurs when using the lex pivoting rule. (From Sections 3.3 and 3.4)
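As a rough illustration of the full tableau method discussed in these lectures, here is a minimal Python sketch (not the textbook's pseudocode; the function name and the sample LP are made up). It assumes b >= 0 and that the last m columns of A form an identity matrix (e.g., slack variables), so an initial BFS is immediate; it uses Bland's rule for the entering variable and updates the tableau by elementary row operations.

```python
from fractions import Fraction

def simplex(c, A, b):
    """Full-tableau simplex for  min c'x  s.t. Ax = b, x >= 0,
    assuming b >= 0 and that the last m columns of A form an identity
    matrix (an obvious starting basis). Returns (optimal cost, x),
    or None if the cost is unbounded below."""
    m, n = len(A), len(A[0])
    basis = list(range(n - m, n))
    # rows 1..m of the tableau: [B^{-1}b | B^{-1}A]; row 0: [-cost | reduced costs]
    T = [[Fraction(b[i])] + [Fraction(A[i][j]) for j in range(n)] for i in range(m)]
    z = [Fraction(0)] + [Fraction(c[j]) for j in range(n)]
    for i, bi in enumerate(basis):          # zero out reduced costs of basic vars
        f = z[bi + 1]
        if f != 0:
            z = [a - f * t for a, t in zip(z, T[i])]
    while True:
        # Bland's rule: entering variable = smallest index with negative reduced cost
        j = next((j for j in range(n) if z[j + 1] < 0), None)
        if j is None:                        # optimal: all reduced costs >= 0
            x = [Fraction(0)] * n
            for i, bi in enumerate(basis):
                x[bi] = T[i][0]
            return -z[0], x
        # ratio test to pick the leaving variable
        ratios = [(T[i][0] / T[i][j + 1], i) for i in range(m) if T[i][j + 1] > 0]
        if not ratios:
            return None                      # cost unbounded below
        _, r = min(ratios)
        # pivot on (r, j) by elementary row operations
        T[r] = [x / T[r][j + 1] for x in T[r]]
        for i in range(m):
            if i != r and T[i][j + 1] != 0:
                f = T[i][j + 1]
                T[i] = [a - f * t for a, t in zip(T[i], T[r])]
        f = z[j + 1]
        z = [a - f * t for a, t in zip(z, T[r])]
        basis[r] = j

# made-up example: min -x1 - x2  s.t.  x1 + 2x2 + s1 = 4,  3x1 + x2 + s2 = 6
cost, x = simplex([-1, -1, 0, 0],
                  [[1, 2, 1, 0], [3, 1, 0, 1]],
                  [4, 6])
print(cost, [str(v) for v in x])   # optimal cost -14/5 at x = (8/5, 6/5, 0, 0)
```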
- Tuesday, 2/26 : How to find an initial BFS for a standard form LP, Two phase method using artificial variables, the three possible conclusions of Phase 1: positive optimal cost and why that implies an infeasible original LP, zero optimal cost and no artificial variable in the basis, zero optimal cost and some artificial variables in the basis, How to remove artificial variables from the optimal basis of Phase 1 - linear independence of old basis columns from A and the new incoming basis column from A, How rank(A) strictly less than m implies a zero row in B^{-1}A in the final tableau of Phase 1, which leads to removal of a redundant constraint, Several Examples illustrating the Two Phase Method, Big-M method. (From Section 3.5)
- Thursday, 2/28 : Using Lagrangean relaxation (Lagrange multiplier method) to define a dual LP that maximizes a lower bound on the optimal cost of the primal LP, General rules for formulating the dual of a minimization LP - the relation between the constraints and variables/rows and columns of these two LPs, Dual(Dual(Primal)) = Primal, Duals of two equivalent LPs are also equivalent. (From Sections 4.1 and 4.2)
- Tuesday, 3/4 : Weak Duality Theorem and how it allows one to check for optimality using a feasible solution of the dual, Strong Duality Theorem and how the optimal solution of the dual is defined using the optimal solution of the primal in standard form, Nine possibilities for the solutions of a primal and its dual, Complementary slackness theorem and how it can be used to check whether a feasible solution of the primal is optimal (detailed example). (From Section 4.3)
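The weak-duality certificate from this lecture - any dual-feasible p gives a lower bound p'b on the primal cost, and equality certifies optimality - can be checked numerically; the small primal/dual pair below is made up for the sketch.

```python
from fractions import Fraction as F

# Primal (standard form): min c'x  s.t. Ax = b, x >= 0
# Dual:                   max p'b  s.t. p'A_j <= c_j for every column j (p free)
c = [F(1), F(1), F(1)]
A = [[F(1), F(2), F(0)],
     [F(0), F(1), F(1)]]
b = [F(2), F(1)]

def dot(u, v):
    return sum(a * t for a, t in zip(u, v))

x = [F(0), F(1), F(0)]   # a primal feasible point: Ax = b, x >= 0
p = [F(0), F(1)]         # a dual feasible point: p'A_j <= c_j for each j

# feasibility checks
assert all(dot(row, x) == bi for row, bi in zip(A, b)) and min(x) >= 0
assert all(dot(p, [A[0][j], A[1][j]]) <= c[j] for j in range(3))

# Weak duality guarantees p'b <= c'x; here both equal 1,
# which certifies that x and p are both optimal (strong duality).
print(dot(p, b), dot(c, x))
```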
- Thursday, 3/6 : EXAM # 1. Exam #1 solutions distributed on 3/11, Tuesday.
- Tuesday, 3/11 : The underlying principles of primal simplex and dual simplex algo, Basic (but not necessarily feasible) solutions for primal and dual using a basis matrix, dual feasibility iff reduced costs are non-negative, pivoting rule and the termination conditions for the dual simplex, Lex. pivoting rule for the dual simplex, Applying Dual simplex when Dual BFS is easily available like when the constraint vector of the primal is changed, An example for running the Dual simplex. (From Section 4.5)
- Thursday, 3/13 : The underlying properties of Dual LP - Basis to dual basic solution, feasibility of dual basic solution, active dual constraint, degeneracy of dual basic solution; example illustrating the geometry of the dual simplex, cone of vectors, geometric interpretation of Farkas' lemma using idea of positive projections, Farkas' Lemma and its proof using strong duality, Another theorem proved using strong and weak duality. (From Sections 4.5 and 4.6)
- Tuesday, 3/25 : Further discussion of a Farkas-Lemma-type theorem proved using strong and weak duality, Discussion of change in feasibility and optimality of current optimal basis and BFS when - a new variable is added, a new inequality constraint is added; and how to handle such change. (From Sections 4.6 and 5.1)
- Thursday, 3/27 : Discussion of change in feasibility and optimality of current optimal basis and BFS when - a new inequality constraint is added, a new equality constraint is added (two approaches), changes in requirement vector, changes in cost vector; and how to handle such change. (From Section 5.1)
- Tuesday, 4/1 : Discussion of change in feasibility and optimality of current optimal basis and BFS when - an entry in A is changed; and how to handle such change, Applying simplex algorithm to solve a parametric LP, global dependence of optimal cost on requirement vector b. (From Sections 5.1, 5.5 and 5.2)
- Thursday, 4/3 : Global dependence of optimal cost on requirement vector b (convexity) and on cost vector c (concavity), Example to illustrate the application of dual simplex algo to solve a parametric LP with parameter in the requirement vector, the principle of delayed column generation, the cutting stock problem - setup, relaxation to LP, solution using revised simplex and delayed column generation, the delayed column generation problem as a knapsack problem. (From Sections 5.2, 5.4, 5.5, 6.1, 6.2)
- Tuesday, 4/8 : Details of the solution of the cutting stock problem using revised simplex and delayed column generation - initial BFS, iteration, subproblem that calculates reduced cost and generates the corresponding constraint column; Delayed column generation with retained columns and how an optimal solution for a restricted LP can be optimal for the full LP, Cutting plane methods as delayed constraint generation - the geometric intuition & its relation to delayed column generation for the dual; Dantzig-Wolfe Decomposition - standard form of LP suitable for decomposition, decomposition into a master problem using extreme points of the polyhedra associated with the disjoint sets of variables. (From Sections 6.1, 6.2, 6.3, 6.4)
- Thursday, 4/10 : Dantzig-Wolfe Decomposition - why we don't need to find out the extreme points of the polyhedra to solve the master problem, using revised simplex and delayed column generation by defining subproblems associated with each underlying polyhedron which automatically find a negative reduced cost and generate the appropriate extreme point needed to generate the corresponding constraint column in the master LP, how to find the initial BFS for the master LP, generalization to t subproblems, computational experience, storage comparison between straightforward simplex and D-W decomposition method, detailed exposition of Example 6.2 for computational application of the D-W decomposition method. (From Section 6.4)
- Tuesday, 4/15 : Bounds for the optimal cost of the Dantzig-Wolfe master problem in terms of the optimal costs of the subproblems, Two-stage stochastic LP - elaborate example, mathematical description, and the block structure of the problem, underlying idea of Benders' Decomposition as delayed constraint generation for such problems. (From Sections 6.4 and 6.5)
- Thursday, 4/17 : Detailed derivation of Benders' Decomposition - standard form of LP suitable for decomposition, decomposition into a master problem using extreme points of the fixed dual polyhedra, finding the initial relaxed master LP, using the dual simplex algo to find the cutting plane to add to the constraints, its relation to D-W decomposition, discussion of 2-stage stochastic LPs. (From Section 6.5)
- Tuesday, 4/22 : Mixed Integer programs and their formulations - using binary variables to model various scenarios, detailed example of formulation of an MIP for sequencing problem with setup times. (From Section 10.1)
- Thursday, 4/24 : EXAM # 2. Exam #2 solutions distributed on 4/29, Tuesday.
- Tuesday, 4/29 : LP relaxation, its relation to the IP and its optimal solution, cutting plane methods, Gomory cutting planes with an example; Discussion of Exam #2. (From Sections 10.2 and 11.1)
- Thursday, 5/1 : An example to illustrate that the distance between the feasible integral solution closest to the optimal solution of the LP relaxation and the optimal solution of the IP can be very large, a detailed IP example to illustrate the Branch and Bound method for solving IPs including the use of bounds to prune the enumeration tree, general description of B&B, Duality theory for IPs - Lagrangean relaxations and the associated optimal costs and their relation to the optimal cost of the IP, Lagrangean dual and its optimal cost as a better lower bound for the optimal cost of the IP (Weak Duality for IP). (From Sections 11.2 and 11.4)
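The branch-and-bound idea from this lecture - solve a relaxation to get a bound, branch on a variable, and prune subtrees whose bound cannot beat the incumbent - can be sketched on a 0/1 knapsack IP, where the LP relaxation has a simple greedy (fractional) solution. This is a minimal illustration on a standard toy instance, not an example from the textbook.

```python
def knapsack_bb(values, weights, capacity):
    """Branch and bound for the 0/1 knapsack IP
        max sum(v_i x_i)  s.t.  sum(w_i x_i) <= capacity, x_i in {0,1}.
    The LP relaxation (0 <= x_i <= 1) is solved greedily by value/weight
    ratio; its value is the upper bound used to prune the enumeration tree."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    n = len(v)
    best = 0                                  # incumbent (best integral value)

    def lp_bound(k, cap):
        # fractional (LP-relaxation) optimum over the remaining items k..n-1
        total = 0.0
        for i in range(k, n):
            if w[i] <= cap:
                cap -= w[i]
                total += v[i]
            else:
                total += v[i] * cap / w[i]    # take a fraction of item i
                break
        return total

    def branch(k, cap, value):
        nonlocal best
        if k == n or cap == 0:
            best = max(best, value)
            return
        if value + lp_bound(k, cap) <= best:
            return                            # prune: bound can't beat incumbent
        if w[k] <= cap:
            branch(k + 1, cap - w[k], value + v[k])   # branch x_k = 1
        branch(k + 1, cap, value)                     # branch x_k = 0

    branch(0, capacity, 0)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))   # classic instance: optimum 220
```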
- Tuesday, 5/6 : Rewriting the Lagrangean dual as a linear program with proof, Understanding how to apply this theorem to solve Lagrangean dual using properties of Convex Hull, describing Convex Hull of integer points in terms of linear constraints, relationship between optimal costs of the IP, Lag. dual, and the LP relaxation. (From Section 11.4)
- Thursday, 5/8 : [Not a part of the Final Exam Syllabus] Approximation algorithms, 2-approx. algo for Euclidean TSP, Randomized Algorithms, MAX SAT problem, 1/2-approx and (1-1/e)-approx randomized algorithms for MAX SAT, Randomized rounding technique through LP relaxation. (From Sections 11.5 and elsewhere)
Supplemental Reading:
For alternative points of view and for additional applications:
Free Online LP Book by Vanderbei
MATLAB Information:
MATLAB - getting started at IIT, by Greg Fasshauer
Crash course by Toby Driscoll
MATLAB Tutorial
Older Guide I
Older Guide II
Links for Additional Information:
Linear Programming at Wikipedia
OR Models
NEOS Optimization Guide
Mathematical Programming Glossary
Mathworld-Optimization
Myths and Counterexamples