MATH 332 Elementary Linear Algebra
Instructor: Hemanshu Kaul
Office: 125C, Rettaliata Engg.
Phone: (312) 567-3128
E-mail: kaul [at] iit.edu
Time: 10am, Monday and Wednesday.
Place: 152, Pritzker Science Center.
Office Hours: 11:15am-12:15pm Monday and Wednesday, and by appointment (send email).
Emailed questions are also encouraged, as is posting on the Piazza discussion forum.
Math TA Office Hours: Check the schedule at 129, Rettaliata Engg.
ARC Tutoring Service: Mathematics tutoring at the Academic Resource Center.
Online Problem Practice: Linear Algebra book at COW (Calculus on Web).
| Course Information | Advice | Announcements | Examinations | Homework | Class Log | Links |
Course Information:
The Course Information Handout has an extensive description of the course - topics, textbook, student evaluation policy, and other relevant information. Read it carefully!
What is this course really about? Required reading.
The official course syllabus for Math 332.
A couple of useful external handouts:
Use/Purpose of Linear Algebra, by Oliver Knill (Harvard)
A Self-Guided Aid to Proofs, by Daniel Solow
Advice for students:
Excellent advice by Doug West on how to write homework solutions for proof-based problems.
Why do we have to learn proofs?
Understanding Mathematics - a study guide
On a more abstract note, here is a discussion of Language and Grammar of Mathematics - which is what you are starting to learn in a course like this.
Excellent advice for math majors, especially those planning to go on to graduate school, by Terry Tao, 2006 Fields medallist. Required reading.
Some of the primary sources of information/discussion for careers in Mathematical Sciences:
MAA - Careers
SIAM - Careers
INFORMS - Careers
AMS - Careers
Class Announcements:
- Wednesday, 10/4 : Due to the delay caused by the Fall break, HW#7 will be due on Monday, 10/23, and Exam#2 has been postponed to Monday, 10/30.
- Monday, 9/4 : All the Exam dates have been announced below.
- Monday, 8/21 : Check this webpage regularly for homework assignments, announcements, etc.
Examinations:
- Exam #1 : 9/27, Wednesday. Topics: All topics corresponding to HW#1, HW#2, HW#3, and HW#4.
- Exam #2 : 10/30, Monday. Topics: All topics corresponding to HW#5, HW#6, and HW#7.
- Exam #3 : 11/20, Monday. Topics: All topics corresponding to HW#8 and HW#9.
- Final Exam : Wednesday, 12/6, 10:30am-12:30pm. Topics: All topics studied during the semester.
Homework Assignments:
You only have to submit solutions to written problems.
However, solving a majority of the suggested problems is strongly encouraged. Solving these problems will improve your understanding of the course material and better prepare you for the exams.
Problem numbers below are based on the 11th edition of the textbook. If you are using an earlier edition, please make sure you are solving the correct problems. (Sections 1.1, 1.2, and 1.3 of the textbook are available in the preview of the textbook on Amazon.)
Remember: Homework needs to be submitted at the beginning of class on the due date. Solutions must be written clearly, legibly, and concisely, and will be graded for both mathematical correctness and presentation. Points will be deducted for sloppiness, incoherent or insufficient explanation, or for lack of intermediate steps.
Be sure to staple the pages together and write your name (and that of any collaborator), course number, assignment number, and the date of submission on the front.
Do not forget to do the reading HWs.
- Monday, 8/21 : Read examples 1 to 6 in Section 1.3
Find 2x2 (and 3x3) matrices A and B such that AB is not equal to BA.
- Wednesday, 8/23 : Read Example 6 in Section 1.1 and examples for Row-Echelon and Reduced Row-echelon forms in Section 1.2.
- Homework #1 : Due Wednesday, 8/30. HW#1 solutions distributed in class on 9/6.
Suggested Problems: Section 1.1: 1, 5, 7, 9, 21&26, TF. Section 1.2: 1, 3, 15, 19, 35, TF. Section 1.3: 23, 30.
Written Problems: [Comment: When solving a system, set up the augmented matrix and then apply row operations; clearly label each row operation applied and show all intermediate steps.]. Section 1.1: 12, 16b, 20b, TF(e)(f)(g). Section 1.2: 18, 24ac, 26, 31, 34, 43a. Section 1.3: 27, 30a.
- Wednesday, 8/30 : Read about Partitioned Matrices and Examples 7, 8, 9, 10, 11 and 12 in Section 1.3.
- Homework #2 : Due Wednesday, 9/6. HW#2 Solutions distributed in class on 9/13.
Suggested Problems: Section 1.2: 13, 33. Section 1.3: 1, 3, 5, 7, 11, 13, 15, 23, 25, 29, 32, TF(e)(f)(l)(m).
Written Problems: Section 1.2: 8 and 12 (solve them in continuation/ together), 38, 39, 40, TF(b)(d)(g)(i). Section 1.3: 5de, 8af, 16, 36b, TF(m).
- Wednesday, 9/6 : Read Theorems 1.4.5 & 1.4.6 and Examples 7 & 9 & 10 in Section 1.4
- Homework #3 : Due Wednesday, 9/13. HW#3 solutions to be distributed in class on 9/18.
Suggested Problems: Section 1.4: 3, 4, 5, 9, 12, 13, 17, 23, 32, 35, 41, 44, 45, 49.
Written Problems: Section 1.4: 10(use Theorem 1.4.5), 24, 31(b)(c), 33(a), 36, 40, 43, 54, TF(j)(k).
- Wednesday, 9/13 : Read Examples #4 and #5 in Section 1.5; Examples #3 and #4 in Section 1.6 and pay close attention to the final expression for b
Read Theorem 1.6.4 in Section 1.6 and Theorem 1.7.1 in Section 1.7.
- Homework #4 : Due Wednesday, 9/20. HW#4 solutions distributed in class on 9/20.
Suggested Problems: Section 1.5: #1, #3, #5, #7, #9, #13, #19, #25, #27, #29, #33, TF. Section 1.6: #5, #15, #18a, #19, #21, T/F#(b)(d). Section 1.7: #1, #9, #13, #17, #21, #25, #30, #37, #47.
Written Problems: Section 1.4: #50. Section 1.5: #6b, #8c, #16, #20a, #22, #28, #32. Section 1.6: #14, #20, #22, T/F#(f). Section 1.7: #26 (just set up the system without finding the exact values for a,b,c), #28, #34b.
- Wednesday, 9/20 : Do Examples 3-5 from Section 2.1 to practice co-factor expansion for calculating determinant.
Do example 5 in Section 2.2 for a combination of Row operations and Cofactor expansion.
- Monday, 9/25 : Do Example 1 and Example 3 in Section 9.1 for an example of direct construction of L and U from the Gaussian elimination procedure.
Do Example 2 in Section 1.9 for an example of network flows.
- Homework #5 : Due Wednesday, 10/4. HW#5 solutions distributed in class on 10/9.
Suggested Problems: Section 2.1: #3, #15, #21, #41. Section 2.2: #5, #11, #17, #23, #27, #29, #33. Section 2.3: #7, #9, #17, #33, #36, #37. Section 9.1: #1, #5, TF(a)(b).
Written Problems: Section 2.1: #18, #24. Section 2.2: #20, #24, #26, #34, #TF(d). Section 2.3: #18, #34, TF(g)(h). Chapter 2 Supplementary Problems: #33. Section 9.1: #6.
- Monday, 10/2 : Re-read Examples 6 and 8 in Section 4.1.
- Wednesday, 10/4 : Read Example 6, and Examples 7-10 together with the Figure of function spaces in Section 4.2.
- Homework #6 : Due Wednesday, 10/11. HW#6 solutions distributed in class on 10/16.
Suggested Problems: Section 4.1: #1, #2, #5, #6, #9, #10, #12. Section 4.2: #1, #2, #3, #4, #5, #16, #17, T/F.
Written Problems: Section 4.1: #4, #7, #8, #19&20, and these two problems. Section 4.2: #1cd, #2abe, #3b, #4b, #18.
- Monday, 10/16 : Read Theorem 4.2.4 in Section 4.2.
Read Examples 3-5 and Theorems 4.3.2 & 4.3.3 in Section 4.3.
Read Definition 1 and Examples 1-4 in Section 4.4.
- Homework #7 : Due Monday, 10/23. HW#7 solutions distributed in class on 10/23.
Suggested Problems: Section 4.2: #7, #9, #11, #13. Section 4.3: #1, #2, #3, #5, #11, #26. Section 4.4: #3, #5.
Written Problems: Section 4.2: #9a, #10a, #12c, TF(g)(h). Section 4.3: #4, #6, #10a, #22, TF(e). Section 4.4: #4, #6.
- Wednesday, 10/25 : Read Examples 2, 3, 4, 5, 9 in Section 4.7.
- Homework #8 : Due Monday, 11/6. HW#8 solutions distributed in class on 11/8.
Suggested Problems: Section 4.4: #7, #8, #9, #13, #14, #15, #17, #20, #TF. Section 4.5: #3, #8, #9, #11, #TF. Section 4.7: #3, #5.
Written Problems: Section 4.4: #10 [HINT: cos^2(x) - sin^2(x) = cos(2x)], #16, #25, #30. Section 4.5: #4, #8b, #9a, #10, #14, #18, #TF(i). Section 4.7: #4a, #6, #10a. [Optional and Extra Credit: 4.5.#22 and 23.]
- Monday, 11/6 : Read Examples of Linear Transformations in R^2 and R^3 in Section 4.9 (you don't have to memorize these but you should be aware of them).
- Wednesday, 11/8 : Read Examples 3, 4, 8 in Section 4.10.
- Wednesday, 11/8 : Read Example 3 in Section 4.6.
- Homework #9 : Due Wednesday, 11/15. Note this HW is longer than usual as it's based on 3 lectures. Get started on it right away. HW#9 solutions distributed in class on 11/15.
Suggested Problems: Section 4.6: #1, #3, #5, #9, #13, #14, #16. Section 4.7: #8a, #11, #14, #27. Section 4.8: #3, #7, #9, #19, #27. Section 4.9: #3, #5, #7, #9, #11, #15, #39. Section 4.10: #21, #27, #30, #TF(e)(f)(g).
Written Problems: Section 4.6: #4, #6, #12. Section 4.7: #7b, #16, #18 [see Example 9], #28 [Hint: TF(g) is true (why?)]. Section 4.8: #6, #7b and #8, #14a, #21, #TF(b) [Hint: Consider A as an m x n matrix and look at the linear independence of its rows and columns when m < n and when m > n]. Section 4.9: #32b and #38 [Hint: cos(-t) = cos(t), sin(-t) = -sin(t)], #39. Section 4.10: #4 [Use the matrix from Section 4.9], #20a, #24.
- Wednesday, 11/15 : Read Example 8 in Section 5.1.
Read Examples 1 and 2 in Section 5.2.
- Monday, 11/20 : [Optional Reading: Not part of the course syllabus] Read Section 5.4 and Section 5.5 to see how these ideas are applied in Differential Equations and in Markov Chains.
- Homework #10 : Due Wednesday, 11/29. Note this HW is longer than usual as it's based on 3 lectures. Get started on it right after Exam #3. HW#10 solutions to be distributed in class on 11/29.
Suggested Problems: Section 5.1: #3, #5, #13, #15, #25, #27, #33. Section 5.2: #3, #5, #15, #19, #25, #26, #TF. Section 6.1: #2, #5, #10, #33, #35, #TF.
Written Problems: Section 5.1: #8, #10, #24a and #25, #34, #TF(c). Section 5.2: #8, #10, #12, #16a, #20a, TF(d)(e). Section 6.1: #18, #20, #22, #28, #34.
Class Log:
- Monday, 8/21 : Discussion of course organization and purpose. Linear equations and systems of linear equations, comparison to lines and planes, consistent and inconsistent systems, only three possibilities for the number of solutions of a linear system. Matrix notation and terminology, Equality of two matrices, Addition and subtraction of matrices, Scalar product of matrices, Product of matrices - condition for definition, relation to dot product. Matrix form of a system of linear equations. (From Sections 1.1 and 1.3)
- Wednesday, 8/23 : Augmented matrix, Elementary row operations and back substitution for solving linear systems, Definitions of Row-Echelon and Reduced Row-echelon forms, examples of Row-Echelon and Reduced Row-echelon forms, parametric form of an infinite family of solutions, Identifying no solutions, one solution, and infinitely many solutions from the augmented matrix, Leading 1s, leading and free variables. (From Sections 1.1 and 1.2)
- Monday, 8/28 : Distribution of Course Information Sheet and discussion of course organization and purpose; Gaussian Elimination and Gauss-Jordan algorithms - motivation, correctness, and examples. (From Section 1.2 and elsewhere)
- Wednesday, 8/30 : Homogeneous system and its properties - trivial solution and consistency; Homogeneous system with more variables than equations has non-trivial solutions (with proof); Product of matrices - column-by-column and row-by-row expressions, columns (rows) of the product as linear combinations of columns (rows), Matrix equation and its relation to a system of linear equations, Non-commutativity of Matrix multiplication, Non-properties of Matrix multiplication; Transpose of a matrix, trace of a square matrix. (From Sections 1.3, 1.4)
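As a reminder of the column point of view mentioned in the entry above, here is a generic sketch in LaTeX (generic notation, not a specific textbook example):
\[
\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
= x_1 \begin{pmatrix} a_{11} \\ a_{21} \end{pmatrix}
+ x_2 \begin{pmatrix} a_{12} \\ a_{22} \end{pmatrix}
% in general, Ax = x_1 c_1 + ... + x_n c_n, where c_j is the j-th column of A (generic illustration, not from the textbook)
\]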
- Monday, 9/4 : Labor Day Holiday.
- Wednesday, 9/6 : Basic properties of matrix algebra, Non-commutativity of Matrix multiplication, Non-properties of Matrix multiplication - Cancellation law and commutativity of product, Zero matrices and their properties, Identity matrices and their properties, Invertible and Singular matrices. (From Section 1.4)
- Monday, 9/11 : Uniqueness of the inverse, Inverse of 2x2 matrices, Inverse of a product of invertible matrices, Integer powers of a matrix, Laws of exponents for matrices, Properties of the transpose, Transpose of AB, Inverse of the transpose of an invertible matrix, Elementary matrices - relation with row operations, Inverses of Elementary Matrices and their relation to inverse row operations. (From Sections 1.4 and 1.5)
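For quick reference, the 2x2 inverse formula discussed above, written in generic notation (not tied to a particular homework problem):
\[
A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad
A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}
\quad \text{provided } ad - bc \neq 0.
% valid exactly when the determinant ad - bc is nonzero (generic formula, not a textbook exercise)
\]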
- Wednesday, 9/13 : Method for finding inverse of a matrix and its underlying logic, Finding the inverse of a matrix, Solving linear systems with matrix inversion, Simpler condition for invertibility of a square matrix with proof, Statements equivalent to invertibility of a matrix with proofs, Basic properties of Diagonal and Triangular matrices, and Symmetric matrices. (From Sections 1.5, 1.6, and 1.7)
- Monday, 9/18 : Number of solutions of a system of linear equations with proof, Simpler condition for invertibility of a square matrix with proof, For square matrices, AB invertible implies A and B are invertible, Two properties of solutions of non-homogeneous systems equivalent to invertibility of a matrix with proofs. Introduction to Determinants, Properties of the determinant under row operations, determinants of triangular matrices and matrices with a zero row or column, det(A)=det(transpose(A)); Using row operations to evaluate a determinant. (From Sections 1.6, 2.2, and 2.3, and elsewhere)
- Wednesday, 9/20 : Invertibility in terms of the determinant with proof, Determinant of a product of matrices, Determinant of the inverse with proof, Cofactor of an entry, Cofactor expansion for finding the determinant. LU decomposition of a matrix. (From Sections 2.1, 2.2, 2.3, and 9.1, and elsewhere)
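A small illustration of cofactor expansion along the first row (the numbers below are chosen for illustration only, not taken from the textbook):
\[
\det \begin{pmatrix} 1 & 2 & 0 \\ 3 & 1 & 4 \\ 0 & 2 & 1 \end{pmatrix}
= 1 \cdot \det \begin{pmatrix} 1 & 4 \\ 2 & 1 \end{pmatrix}
- 2 \cdot \det \begin{pmatrix} 3 & 4 \\ 0 & 1 \end{pmatrix}
+ 0
= 1(-7) - 2(3) + 0 = -13.
% signs alternate (+, -, +) along the first row; illustrative matrix, not a specific lecture example
\]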
- Monday, 9/25 : LU decomposition of a matrix - when does it exist and how to find it using the row operations in Gaussian Elimination, Relation between L and elementary matrices and the relation between U and the row echelon form, How to use the LU decomposition to easily solve a matrix equation (linear system), An application of linear systems to network flows. (From Sections 9.1, 1.8, and elsewhere)
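A minimal LU example, assuming the usual convention that L is unit lower triangular and U is the row echelon form produced by Gaussian elimination (the matrix is an illustrative choice, not from the textbook):
\[
\begin{pmatrix} 2 & 1 \\ 4 & 5 \end{pmatrix}
= \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}
\begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}
% the row operation R_2 -> R_2 - 2 R_1 produces U; the multiplier 2 is recorded in L
\]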
- Wednesday, 9/27 : Mid-term Exam #1.
- Monday, 10/2 : Definition of a vector space, Examples (R^n, M_{m x n}, F[a,b], P_n, etc.) and non-examples (R^2 with non-standard scalar multiplication, Polynomials of degree = n, Invertible Matrices) of Vector Spaces, how to prove V is a vector space, how to prove V is not a vector space - how to show an axiom is not satisfied (Axioms 4 and 5 vs. other axioms). (From Section 4.1)
- Wednesday, 10/4 : Distribution of Exam#1 and discussion of its solutions (to be continued). Some elementary properties of vector spaces with proofs, introduction to subspaces with examples and non-examples, Characterization of subspaces. (From Section 4.2)
- Monday, 10/9 : Fall Break
- Wednesday, 10/11 : Discussion of Exam #1 solutions. Vector space of solution vectors of a homogeneous system (Null(A)), Linear combination of vectors. (From Section 4.2)
- Monday, 10/16 : When is a vector in R^n a linear combination of some other vectors in R^n? - conversion to a linear system, Span of vectors, Span(S) is a subspace and the smallest subspace containing S, Spanning sets for some vector spaces and subspaces, Conversion of a spanning set problem into a linear system problem, Linear independence and its motivations, Linear independence and dependence of vectors with examples and non-examples, Relation between a vector equation and a linear system, Characterization of linear dependence and independence in terms of linear combinations. (From Sections 4.2 and 4.3)
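The vector-equation/linear-system link mentioned above, written out generically (a sketch, not a specific lecture example):
\[
v_1, \dots, v_k \in \mathbb{R}^n \ \text{are linearly independent}
\iff
c_1 v_1 + \cdots + c_k v_k = 0 \ \text{only for } c_1 = \cdots = c_k = 0.
% equivalently, the homogeneous system with coefficient matrix [\, v_1 \mid \cdots \mid v_k \,] has only the trivial solution
\]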
- Wednesday, 10/18 : Some simple reasons for linear dependence, A sufficient condition for linear dependence in R^n, Basis of a Vector Space, Standard bases for R^n, P_n, and M_nn, how to show S is a Basis of R^n. (From Sections 4.3, and 4.4)
- Monday, 10/23 : How to show S is a Basis of R^n, P_n, etc., Basis of the solution space of a homogeneous system, Uniqueness of basis representation, Coordinate vector relative to a basis with examples from R^n and P_n, Properties of sets with more or with fewer vectors than in a basis, Dimension of a vector space, examples, dimension of the solution space of a homogeneous system. (From Sections 4.4 and 4.5)
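A generic reminder of coordinate vectors relative to a basis (the P_2 example in the comment is a standard illustration, not necessarily the one used in class):
\[
\text{If } B = \{v_1, \dots, v_n\} \text{ is a basis and } x = c_1 v_1 + \cdots + c_n v_n,
\text{ then } [x]_B = (c_1, \dots, c_n).
% e.g., in P_2 with the standard basis {1, x, x^2}: [2 + 3x^2]_B = (2, 0, 3)
\]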
- Wednesday, 10/25 : Plus/Minus theorem, How to check for basis of a vector space whose dimension is known, Converting a large spanning set or a small linearly independent set into a basis, How to extend a set of vectors into a basis for R^n, dimension of a subspace vs vector space containing it, Row space, Column space, and Null space of a matrix. (From Sections 4.5 and 4.7)
- Monday, 10/30 : Mid-term Exam #2.
- Wednesday, 11/1 : Row space, Column space, and Null space of a matrix, Relation between consistency of a non-homogeneous system and the Column space, General solution of a non-homogeneous system in terms of a particular solution and the general solution of the corresponding homogeneous system, Row operations and the Row, Column, and Null spaces of a matrix and their bases, Finding bases for Row(A), Col(A), and Null(A), Using Row(A) and Col(A) to find a basis of a Euclidean subspace expressed as span(S) - the difference between the two methods, Statements with proofs related to: rank(A), nullity(A), Row(A)=Col(A^T), rank(A)=rank(A^T), rank + nullity = number of columns, rank and nullity in terms of the solution of the corresponding homogeneous system. (From Sections 4.5, 4.7, 4.8)
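The rank-nullity relation listed above, stated generically with a small numeric sanity check (the 3 x 5 example is an illustrative assumption, not from the textbook):
\[
\operatorname{rank}(A) + \operatorname{nullity}(A) = n \qquad (n = \text{number of columns of } A).
% e.g., if A is 3 x 5 with rank(A) = 2, then nullity(A) = 3: the general solution of Ax = 0 has 3 free parameters
\]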
- Monday, 11/6 : Consistency theorem, Equivalent statements for rank(A) = #rows, Overdetermined and underdetermined linear systems and their properties, Consistency properties of linear systems with non-square coefficient matrices, Extension of the characterization of invertible square matrices, Linear transformations from R^n to R^m and their relation to multiplication by an m x n matrix, Zero transformation, Identity operator, Reflection operator as a linear operator. (From Sections 4.8 and 4.9)
- Wednesday, 11/8 : More examples of linear operators, Compositions of linear transforms, Injective (one-to-one) and surjective (onto) linear transforms, Characterization of invertible matrices in terms of their corresponding linear transforms, Inverse of a linear transform - when does it exist and how to find it, Characterization of linearity with proof using standard Euclidean basis vectors to form the standard matrix, Using the standard basis to find the standard matrix for any linear operator, Change of basis problem and the transition matrix relating the two coordinate vectors, Relation between the two transition matrices. (From Sections 4.6, 4.9 and 4.10)
- Monday, 11/13 : Eigenvalues and eigenvectors of a matrix, Characteristic polynomial and characteristic equation of a matrix; Eigenspace of a matrix w.r.t. an eigenvalue, Finding bases for the eigenspaces of a matrix, Invertibility and eigenvalues. Eigenvector problem and the Diagonalization problem. Distribution of Exam#2 and discussion of its solutions. (From Sections 5.1 and 5.2)
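A small worked eigenvalue computation in the spirit of the entry above (the matrix is chosen here for illustration and is not a specific textbook exercise):
\[
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad
\det(\lambda I - A) = (\lambda - 2)^2 - 1 = (\lambda - 1)(\lambda - 3).
% eigenvalues 1 and 3, with eigenvectors (1,-1) and (1,1) respectively; illustrative numbers only
\]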
- Wednesday, 11/15 : Conclusion of the discussion of the solutions of Exam#2. Definition and motivation for diagonalizability of matrices, Similar matrices, Characterization of diagonalizable matrices in terms of eigenvectors and sum of nullities, eigenvectors corresponding to distinct eigenvalues are linearly independent, How to check whether or not a matrix is diagonalizable, Procedure for diagonalizing a matrix, relation between P and D in the diagonalization. (From Section 5.2)
- Monday, 11/20 : Mid-term Exam #3.
- Wednesday, 11/22 : Thanksgiving Break.
- Monday, 11/27 : Geometric and algebraic multiplicities of an eigenvalue and the characterization of diagonalizability of a matrix in terms of them. Inner product on a vector space, Inner product spaces, 3 different Inner products on R^n, Relation between different inner products on R^n, Inner products on Matrices, Polynomials, and Continuous functions, Norm and distance functions and their properties, Cauchy-Schwarz inequality, Triangle inequality, Angle between two vectors in an i.p.s., Orthogonal vectors, Generalized Pythagoras Theorem. (From Sections 5.2, 6.1 and 6.2)
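For reference, the Cauchy-Schwarz inequality and the induced angle formula in a general inner product space (generic statement; no specific lecture example assumed):
\[
|\langle u, v \rangle| \le \|u\| \, \|v\|, \qquad
\cos \theta = \frac{\langle u, v \rangle}{\|u\| \, \|v\|} \quad (u, v \neq 0).
% here \|u\| = \sqrt{\langle u, u \rangle}; equality in Cauchy-Schwarz holds exactly when u and v are parallel
\]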
- Wednesday, 11/29 : Distribution of Exam#3 and discussion of its solutions. Orthogonal complement of a subspace, Properties and examples of Orthogonal complements, Null(A) and Row(A) are orthogonal complements, Finding the basis of an orthogonal complement in the Euclidean space, Orthogonal and Orthonormal sets of vectors, Orthonormal Basis, Coordinate vector relative to an Orthonormal basis. Gram-Schmidt process for creating an Orthonormal basis of an inner product space with proof, QR decomposition.
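A compact statement of the Gram-Schmidt step mentioned above (generic notation; normalizing at the end gives an orthonormal basis):
\[
v_1 = u_1, \qquad
v_i = u_i - \sum_{j=1}^{i-1} \frac{\langle u_i, v_j \rangle}{\langle v_j, v_j \rangle} \, v_j, \quad i = 2, \dots, k.
% then q_i = v_i / \|v_i\| gives an orthonormal basis of span{u_1, ..., u_k}
\]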
Links for Additional Information: