
MATH 401: APPLICATIONS OF LINEAR ALGEBRA

Section: 0301
Lectures: TuTh 9:30am – 10:45am, MTH B0423
Office hours: Tu 11:00 – 11:59am or by appointment
Textbook 1: Linear Algebra and its Applications (4th edition), by David C. Lay, ISBN: 9780321385178
Textbook 2: Applied Linear Algebra (1st edition), by Peter J. Olver and Chehrzad Shakiban, ISBN: 9780131473829
Prerequisites: C- or better in one of MATH461, MATH240, MATH341
Instructor: Yuri Lima
Email: yurilima@gmail.com
Office: Mathematics 4117
Webpage: http://www2.math.umd.edu/~yurilima/math401.html

Detailed syllabus

FIRST MIDTERM: Chapters 1, 2, 4, except Sections 2.4, 2.6, 2.7, 4.8.
SECOND MIDTERM: Chapters 5, 6, 7, except Sections 5.4, 5.8, 6.8, 7.4, 7.5; applications to graphs.

1/28 – Introduction, systems of linear equations, row reduction: 1.1, 1.2.
- Various applications: mathematics, electrical networks, economics, computer graphics, the airline industry.
- Basic definitions: linear equation, system of equations, solution set.
- Example: a 2 × 2 system and its solution set.
- Matrix notation: coefficient matrix and augmented matrix. Example 1, page 5.
- Row operations: interchange, scalar multiplication, and linear combination.

1/30 – Alternative interpretations of systems of linear equations: 1.3, 1.4, 1.5.
- Review: systems of linear equations, augmented matrix, row reduction.
- Echelon form, reduced echelon form, pivot position.
- Solutions of systems of linear equations: trichotomy (none, exactly one, or infinitely many).
- Alternative interpretation 1: the vector equation a1 x1 + ··· + an xn = b.
- The solution set is non-empty ⇔ b ∈ Span(a1, ..., an).
- Some examples of Span(v1, ..., vn) for n = 2, 3.
- Alternative interpretation 2: the matrix equation Ax = b, where A is the coefficient matrix.
- Alternative interpretation 3: the linear transformation Â : R^n → R^m given by Â(x) = Ax.
- What is a linear transformation, and why is Â one?
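
The three interpretations are easy to check numerically. A minimal NumPy sketch (the 2 × 2 system is an illustrative choice, not an example from the book):

    import numpy as np

    # Illustrative system: x1 + 2*x2 = 5, 3*x1 + 4*x2 = 6.
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # coefficient matrix
    b = np.array([5.0, 6.0])

    # Matrix-equation view: solve Ax = b.
    x = np.linalg.solve(A, b)

    # Vector-equation view: b = x1*a1 + x2*a2, a combination of the
    # columns of A, so b lies in Span(a1, a2).
    assert np.allclose(x[0] * A[:, 0] + x[1] * A[:, 1], b)

Under the hood, np.linalg.solve carries out (a partial-pivoting variant of) the row reduction from 1.1–1.2.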

2/4 – Applications of linear systems: 1.6, 1.10.
- Review: three different interpretations of systems of linear equations.
- Leontief input–output model. Example with three products.
- Balancing chemical equations. Ammonia synthesis (Haber process): N2 + 3H2 → 2NH3. Combustion of methane: CH4 + 2O2 → CO2 + 2H2O.
- Electrical networks: Ohm's law (V = RI) and Kirchhoff's laws. Example 2, page 83.
- Networks on graphs (somewhat similar to electrical networks): flow in = flow out.
- Difference equations x_{k+1} = A x_k. Example: the Fibonacci sequence (see the sketch below).

2/6 – Matrix algebra (basic sum/product operations, inverse): 2.1, 2.2.
- Review: applications of systems of linear equations.
- Matrix operations: sum, product, transpose. Basic properties of matrix operations.
- Inverse. Example with 2 × 2 matrices. Uniqueness of the inverse, if it exists.
- Elementary matrices: relation with I, and invertibility.

2/11 – Inverse: 2.3.
- Review: matrix algebra, elementary matrices.
- Lemma: A, B invertible ⇒ AB invertible. Corollary: a product of elementary matrices is invertible.
- Theorem: A invertible ⇔ Ax = b always has a solution ⇔ A is row equivalent to I.
- Scholium 1: if A is row reduced to I via E1, ..., Ep, then A^{-1} = Ep ··· E1 (equivalently, A = E1^{-1} ··· Ep^{-1}).
- Scholium 2: algorithm to calculate A^{-1}.
- Theorem 8, page 112: invertibility in terms of algebra, row reduction, and linear transformations.

2/13 – NO CLASS.

2/18 – LU decomposition, vector spaces, subspaces: 2.5, 4.1.
- Review: row reduction via matrix products; Theorem 8, page 112.
- LU decomposition. Application: computers use the LU decomposition to solve systems of linear equations.
- Vector space: definition. Example 1, page 191: R^n. Example 4, page 192: Pn = real polynomials of degree ≤ n. Counterexample: real polynomials of degree exactly n. Example 5, page 192: {f : [0, 1] → R}. Counterexample: {f : [0, 1] → R : f(0) = 1}.
- Subspace: definition. Example: subspaces of R^2.
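
A small SciPy sketch of the LU idea from 2/18 (illustrative matrix; scipy.linalg.lu returns a permutation P with A = PLU, since pivoting may reorder rows):

    import numpy as np
    from scipy.linalg import lu, solve_triangular

    A = np.array([[2.0, 1.0, 1.0],
                  [4.0, 3.0, 3.0],
                  [8.0, 7.0, 9.0]])
    b = np.array([1.0, 2.0, 3.0])

    P, L, U = lu(A)          # A = P @ L @ U, L lower and U upper triangular
    # Solving Ax = b now costs only two triangular substitutions:
    y = solve_triangular(L, P.T @ b, lower=True)   # forward substitution
    x = solve_triangular(U, y)                     # back substitution
    assert np.allclose(A @ x, b)

Once L and U are stored, each new right-hand side b costs only the two triangular solves, which is why computers factor once and reuse.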

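And the difference-equation sketch promised in the 2/4 outline, with the Fibonacci recursion written as x_{k+1} = A x_k:

    import numpy as np

    # Fibonacci as x_{k+1} = A x_k, with state x_k = (F_{k+1}, F_k).
    A = np.array([[1, 1],
                  [1, 0]])
    x = np.array([1, 0])      # x_0 = (F_1, F_0)
    for _ in range(10):
        x = A @ x
    print(x[1])               # F_10 = 55
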
2/20 – Subspaces, linear independence, bases: 4.1 (2.8), 4.3 (1.7).
- Review: vector spaces and subspaces.
- Example 6, page 193: the zero subspace. Example 7, page 193: Pk ⊆ Pn when k ≤ n. Example: subspaces of R^n.
- Subspace spanned by a set. Example: spans in R^2, R^3.
- Linear independence: definition. Example: three vectors in R^2.
- Fact: in R^n, linear independence of v1, ..., vp is equivalent to Ax = 0 having only the trivial solution, where A = [v1 ··· vp].

2/25 – Linear independence, basis, dimension: 4.3, 4.5 (2.9).
- Review: span, linear independence.
- Example: {1, x, ..., x^n} is linearly independent in Pn.
- Fact: if S is linearly dependent, then there is v ∈ S s.t. span(S) = span(S \ {v}).
- Basis: definition. Example 6, page 209. Spanning set theorem: Theorem 5, page 210.
- Theorem: if V has a basis with n elements, then every basis of V has n elements. (We will prove this theorem next class.)
- Dimension: definition. Examples: dim R^n = n, dim Pn = n + 1.
- Properties of bases: Theorems 11 and 12, page 227.

2/27 – Coordinates, linear transformations: 4.4, 4.2 (1.8).
- Review: span, linear independence, basis, dimension.
- Conclude the proof of Theorems 11 and 12, page 227.
- Fact: p > n vectors in R^n are linearly dependent.
- Unique representation theorem: Theorem 7, page 216. Coordinates in a basis B. Thus vector spaces look like R^n.
- Example: coordinates in the basis {1, x, ..., x^n} of Pn.
- Linear transformations: definition. Example: a matrix A of size m × n defines T : R^n → R^m. Example: if dim V = n, then a basis B defines T : V → R^n.
- Null space. Lemma: the null space is a subspace of the domain. Example: the null space of a matrix A is the set of solutions of Ax = 0.

3/4 – Linear transformations: 4.2.
- Review: coordinates, linear transformations, null space.
- Lemma: the null space of a matrix has dimension equal to the number of non-pivot columns.
- Range. Example: the range of a matrix A. Lemma: the range is a subspace of the codomain.
- Exercise: what are the null space and range of a coordinate map T : V → R^n?
- Theorem: if V has a basis with n elements, then every basis of V has n elements. (Proof.)
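
A sketch for the 3/4 material (illustrative matrix with a dependent column; scipy.linalg.null_space returns an orthonormal basis of the null space):

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])     # third column = first + second
    print(np.linalg.matrix_rank(A))     # 2 pivot columns
    N = null_space(A)                   # basis for {x : Ax = 0}
    print(N.shape[1])                   # 1 = number of non-pivot columns
    # Any nonzero null-space vector certifies a dependence among the columns:
    assert np.allclose(A @ N[:, 0], 0)

Rank 2 plus nullity 1 accounts for all 3 columns, anticipating the rank theorem coming on 3/7.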

3/6 – Rank, matrix of a linear transformation: 4.6 (2.9), 1.9.
- Review: null space, range.
- Rank: definition.
- Fact: if A has column vectors v1, ..., vn, then range(A) = span(v1, ..., vn).
- Lemma: the rank of a matrix is equal to the number of pivot columns.
- The matrix of a linear transformation T : V → W (wrt bases of V, W).
- Application in computer graphics: the matrices of dilation, shear and rotation.

3/7 – Change of basis, stochastic matrices: 4.7, 4.9.
- Review: rank, matrix of a linear transformation.
- The rank theorem: Theorem 14, page 233.
- Change of basis: the matrix P satisfying [v]_R = P [v]_S. Fact: if [v]_R = P [v]_S, then [v]_S = P^{-1} [v]_R.
- Change of basis: Theorem 15, page 240. Example: check Homework 4.
- Stochastic matrix P and its associated graph. Examples: economics, mathematical biology, queueing theory, genetics, Google PageRank, random walks.
- Stationary vector: a probability vector x s.t. Px = x.
- Perron-Frobenius theorem (first version): let P be stochastic with positive entries. Then P has a unique stationary vector.
- Perron-Frobenius theorem (second version): let P be stochastic s.t. some P^k has positive entries. Then P has a unique stationary vector.
- Examples: in economics, the stationary vector represents an equilibrium; in a random walk, the stationary vector represents the frequency of visits to the vertices.

3/11 – Markov chains, applications to computer graphics: 4.9, 2.7.
- Review: stochastic matrices and their graphs, examples, Perron-Frobenius theorem.
- Markov chain: {P^n x}, where P is a stochastic matrix and x a probability vector.
- Perron-Frobenius theorem: let P be stochastic s.t. P^k has positive entries for some k > 0. Then P has a unique stationary probability vector x, and for any other probability vector y it holds that P^n y → x (see the first sketch below).
- In economics, mathematical biology, queueing theory, genetics: x = equilibrium. In Google PageRank, random walks: x = frequency of visits to vertices.
- Computer graphics: dilation, shearing and rotation are linear, but translations are not.
- Homogeneous coordinates: (x, y) ∈ R^2 ↔ (x, y, 1) ∈ R^3 (see the second sketch below).
- Translations on R^2 become linear transformations on R^3. Dilation, shearing, rotation and translation are all linear in homogeneous coordinates.

3/13: Midterm 1.

3/18 and 3/20: Spring break.
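
First sketch, for the Perron-Frobenius convergence from 3/7 and 3/11 (the column-stochastic matrix is illustrative):

    import numpy as np

    # Column-stochastic matrix with positive entries (columns sum to 1).
    P = np.array([[0.5, 0.2, 0.3],
                  [0.3, 0.6, 0.3],
                  [0.2, 0.2, 0.4]])

    y = np.array([1.0, 0.0, 0.0])   # any probability vector works
    for _ in range(100):
        y = P @ y                   # P^n y -> stationary vector x
    assert np.allclose(P @ y, y)    # Py = y: y is (numerically) stationary
    print(y)

This power iteration is, in essence, how the PageRank vector is computed.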

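Second sketch, homogeneous coordinates from 3/11 (translation vector chosen arbitrarily):

    import numpy as np

    # Translation by (tx, ty) is not linear on R^2, but in homogeneous
    # coordinates (x, y, 1) it becomes multiplication by a 3x3 matrix.
    tx, ty = 2.0, -1.0
    T = np.array([[1.0, 0.0, tx],
                  [0.0, 1.0, ty],
                  [0.0, 0.0, 1.0]])
    p = np.array([3.0, 4.0, 1.0])   # the point (3, 4)
    print(T @ p)                    # [5. 3. 1.], i.e. the point (5, 3)
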
3/25: Eigenvalues, eigenvectors, characteristic equation – 5.1, 5.2, 5.3.
- Example: population dynamics.
- Definition: eigenvalue and eigenvector. Example: triangular matrices. How to find them in general?
- Crash course in determinants: Theorem 3, page 275.
- Characteristic equation: its roots are the eigenvalues.
- Eigenvectors: row reduce the homogeneous system (A − λI)v = 0.

3/27: Characteristic equation, complex eigenvalues – 5.3, 5.5.
- Review: eigenvalue, eigenvector, characteristic equation.
- Definition: diagonalizable matrix. If P is the change-of-basis matrix, then A = PDP^{-1}; the converse also holds.
- Theorem 6, page 284: if A has n distinct eigenvalues, then it is diagonalizable. Counterexample to the converse: a diagonal matrix with equal entries.
- Theorem 7, page 285. Example 3, page 283.
- Complex eigenvalues. If λ = a − bi is an eigenvalue with eigenvector v + wi, then A preserves the plane span(v, w), on which it acts according to the matrix
  [ a  −b ]
  [ b   a ]
  = rotation composed with dilation.

4/1: Discrete dynamical systems – 5.6.
- Review: diagonalization (equivalence, sufficient condition, necessary and sufficient conditions), complex eigenvalues.
- Remember population dynamics: update of the population according to v_{k+1} = A v_k.
- Other instances: chaos theory (butterfly effect), control theory, cosmology.
- If A is diagonalizable, then v_k has a precise description in terms of v_0 and the eigenvalues/eigenvectors of A (see the first sketch below).
- Example: all possibilities for a 2 × 2 matrix, with the notions of saddle, attractor, repeller.

4/3: Applications to differential equations, applications to graphs (additional topic) – 5.7.
- Differential equations: x'' − 3x' + 2x = 0 leads to x' = Ax. More generally, systems of linear differential equations lead to x' = Ax.
- Solution set: {x : x' = Ax}. Superposition principle: the solution set is a subspace.
- Fact: the dimension of the solution set equals the order n of A.
- How to get n linearly independent solutions? If A is diagonal, it is easy: the system is decoupled. If not, look for eigenvalues/eigenvectors. They define eigenfunctions.
- In the example x'' − 3x' + 2x = 0: x = c1 e^t + c2 e^{2t}.
- Graphs: G = (V, E), where V = vertices and E = non-oriented edges. How to count L(k) = # of closed loops of length k?
- Adjacency matrix A. Property: if A^k = (a^k_ij), then a^k_ij = # of paths of length k from i to j.
- Definition: trace of a matrix. L(k) = trace(A^k).
- Theorem: every symmetric matrix is diagonalizable.
- Lemma: if B = PCP^{-1}, then trace(B) = trace(C).
- Conclusion: L(k) = λ1^k + ··· + λn^k (see the second sketch below).
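
First sketch, for 4/1: when A is diagonalizable, v_k = A^k v_0 has the closed form c1 λ1^k u1 + ··· + cn λn^k un (illustrative matrix with eigenvalues 0.8 and 0.5, so the origin is an attractor):

    import numpy as np

    A = np.array([[0.65, 0.15],
                  [0.15, 0.65]])          # eigenvalues 0.8 and 0.5
    lams, P = np.linalg.eig(A)            # A = P D P^{-1}
    v0 = np.array([2.0, -1.0])
    c = np.linalg.solve(P, v0)            # coordinates of v0 in the eigenbasis

    # Closed form: v_k = c1 * lam1^k * u1 + c2 * lam2^k * u2.
    k = 20
    v_closed = P @ (c * lams**k)
    v_iter = np.linalg.matrix_power(A, k) @ v0
    assert np.allclose(v_closed, v_iter)  # both tend to (0, 0) as k grows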

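Second sketch, for the loop count L(k) = trace(A^k) from 4/3, on a triangle graph (an illustrative choice):

    import numpy as np

    A = np.array([[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]])             # adjacency matrix of a triangle

    k = 3
    print(np.trace(np.linalg.matrix_power(A, k)))   # 6 closed loops of length 3

    # Same count from the eigenvalues (A symmetric => diagonalizable):
    lams = np.linalg.eigvalsh(A)                    # eigenvalues 2, -1, -1
    print(round(float(sum(lams**k))))               # 2^3 + (-1)^3 + (-1)^3 = 6
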
4/8: Inner products, length – 6.1 (6.7).
- Goal: understand angles in vector spaces.
- Motivation: shortest distance, projection, optimization in computer graphics.
- Example in R^n: u · v = Σ_{i=1}^n u_i v_i, and u · v = ‖u‖ ‖v‖ cos∠(u, v).
- Definition: inner products ⟨·,·⟩. Examples for R^n, Pn, and {f : [a, b] → R : f is continuous}.
- Length of vectors: ‖v‖ = √⟨v, v⟩.
- Properties: ‖v‖ = 0 ⇔ v = 0; Pythagoras' theorem.

4/10: Orthogonality, orthogonal projections – 6.2, 6.3.
- Review: inner products, length.
- Definition: orthogonality.
- Orthogonal projection: proj_L(v). Minimization property: proj_L(v) minimizes the distance from v to L.
- What if we want to project onto a plane, or even onto a subspace?
- Definition: orthogonal set. Theorem 8, page 348 (expression for the orthogonal projection). Theorem 9, page 350 (minimization property).
- Definition: orthonormal set. Special case of the orthogonal projection wrt an orthonormal basis. Theorem 10, page 351 (matrix form of the orthogonal projection).

4/15: Gram-Schmidt procedure, least-squares problem – 6.4, 6.5.
- Review: projection onto a line, projection onto a subspace.
- Gram-Schmidt for two/three vectors. Gram-Schmidt for an arbitrary number of vectors.
- QR decomposition: invertible case/general case.
- Least-squares problem: get the "best" approximate solution of Ax = b.
- If Ax̂ = projection of b onto range(A), then ‖Ax̂ − b‖ ≤ ‖Ax − b‖ for all x.
- Origin of the method: the development of celestial mechanics for use in navigation.

4/17: Least-squares problem, linear regression – 6.5, 6.6.
- Review: Gram-Schmidt process, least-squares problem.
- Examples: linear regression (biology, social sciences, economics, actuarial science), quadratic regression (computer graphics).
- Goal: minimize ‖Ax − b‖. Normal equation: A^T A x = A^T b.
- Uniqueness theorem (Theorem 14, page 363). Solution via the QR decomposition (Theorem 15, page 365).
- Linear regression: fit a line through the data points (x1, y1), ..., (xn, yn).
- Goal: minimize ‖Xβ − y‖, where β = (β0, β1) defines the line y = β0 + β1 x.
- Example: fit a line through (1, 1.5), (2, 3), (3, 2.5), (4, 2). The result is β = (2, 0.1).
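
A numerical check of the 4/17 example via the normal equation:

    import numpy as np

    # Fit y = b0 + b1*x through (1, 1.5), (2, 3), (3, 2.5), (4, 2).
    X = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0],
                  [1.0, 4.0]])    # columns: constant term, x
    y = np.array([1.5, 3.0, 2.5, 2.0])
    beta = np.linalg.solve(X.T @ X, X.T @ y)   # normal equation X^T X b = X^T y
    print(beta)                                # [2.  0.1]: the line y = 2 + 0.1x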

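The same least-squares problem via the QR route of Theorem 15 (np.linalg.qr factors by Householder reflections rather than classical Gram-Schmidt, but yields the same Q and R up to signs):

    import numpy as np

    X = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0],
                  [1.0, 4.0]])
    y = np.array([1.5, 3.0, 2.5, 2.0])

    Q, R = np.linalg.qr(X)               # X = QR, Q with orthonormal columns
    beta = np.linalg.solve(R, Q.T @ y)   # R is triangular: cheap to solve
    print(beta)                          # again [2.  0.1]
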
4/22: General regression and applications to computer graphics, symmetric matrices – 6.6, 7.1.
- Review: least-squares problem, normal equation, linear regression.
- Interpolation using quadratic equations. Example: for (1, 1), (2, 4), (3, 3), (4, 1), the solution is −3.75 + 6.15x − 1.25x^2.
- Interpolation using a general set of functions.
- Symmetric matrices. Example 2, page 395.
- If A is orthogonally diagonalizable, then A is symmetric. Theorem: the converse is true, i.e. if A is symmetric then A has an orthonormal basis of eigenvectors.

4/24: Quadratic forms, constrained optimization – 7.2, 7.3.
- Review: diagonalization of symmetric matrices.
- Quadratic expressions lead to the quadratic form Q(x) = x^T A x, where A is symmetric.
- Example: 7x^2 + 3y^2 + 6xy = (x y) [7 3; 3 3] (x y)^T.
- Constrained optimization: max_{x^T x = 1} Q(x), or min_{x^T x = 1} Q(x).
- Write A = PDP^T and x = Py: max_{x^T x = 1} Q(x) = max_{y^T y = 1} y^T D y (same for min).
- Theorem: max_{x^T x = 1} Q(x) = largest eigenvalue of A, min_{x^T x = 1} Q(x) = smallest eigenvalue of A (see the sketch after the homework lists below).
- Positive, negative and indefinite quadratic forms.

4/29: Review.

5/1: Midterm 2.

5/6: Min-max theorem (additional topic).

5/8: Affine geometry, barycentric coordinates – 8.1, 8.2.

5/13: Review.

Homework

Homework 1: due 2/6
- 1.1: 8, 25
- 1.2: 3, 4, 29, 31
- 1.3: 13, 14, 24
- 1.4: 17, 18
- 1.5: 7, 8, 39

Homework 2: due 2/20
- 1.6: 4, 7
- 2.1: 10, 15
- 2.2: 10, 23
- 2.3: 11
- 2.5: 9, 19, 24

Homework 3: due 2/27
- 1.7: 21
- 4.1: 7, 8, 32, 33
- 4.3: 15, 21, 31
- 4.5: 19, 21

Homework 4: due 3/11
- 1.8: 30
- 1.9: 11, 23
- 4.2: 6, 25, 33
- 4.3: 31
- 4.6: 4, 34
- 4.7: 7

Set-list for Midterm 1
- 1.4: 23
- 1.5: 24
- 1.6: 1
- 1.8: 23
- 2.3: 12
- 2.7: 10
- 4.2: 8, 31
- 4.3: 32
- 4.6: 10, 13
- 4.7: 1

Homework 5: due 3/27
- 2.7: 11, 12
- 4.9: 17
- Chapter 4 – Supplementary exercises: 1

Homework 6: due 4/8
- 5.1: 13
- 5.2: 15, 27
- 5.3: 5, 22
- 5.5: 13, 26
- 5.6: 5, 9, 17ab

Homework 7: due 4/17
- 5.7: 1, 2
- 6.1: 2, 20
- 6.2: 10, 23
- 6.3: 2, 13
- 6.7: 9, 25

Homework 8: due 4/24

- 6.4: 9, 13, 17, 22
- 6.5: 5, 15, 18
- 6.6: 4, 19, 20

Homework 9: due 5/6
- 7.1: 13, 19, 26
- 7.2: 7, 23, 24
- 7.3: 11

Set-list for Midterm 2
- 5.1: 26
- 5.6: 4
- 5.7: 3
- Chapter 5 – Supplementary exercises: 1
- 6.2: 24
- 6.4: 10
- 6.5: 6
- 6.6: 1
- Chapter 6 – Supplementary exercises: 1 (except item l)
- 7.2: 5, 9
- 7.3: 7
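
Finally, the sketch promised in the 4/24 outline: on the unit circle x^T x = 1, the quadratic form Q attains its max/min at the largest/smallest eigenvalue of A (matrix from the lecture example):

    import numpy as np

    # Q(x) = x^T A x with A = [[7, 3], [3, 3]], i.e. Q = 7x^2 + 6xy + 3y^2.
    A = np.array([[7.0, 3.0],
                  [3.0, 3.0]])
    lams = np.linalg.eigvalsh(A)        # symmetric matrix: real eigenvalues

    # Sample Q on the unit circle x^T x = 1:
    t = np.linspace(0.0, 2.0 * np.pi, 1000)
    xs = np.vstack([np.cos(t), np.sin(t)])
    Q = np.einsum('ij,ij->j', xs, A @ xs)

    print(Q.max(), lams.max())          # both close to 5 + sqrt(13) = 8.606...
    print(Q.min(), lams.min())          # both close to 5 - sqrt(13) = 1.394...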
