MATRICES

ROHAN RAMCHAND

We will start by defining the concept of a matrix.

Definition 1. Let $F$ be a field and let $m, n \in \mathbb{N}_{\ge 0}$. Then an $m \times n$ $F$-matrix is a table consisting of $m$ rows and $n$ columns with elements in $F$.

We defined in previous sections (see Linear Transformations) the matrix space $(\mathrm{Mat}_{m \times n}(F), +, \alpha)$, i.e., the set of all $m \times n$ $F$-matrices together with matrix addition and scalar multiplication $\alpha$. Furthermore, $\dim \mathrm{Mat}_{m \times n}(F) = mn$. The proof of this statement follows.

Proof. We begin by defining the standard basis of $\mathrm{Mat}_{m \times n}(F)$ as the set of matrices $E = \{E_{1,1}, \dots, E_{m,n}\}$, where $E_{k,l} = (e_{i,j})_{1 \le i \le m,\, 1 \le j \le n}$ with $e_{i,j} = \delta_{(k,l)}(i,j)$; that is, $E_{k,l}$ has a $1$ in position $(k,l)$ and $0$ elsewhere. For any $A \in \mathrm{Mat}_{m \times n}(F)$ we have $A = \sum_{i,j} \alpha_{i,j} E_{i,j}$ with $\alpha_{i,j} = A_{i,j}$; therefore $E$ spans $\mathrm{Mat}_{m \times n}(F)$. Furthermore, if $\sum_{i,j} \alpha_{i,j} E_{i,j} = 0$, then $\alpha_{i,j} = 0$ for all $i, j$, so $E$ is linearly independent. Therefore $E$ is a basis of length $mn$; hence $\dim \mathrm{Mat}_{m \times n}(F) = mn$ and the proof is complete.

We now define one final operation on matrices: matrix multiplication.

Definition 2. Let $A \in \mathrm{Mat}_{m \times r}(F)$ and $B \in \mathrm{Mat}_{r \times n}(F)$. Then the product of $A$ and $B$, $C = A \cdot B \in \mathrm{Mat}_{m \times n}(F)$, is defined by
\[
c_{i,j} = \sum_{k=1}^{r} a_{i,k} b_{k,j}.
\]

Matrices and Linear Transformations

The following two properties of matrices are used in the proofs that follow and are stated without proof.

Distributivity: Let $A \in \mathrm{Mat}_{m \times r}(F)$ and $B_1, B_2 \in \mathrm{Mat}_{r \times n}(F)$. Then $A \cdot (B_1 + B_2) = A \cdot B_1 + A \cdot B_2$.

Scalar multiplication: Let $A \in \mathrm{Mat}_{m \times r}(F)$, $B \in \mathrm{Mat}_{r \times n}(F)$, and $\lambda \in F$. Then $A \cdot (\lambda \cdot B) = \lambda \cdot (A \cdot B)$.

We now define the following map.

Definition 3. Let $A \in \mathrm{Mat}_{m \times n}(F)$. Define $T_A : \mathrm{Mat}_{n \times r}(F) \to \mathrm{Mat}_{m \times r}(F)$ by $T_A(B) = A \cdot B$.

Theorem 1. $T_A$ is a linear transformation.

Proof. The two properties of matrix multiplication above are exactly additivity and homogeneity of $T_A$, so $T_A$ is a linear transformation.

We will, in particular, study the case $r = 1$, defined below.

Definition 4. Let $A \in \mathrm{Mat}_{m \times r}(F)$ with $r = 1$. Then $A$ is a column vector. In particular, the set of $m$-dimensional column vectors is denoted $\mathrm{col}_m(F)$. Furthermore, $\mathrm{col}_m(F)$ can be identified with $F^m$, the former consisting of column vectors and the latter (canonically) of row vectors.

Therefore, for $r = 1$ we may regard $T_A : \mathrm{Mat}_{n \times 1}(F) \to \mathrm{Mat}_{m \times 1}(F)$ as a map $F^n \to F^m$.

We will now revisit matrix multiplication, starting with the product of a matrix and a column vector. Let
\[
A = \begin{pmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{pmatrix}
\quad \text{and} \quad
\vec{x} = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}.
\]
Then
\[
T_A(\vec{x}) = \begin{pmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{pmatrix}
\begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}
= \begin{pmatrix} a_{11}x_1 + a_{12}x_2 + \dots + a_{1n}x_n \\ a_{21}x_1 + a_{22}x_2 + \dots + a_{2n}x_n \\ \vdots \\ a_{m1}x_1 + a_{m2}x_2 + \dots + a_{mn}x_n \end{pmatrix}
= x_1 \begin{pmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{pmatrix}
+ x_2 \begin{pmatrix} a_{12} \\ a_{22} \\ \vdots \\ a_{m2} \end{pmatrix}
+ \dots
+ x_n \begin{pmatrix} a_{1n} \\ a_{2n} \\ \vdots \\ a_{mn} \end{pmatrix}.
\]

This is one of the more important results of this course: the product of a matrix and a vector is a linear combination of the columns of the matrix, where the coefficients in the combination are the entries of the vector. In particular, each $T_A$ is itself a linear transformation $F^n \to F^m$, so the assignment $T : A \mapsto T_A$ sends $\mathrm{Mat}_{m \times n}(F)$ to $\mathrm{Hom}(F^n, F^m)$.
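As a quick numerical illustration of this result (an added sketch, not part of the original notes), the following NumPy snippet checks, for one small, arbitrarily chosen matrix and vector, that $A\vec{x}$ equals the linear combination of the columns of $A$ with the entries of $\vec{x}$ as coefficients.

```python
import numpy as np

# An arbitrary 3x2 real matrix A and a vector x in R^2 (illustrative values only).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([7.0, -1.0])

# T_A(x) computed as an ordinary matrix-vector product.
Ax = A @ x

# The same vector computed as a linear combination of the columns of A,
# with the entries of x as coefficients: x_1 * A_1 + x_2 * A_2.
combo = sum(x[j] * A[:, j] for j in range(A.shape[1]))

assert np.allclose(Ax, combo)
print(Ax)     # [ 5. 17. 29.]
print(combo)  # same vector
```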
We now state an important theorem regarding the map $T$.

Theorem 2. $T$ is a bijective linear transformation; in other words, $\mathrm{Mat}_{m \times n}(F)$ and $\mathrm{Hom}(F^n, F^m)$ are isomorphic.

Proof. Let $A, A_1, A_2 \in \mathrm{Mat}_{m \times n}(F)$ and $\lambda \in F$. Then
\[
T_{A_1 + A_2}(\vec{x}) = (A_1 + A_2)\vec{x} = A_1\vec{x} + A_2\vec{x} = T_{A_1}(\vec{x}) + T_{A_2}(\vec{x}),
\]
\[
T_{\lambda A}(\vec{x}) = (\lambda A)\vec{x} = \lambda(A\vec{x}) = \lambda T_A(\vec{x}).
\]
Therefore $T$ is a linear transformation.

Let $M = T^{-1} : \mathrm{Hom}(F^n, F^m) \to \mathrm{Mat}_{m \times n}(F)$. For a linear transformation $A : F^n \to F^m$, $M_A$ is the matrix, with columns $\vec{m}_1, \dots, \vec{m}_n$, such that
\[
M_A \cdot \vec{x} = A(\vec{x}) = \sum_{i=1}^{n} x_i \vec{m}_i.
\]
By definition, the standard basis $(e_1, e_2, \dots, e_n)$, with $e_i = (\delta_i(1), \delta_i(2), \dots, \delta_i(n))$, spans $F^n$; therefore
\[
A(\vec{x}) = A(x_1 e_1 + \dots + x_n e_n) = \sum_{i=1}^{n} x_i A(e_i).
\]
Therefore $M_A = (A(e_1), \dots, A(e_n))$, and $M_A$ is defined for every linear transformation $A$; hence $T$ has an inverse and is therefore an isomorphism.

Note that we did not actually prove that $M$ is a linear transformation; the proof of this statement is left as an exercise.

This concept is illustrated with the following example.

Example. Let $A : \mathbb{R}^2 \to \mathbb{R}^2$ be given by
\[
A(x, y) = \begin{pmatrix} x + y \\ x - y \end{pmatrix}.
\]
Then
\[
M_A = (A(1, 0), A(0, 1)) = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.
\]

It has been stated on multiple occasions that composition of linear transformations is equivalent to matrix multiplication; this theorem is restated formally and proven below.

Theorem 3. Let $A \in \mathrm{Mat}_{m \times r}(F)$ and $B \in \mathrm{Mat}_{r \times n}(F)$, so that $AB \in \mathrm{Mat}_{m \times n}(F)$, and let $T_A : F^r \to F^m$, $T_B : F^n \to F^r$, and $T_{AB} : F^n \to F^m$. Then $T_{AB} = T_A \circ T_B$.

Proof. As a convenience, we denote the $i$th column of a matrix $M$ by $M_i$. Then
\begin{align*}
T_{AB}(\vec{x}) &= (A \cdot B) \cdot \vec{x} \\
&= \sum_{i=1}^{n} x_i (A \cdot B)_i \\
&= \sum_{i=1}^{n} x_i (A \cdot B_i) && \text{(since $(A \cdot B)_i = A \cdot B_i$)} \\
&= \sum_{i=1}^{n} A(x_i B_i) \\
&= A \cdot \sum_{i=1}^{n} x_i B_i \\
&= A \cdot (B \cdot \vec{x}) \\
&= (T_A \circ T_B)(\vec{x}).
\end{align*}

Armed with this theorem, we can prove another property of matrix multiplication: associativity.

Theorem 4. Let $A \in \mathrm{Mat}_{m \times r}(F)$, $B \in \mathrm{Mat}_{r \times l}(F)$, and $C \in \mathrm{Mat}_{l \times n}(F)$ be matrices. Then $(A \cdot B) \cdot C = A \cdot (B \cdot C)$.

Proof. Since $T$ is injective (Theorem 2), it suffices to show that

(1) $T_{(A \cdot B) \cdot C} = T_{A \cdot (B \cdot C)}$.

By Theorem 3, (1) is equivalent to

(2) $T_{A \cdot B} \circ T_C = T_A \circ T_{B \cdot C}$,

which, again by Theorem 3, is equivalent to

(3) $(T_A \circ T_B) \circ T_C = T_A \circ (T_B \circ T_C)$.

By associativity of composition of functions, (3) is true and the proof is complete.
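As a numerical sanity check of Theorems 3 and 4 (an added sketch, not part of the original notes), the following NumPy snippet verifies on small randomly chosen matrices, with arbitrary illustrative shapes, that applying $T_B$ and then $T_A$ agrees with $T_{AB}$, and that $(AB)C = A(BC)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary small real matrices with compatible shapes:
# A is m x r, B is r x l, C is l x n (illustrative sizes only).
m, r, l, n = 2, 3, 4, 5
A = rng.integers(-3, 4, size=(m, r)).astype(float)
B = rng.integers(-3, 4, size=(r, l)).astype(float)
C = rng.integers(-3, 4, size=(l, n)).astype(float)
x = rng.integers(-3, 4, size=l).astype(float)

# Theorem 3: T_{AB}(x) = (T_A o T_B)(x), i.e. (A B) x == A (B x).
assert np.allclose((A @ B) @ x, A @ (B @ x))

# Theorem 4: associativity of matrix multiplication, (A B) C == A (B C).
assert np.allclose((A @ B) @ C, A @ (B @ C))

print("composition and associativity checks passed")
```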
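Finally, returning to the construction $M_A = (A(e_1), \dots, A(e_n))$ from the proof of Theorem 2 and to the example above, here is a short NumPy sketch (an added illustration, not part of the original notes) that recovers the matrix of the map $A(x, y) = (x + y,\, x - y)$ by evaluating it on the standard basis vectors.

```python
import numpy as np

def A(v):
    """The example linear map A : R^2 -> R^2, A(x, y) = (x + y, x - y)."""
    x, y = v
    return np.array([x + y, x - y])

n = 2
# The columns of M_A are the images of the standard basis vectors e_1, ..., e_n.
basis = np.eye(n)
M_A = np.column_stack([A(basis[:, i]) for i in range(n)])

print(M_A)
# [[ 1.  1.]
#  [ 1. -1.]]

# Sanity check: M_A @ v agrees with A(v) for an arbitrary test vector.
v = np.array([2.0, 5.0])
assert np.allclose(M_A @ v, A(v))
```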