Column Rank = Row Rank, or rk(A) = rk(A^T)
This result forms a very important part of the fundamental theorem of linear algebra. We present two proofs of this result. The first is short and uses only basic properties of linear combinations of vectors. The second is an elegant argument using orthogonality, based on Mackiw, G. (1995), "A Note on the Equality of the Column and Row Rank of a Matrix," Mathematics Magazine, Vol. 68, No. 4. Interestingly, the first proof begins with a basis for the column space, while the second builds from a basis for the row space. The first proof is valid for matrices over any field of scalars, while the second works only for inner-product spaces; of course, both apply to real and complex Euclidean spaces. Both proofs are also easily adapted to the case where A is a linear transformation.
First proof: Let A be an m × n matrix whose column rank is r. Therefore, the dimension of the column space of A is r. Let c_1, c_2, …, c_r be any basis for the column space of A and place these as column vectors to form the m × r matrix C = (c_1, c_2, …, c_r). Then each column vector of A is a linear combination of the r columns of C, so from the definition of matrix multiplication there exists an r × n matrix R such that A = CR. (The (i, j)-th element of R is the coefficient of c_i when the j-th column of A is expressed as a linear combination of the r columns of C. Also see rank factorization.) Now, since A = CR, every row vector of A is a linear combination of the row vectors of R. (The (i, j)-th element of C is the coefficient of the j-th row vector of R when the i-th row of A is expressed as a linear combination of the r rows of R.) This means that the row space of A is contained in the row space of R, so the row rank of A ≤ row rank of R. But R has only r rows, so the row rank of R ≤ r = column rank of A. This proves that row rank of A ≤ column rank of A. Now apply this result to the transpose of A to get the reverse inequality: column rank of A = row rank of A^T ≤ column rank of A^T = row rank of A. This proves that the column rank of A equals the row rank of A. See a very similar but more direct proof of rk(A) = rk(A^T) under rank factorization. QED
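To make the first proof concrete, here is a minimal numerical sketch (not part of the original argument) that builds a factorization A = CR for one assumed 3 × 4 matrix and checks that the row and column ranks agree; the particular matrix and the use of NumPy's lstsq and matrix_rank routines are illustrative assumptions.

import numpy as np

# Small 3 x 4 matrix: the third column is c1 + c2 and the fourth is c1 + 2*c2,
# so the column rank is r = 2.
A = np.array([[1., 2., 3., 5.],
              [0., 1., 1., 2.],
              [2., 3., 5., 8.]])

# C holds a basis of the column space (here the first two columns of A).
C = A[:, :2]                                   # shape (3, 2)

# R expresses every column of A in terms of the columns of C; least squares
# recovers it exactly because each column of A lies in the column space of C.
R, *_ = np.linalg.lstsq(C, A, rcond=None)      # shape (2, 4)

assert np.allclose(C @ R, A)                   # the factorization A = CR
print(np.linalg.matrix_rank(A))                # 2  (column rank of A)
print(np.linalg.matrix_rank(A.T))              # 2  (row rank of A)
print(np.linalg.matrix_rank(R))                # 2  (row rank of R, at most r)

All three printed ranks equal r = 2, matching the chain of inequalities in the proof.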
Second proof: Let A be an m × n matrix whose row rank is r. Therefore, the dimension of the row space of A is r, and suppose that x_1, x_2, …, x_r is a basis of the row space of A. We claim that the vectors Ax_1, Ax_2, …, Ax_r are linearly independent. To see why, consider the linear homogeneous relation involving these vectors with scalar coefficients c_1, c_2, …, c_r:

0 = c_1 Ax_1 + c_2 Ax_2 + … + c_r Ax_r = A(c_1 x_1 + c_2 x_2 + … + c_r x_r) = Av,

where v = c_1 x_1 + c_2 x_2 + … + c_r x_r. Two facts hold: (a) v is a linear combination of the basis vectors x_i, so v lies in the row space of A; and (b) since Av = 0, the vector v is orthogonal to every row vector of A and hence to every vector in the row space of A. Together, (a) and (b) imply that v is orthogonal to itself, so v = 0. Because the x_i are linearly independent, it follows that c_1 = c_2 = … = c_r = 0. Hence Ax_1, Ax_2, …, Ax_r are r linearly independent vectors in the column space of A, so the column rank of A is at least r, the row rank of A. Applying this result to the transpose of A gives the reverse inequality, proving that the column rank of A equals the row rank of A. QED
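The key claim of the second proof, that the images Ax_1, …, Ax_r of a row-space basis are linearly independent, can be spot-checked numerically. The sketch below is only an illustration under the assumption that an orthonormal row-space basis read off the SVD is acceptable; it is not how the proof itself proceeds.

import numpy as np

# The same kind of illustrative matrix as above: row rank (and column rank) 2.
A = np.array([[1., 2., 3., 5.],
              [0., 1., 1., 2.],
              [2., 3., 5., 8.]])

# An orthonormal basis x_1, ..., x_r of the row space of A, taken from the SVD.
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))        # row rank of A (tolerance is an assumption)
X = Vt[:r].T                      # shape (4, r): columns are x_1, ..., x_r

# The images A x_1, ..., A x_r are r linearly independent vectors in the
# column space of A, so their rank equals r.
AX = A @ X                        # shape (3, r)
print(r, np.linalg.matrix_rank(AX), np.linalg.matrix_rank(A))   # 2 2 2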
Finally, we provide a proof of the related result rk(A) = rk(A*), where A* is the conjugate (Hermitian) transpose of A. When the entries of A are real numbers, this result becomes rk(A) = rk(A^T) and constitutes another proof that row rank = column rank. For complex matrices, however, rk(A) = rk(A*) is not equivalent to row rank = column rank, and one of the two proofs above should be used instead. This proof is short, elegant, and makes use of the null space.
Third proof: Let A be an m × n matrix, and define rk(A) to mean the column rank of A. First note that A*Ax = 0 if and only if Ax = 0. This is elementary linear algebra – one direction is trivial; the other follows from

A*Ax = 0  ⇒  x*A*Ax = 0  ⇒  (Ax)*(Ax) = 0  ⇒  ||Ax||^2 = 0  ⇒  Ax = 0.

Therefore A*A and A have the same null space, and by the rank-nullity theorem rk(A*A) = rk(A). On the other hand, each column of A*A is a linear combination of the columns of A*, so rk(A*A) ≤ rk(A*). Combining these gives rk(A) ≤ rk(A*). Applying this inequality to A* in place of A yields rk(A*) ≤ rk(A), and therefore rk(A) = rk(A*). QED
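As a quick numerical sanity check of the third proof's two ingredients, the following sketch uses an arbitrary assumed complex matrix and verifies that rk(A) = rk(A*) and that A*A has the same rank as A (since the two share a null space); the matrix entries are purely illustrative.

import numpy as np

# An illustrative complex matrix; the entries are arbitrary.
A = np.array([[1 + 1j, 2 + 0j, 0 + 0j],
              [2 - 1j, 0 + 1j, 3 + 0j]])

AH = A.conj().T                                # the conjugate transpose A*

# rk(A) = rk(A*): both ranks computed numerically.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(AH))       # 2 2

# A*A and A have the same null space (A*Ax = 0 iff Ax = 0), hence the
# same rank by the rank-nullity theorem.
print(np.linalg.matrix_rank(AH @ A), np.linalg.matrix_rank(A))   # 2 2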