Linear Independence
In linear algebra, two slightly different notions of linear independence are used: the linear independence of a family of vectors, and the linear independence of a set of vectors.
- A family of vectors is a linearly independent family if none of its members can be written as a linear combination of finitely many other vectors in the family. A family of vectors that is not linearly independent is called linearly dependent.
- A set of vectors is a linearly independent set if the set (regarded as a family indexed by itself) is a linearly independent family.
These two notions are not equivalent: the difference is that a family may contain repeated elements, while a set may not. For example, if V is a vector space and v is a nonzero vector in V, then the family (v, v), which lists v twice, is a linearly dependent family (each member is 1 times the other), but the set of its images is the singleton {v}, which is a linearly independent set.
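The family-versus-set distinction above can be sketched numerically. This is a minimal illustration, assuming a sample nonzero vector v in R^2 (the concrete vector is our choice, not from the text), using the rank criterion: a collection is linearly independent exactly when the rank of the stacked vectors equals the number of vectors.

```python
import numpy as np

# Assumed sample nonzero vector v in R^2 (any nonzero vector works).
v = np.array([1.0, 2.0])

# The family (v, v) lists v twice; stacking its two members gives a
# matrix of rank 1, less than the family's size 2, so the family is
# linearly dependent.
family = np.vstack([v, v])
assert np.linalg.matrix_rank(family) < len(family)

# The set of images is just {v}; a single nonzero vector spans a
# 1-dimensional space, so rank equals the set's size 1 and the set is
# linearly independent.
image_set = np.vstack([v])
assert np.linalg.matrix_rank(image_set) == len(image_set)
```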
Both notions are important and in common use, and they are sometimes even confused in the literature.
For instance, in the three-dimensional real vector space R^3 we have the following example.
Here the first three vectors are linearly independent, but the fourth vector equals 9 times the first plus 5 times the second plus 4 times the third, so the four vectors together are linearly dependent. Linear dependence is a property of the family, not of any particular vector; for example, in this case we could just as well write the first vector as a linear combination of the last three.
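The R^3 example can be checked with the same rank criterion. The original vectors are not shown here, so as an assumption we take the standard basis vectors for the first three and build the fourth from the stated relation v4 = 9·v1 + 5·v2 + 4·v3.

```python
import numpy as np

# Hypothetical choice of vectors: the example's vectors are not given,
# so we assume the standard basis of R^3 for the first three.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0])
# The fourth vector, per the stated relation: 9*v1 + 5*v2 + 4*v3.
v4 = 9 * v1 + 5 * v2 + 4 * v3  # = (9, 5, 4)

# The first three vectors are linearly independent: stacking them gives
# a matrix of full rank 3.
assert np.linalg.matrix_rank(np.vstack([v1, v2, v3])) == 3

# Adding v4 does not increase the rank (3 < 4 vectors), so the four
# vectors together are linearly dependent.
assert np.linalg.matrix_rank(np.vstack([v1, v2, v3, v4])) == 3
```

Note that the dependence is a property of the whole family: deleting any one of the four vectors leaves three vectors that are again linearly independent in this example.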
In probability theory and statistics there is an unrelated measure of linear dependence between random variables.