Linear Independence - Linear Dependence Between Random Variables

Covariance is sometimes described as a measure of "linear dependence" between two random variables, although the term does not mean the same thing as it does in linear algebra. Normalizing the covariance by the product of the two standard deviations yields the Pearson correlation coefficient (normalizing a covariance matrix in this way yields the correlation matrix). The Pearson coefficient measures the goodness of fit of the best possible linear function describing the relation between the variables. In this sense, covariance is a linear gauge of dependence.
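As a minimal sketch of the normalization described above, the following pure-Python example (the function names `covariance` and `pearson_r` are illustrative, not from any particular library) computes the covariance of two samples and divides it by the product of their standard deviations; for data related by an exact linear function, the resulting coefficient is 1 up to rounding.

```python
import math

def covariance(xs, ys):
    # Population covariance: average product of deviations from the means.
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

def pearson_r(xs, ys):
    # Normalize covariance by the two standard deviations
    # (a variable's covariance with itself is its variance).
    sx = math.sqrt(covariance(xs, xs))
    sy = math.sqrt(covariance(ys, ys))
    return covariance(xs, ys) / (sx * sy)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0 * x + 1.0 for x in xs]  # exact linear relation y = 2x + 1
print(pearson_r(xs, ys))          # close to 1.0: a perfect linear fit
```

A coefficient near +1 or -1 indicates that a linear function fits the data well; values near 0 indicate no linear relationship (though a nonlinear dependence may still exist).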

