Consider random variables $X$ and $Y$; to measure how much they change together, we use covariance.

First, we center them at their means, $X - \mathbb{E}[X]$ and $Y - \mathbb{E}[Y]$, and take the expectation of their product. The product is positive when both variables lie on the same side of their means (both above or both below), and negative when they lie on opposite sides. Thus the measure, named covariance, is defined as
$$\operatorname{Cov}(X, Y) = \mathbb{E}\big[(X - \mathbb{E}[X])\,(Y - \mathbb{E}[Y])\big]$$
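As a quick illustration, a sample estimate of this quantity can be computed directly from paired observations. The sketch below uses NumPy; the data and variable names are made up purely for illustration.

```python
import numpy as np

# Illustrative paired samples (made-up data): y tends to rise with x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.7, 5.2, 5.9])

# Center each variable at its sample mean, then average the products.
cov_manual = np.mean((x - x.mean()) * (y - y.mean()))

# np.cov uses the unbiased (n - 1) normalization by default;
# bias=True matches the plain average used above.
cov_numpy = np.cov(x, y, bias=True)[0, 1]

print(cov_manual, cov_numpy)  # the two values agree
```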
For random vectors $\mathbf{X}$ and $\mathbf{Y}$, the covariance is a matrix whose $(i, j)$-th element is the covariance of the $i$-th element of $\mathbf{X}$ and the $j$-th element of $\mathbf{Y}$, i.e. $\operatorname{Cov}(X_i, Y_j)$. Precisely,
$$\operatorname{Cov}(\mathbf{X}, \mathbf{Y}) = \mathbb{E}\big[(\mathbf{X} - \mathbb{E}[\mathbf{X}])\,(\mathbf{Y} - \mathbb{E}[\mathbf{Y}])^{\top}\big]$$
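A minimal sketch of the vector case, here with $\mathbf{Y} = \mathbf{X}$ so that the $(i, j)$ entry is $\operatorname{Cov}(X_i, X_j)$; the samples and the dependence injected between components are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1000 samples of a 3-dimensional random vector, stacked as rows (made up).
X = rng.normal(size=(1000, 3))
X[:, 2] += 0.8 * X[:, 0]          # inject some dependence between components

# Manual version of the definition: average of centered outer products.
Xc = X - X.mean(axis=0)           # center each component at its sample mean
cov_manual = Xc.T @ Xc / len(X)

# np.cov treats rows as variables by default; rowvar=False handles
# row-stacked samples, and bias=True matches the plain averaging above.
cov_numpy = np.cov(X, rowvar=False, bias=True)

print(np.allclose(cov_manual, cov_numpy))  # True
```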
Covariance has the property that $\operatorname{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y]$.
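Expanding the product inside the expectation and using linearity of expectation makes this identity immediate:

$$
\begin{aligned}
\operatorname{Cov}(X, Y)
&= \mathbb{E}\big[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\big] \\
&= \mathbb{E}\big[XY - X\,\mathbb{E}[Y] - Y\,\mathbb{E}[X] + \mathbb{E}[X]\,\mathbb{E}[Y]\big] \\
&= \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y] - \mathbb{E}[X]\,\mathbb{E}[Y] + \mathbb{E}[X]\,\mathbb{E}[Y] \\
&= \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y].
\end{aligned}
$$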

The magnitude of covariance by itself does not carry much interpretation. Thus we usually normalize the covariance into $[-1, 1]$ and obtain Pearson's correlation coefficient:
$$\rho_{X, Y} = \frac{\operatorname{Cov}(X, Y)}{\sigma_X \, \sigma_Y}$$
Pearson's coefficient measures the strength of linear dependence between the random variables $X$ and $Y$.
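A short sketch of estimating this coefficient from samples; again the data are made up, and `np.corrcoef` is used only as a cross-check of the normalized covariance.

```python
import numpy as np

# Made-up paired samples with a roughly linear relationship.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.7, 5.2, 5.9])

# Pearson's coefficient: covariance normalized by the two standard deviations.
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
rho_manual = cov_xy / (x.std() * y.std())

# np.corrcoef returns the full correlation matrix; the off-diagonal entry is rho.
rho_numpy = np.corrcoef(x, y)[0, 1]

print(rho_manual, rho_numpy)  # both close to 1, reflecting strong linear dependence
```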