Plot of the Marchenko–Pastur distribution for various values of λ

In the mathematical theory of random matrices, the Marchenko–Pastur distribution, or Marchenko–Pastur law, describes the asymptotic behavior of singular values of large rectangular random matrices. The theorem is named after Soviet Ukrainian mathematicians Volodymyr Marchenko and Leonid Pastur who proved this result in 1967.

If $X$ denotes an $m \times n$ random matrix whose entries are independent identically distributed random variables with mean 0 and variance $\sigma^2 < \infty$, let

$$Y_n = \frac{1}{n} X X^T$$

and let $\lambda_1, \lambda_2, \dots, \lambda_m$ be the eigenvalues of $Y_n$ (viewed as random variables). Finally, consider the random measure

$$\mu_m(A) = \frac{1}{m} \#\{ \lambda_j \in A \}, \quad A \subset \mathbb{R},$$

counting the number of eigenvalues in the subset $A$ included in $\mathbb{R}$.

Theorem. Assume that $m, n \to \infty$ so that the ratio $m/n \to \lambda \in (0, +\infty)$. Then $\mu_m \to \mu$ (in weak* topology in distribution), where

$$\mu(A) = \begin{cases} \left(1 - \frac{1}{\lambda}\right) \mathbf{1}_{0 \in A} + \nu(A), & \text{if } \lambda > 1, \\ \nu(A), & \text{if } 0 \le \lambda \le 1, \end{cases}$$

and

$$d\nu(x) = \frac{1}{2 \pi \sigma^2} \, \frac{\sqrt{(\lambda_+ - x)(x - \lambda_-)}}{\lambda x} \, \mathbf{1}_{x \in [\lambda_-, \lambda_+]} \, dx$$

with

$$\lambda_{\pm} = \sigma^2 \left(1 \pm \sqrt{\lambda}\right)^2.$$
The Marchenko–Pastur law also arises as the free Poisson law in free probability theory, having rate $1/\lambda$ and jump size $\lambda \sigma^2$.
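A minimal numerical sketch of the theorem, assuming NumPy and illustrative values $m = 1000$, $n = 2000$ (so $\lambda = 0.5$) with standard Gaussian entries (parameters not taken from the article): the empirical spectral measure of $Y_n = \tfrac{1}{n} X X^T$ is compared against the limiting density above.

```python
import numpy as np

# Illustrative sketch (not from the article): m x n Gaussian matrix with ratio lambda = m/n = 0.5
m, n, sigma = 1000, 2000, 1.0
lam = m / n

rng = np.random.default_rng(0)
X = rng.normal(0.0, sigma, size=(m, n))    # i.i.d. entries with mean 0, variance sigma^2
Y = X @ X.T / n                            # Y_n = (1/n) X X^T
eigs = np.linalg.eigvalsh(Y)               # eigenvalues of Y_n

# Limiting Marchenko-Pastur density on [lambda_-, lambda_+]
lam_minus = sigma**2 * (1 - np.sqrt(lam))**2
lam_plus = sigma**2 * (1 + np.sqrt(lam))**2

def mp_density(x):
    out = np.zeros_like(x)
    inside = (x > lam_minus) & (x < lam_plus)
    out[inside] = np.sqrt((lam_plus - x[inside]) * (x[inside] - lam_minus)) / (
        2 * np.pi * sigma**2 * lam * x[inside])
    return out

# Histogram of the empirical spectral measure mu_m vs. the limiting density
hist, edges = np.histogram(eigs, bins=50, density=True)
centers = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - mp_density(centers))))   # small, and shrinking as m, n grow
```

For $\lambda \le 1$ there is no atom at zero, so the histogram of all $m$ eigenvalues can be compared directly with the density; the maximum deviation shrinks as $m$ and $n$ grow at a fixed ratio.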

Moments

For each $k \ge 1$, its $k$-th moment is[1]

$$\int x^k \, d\nu(x) = \sigma^{2k} \sum_{r=0}^{k-1} \frac{1}{r+1} \binom{k}{r} \binom{k-1}{r} \lambda^r.$$
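As a sketch (the parameter values $\lambda = 0.5$, $\sigma = 1$ are arbitrary assumptions), the moment formula can be cross-checked against direct numerical integration of the density $d\nu$:

```python
import numpy as np
from math import comb

# Illustrative parameters (assumptions): lambda = 0.5, sigma = 1
lam, sigma = 0.5, 1.0
lam_minus = sigma**2 * (1 - np.sqrt(lam))**2
lam_plus = sigma**2 * (1 + np.sqrt(lam))**2

def mp_moment(k):
    # sigma^(2k) * sum_{r=0}^{k-1} lambda^r / (r+1) * C(k, r) * C(k-1, r)
    return sigma**(2 * k) * sum(
        lam**r / (r + 1) * comb(k, r) * comb(k - 1, r) for r in range(k))

# Cross-check against a Riemann sum of x^k d(nu)(x) over the support
x = np.linspace(lam_minus, lam_plus, 200001)[1:-1]
dens = np.sqrt((lam_plus - x) * (x - lam_minus)) / (2 * np.pi * sigma**2 * lam * x)
dx = x[1] - x[0]
for k in range(1, 5):
    print(k, mp_moment(k), np.sum(x**k * dens) * dx)
```

For $\lambda = 1$ and $\sigma = 1$ the moments reduce to the Catalan numbers, consistent with the relation to the squared semicircle law.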

Some transforms of this law

The Stieltjes transform is given by

$$s(z) = \frac{\sigma^2 (1 - \lambda) - z + \sqrt{\left(z - \sigma^2 (\lambda + 1)\right)^2 - 4 \lambda \sigma^4}}{2 \lambda z \sigma^2}$$

for complex numbers z of positive imaginary part, where the complex square root is also taken to have positive imaginary part.[2] It satisfies the quadratic equation

$$\lambda \sigma^2 z \, s(z)^2 + \left(z - \sigma^2 (1 - \lambda)\right) s(z) + 1 = 0.$$

The Stieltjes transform can be repackaged in the form of the R-transform, which is given by[3]

$$R(z) = \frac{\sigma^2}{1 - \sigma^2 \lambda z}.$$

The S-transform is given by[3]

$$S(z) = \frac{1}{\sigma^2 (1 + \lambda z)}.$$
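A hedged numerical check, assuming NumPy and illustrative values $m = 1500$, $n = 3000$, $z = 1 + 0.5i$ (none of these taken from the article): the closed-form Stieltjes transform should approximate the empirical resolvent trace $\tfrac{1}{m} \operatorname{tr}(Y_n - zI)^{-1}$ and satisfy the quadratic equation above.

```python
import numpy as np

# Illustrative check (m, n, sigma and z are assumptions, not from the article)
m, n, sigma = 1500, 3000, 1.0
lam = m / n
z = 1.0 + 0.5j                              # a point in the upper half plane

rng = np.random.default_rng(2)
X = rng.normal(0.0, sigma, size=(m, n))
Y = X @ X.T / n

# Empirical Stieltjes transform: (1/m) tr (Y - z I)^(-1)
s_empirical = np.trace(np.linalg.inv(Y - z * np.eye(m))) / m

# Closed form, with the complex square root chosen to have positive imaginary part
root = np.sqrt((z - sigma**2 * (lam + 1))**2 - 4 * lam * sigma**4)
if root.imag < 0:
    root = -root
s_closed = (sigma**2 * (1 - lam) - z + root) / (2 * lam * z * sigma**2)

print(s_empirical, s_closed)                # approximately equal for large m, n
# The closed form satisfies the quadratic equation above (result is numerically ~ 0):
print(lam * sigma**2 * z * s_closed**2 + (z - sigma**2 * (1 - lam)) * s_closed + 1)
```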

For the case of $\sigma^2 = 1$, the $\eta$-transform[3] is given by

$$\eta(\gamma) = \operatorname{E}\!\left[\frac{1}{1 + \gamma S}\right] = 1 - \frac{\mathcal{F}(\gamma, \lambda)}{4 \lambda \gamma},$$

where $S$ satisfies the Marchenko–Pastur law and

$$\mathcal{F}(\gamma, \lambda) = \left( \sqrt{\gamma \left(1 + \sqrt{\lambda}\right)^2 + 1} - \sqrt{\gamma \left(1 - \sqrt{\lambda}\right)^2 + 1} \right)^2.$$
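A small Monte Carlo sanity check of the $\eta$-transform expression, with illustrative values $m = 1000$, $n = 2000$, $\gamma = 2.5$ (assumptions, not from the source):

```python
import numpy as np

# Monte Carlo sanity check of the eta-transform closed form for sigma^2 = 1
# (m, n and gamma are illustrative assumptions)
m, n, gamma = 1000, 2000, 2.5
lam = m / n

rng = np.random.default_rng(1)
X = rng.normal(size=(m, n))
eigs = np.linalg.eigvalsh(X @ X.T / n)

# Empirical eta-transform: E[1 / (1 + gamma * S)] over the eigenvalues
eta_empirical = np.mean(1.0 / (1.0 + gamma * eigs))

# Closed form via F(gamma, lambda)
F = (np.sqrt(gamma * (1 + np.sqrt(lam))**2 + 1)
     - np.sqrt(gamma * (1 - np.sqrt(lam))**2 + 1))**2
eta_closed = 1 - F / (4 * lam * gamma)

print(eta_empirical, eta_closed)            # close for large m, n
```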

For exact analysis of high-dimensional regression in the proportional asymptotic regime, a convenient closed form of the Stieltjes transform is often used.

Certain functions of a random variable $S$ satisfying the Marchenko–Pastur law appear in the limiting bias and variance, respectively, of ridge regression and other regularized linear regression problems.
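The specific functions referenced above are not reproduced here. As an illustrative sketch only (the choice of functionals, parameter values, and variable names are assumptions), resolvent-type averages such as $\operatorname{E}[1/(S+z)]$, which equals the Stieltjes transform evaluated at $-z$, can be evaluated under the Marchenko–Pastur law by numerical integration:

```python
import numpy as np

# Illustrative sketch only: the specific functions referenced in the text are not reproduced
# here. This evaluates resolvent-type averages under the Marchenko-Pastur law by numerical
# integration; lam, sigma, z and the chosen functionals are assumptions.
lam, sigma, z = 0.5, 1.0, 0.1               # aspect ratio, scale, ridge-style penalty
lam_minus = sigma**2 * (1 - np.sqrt(lam))**2
lam_plus = sigma**2 * (1 + np.sqrt(lam))**2

# For lambda <= 1 the density nu is the full law, so these integrals are full expectations
x = np.linspace(lam_minus, lam_plus, 200001)[1:-1]
dens = np.sqrt((lam_plus - x) * (x - lam_minus)) / (2 * np.pi * sigma**2 * lam * x)
dx = x[1] - x[0]

resolvent_avg = np.sum(dens / (x + z)) * dx          # E[1/(S + z)], i.e. s(-z)
weighted_avg = np.sum(dens * x / (x + z)**2) * dx    # E[S/(S + z)^2], another common functional

# Cross-check E[1/(S + z)] against the closed-form Stieltjes transform at -z;
# below the support, the negative square-root branch is the correct continuation.
zp = -z
root = np.sqrt((zp - sigma**2 * (lam + 1))**2 - 4 * lam * sigma**4)
s_closed = (sigma**2 * (1 - lam) - zp - root) / (2 * lam * zp * sigma**2)
print(resolvent_avg, s_closed)
print(weighted_avg)
```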

Application to correlation matrices

For the special case of correlation matrices, we know that $\sigma^2 = 1$ and $\lambda = m/n$. This bounds the probability mass over the interval defined by

$$\lambda_{\pm} = \left(1 \pm \sqrt{\frac{m}{n}}\right)^2.$$

Since this distribution describes the spectrum of random matrices with mean 0, the eigenvalues of correlation matrices that fall inside the aforementioned interval could be considered spurious or noise. For instance, obtaining a correlation matrix of 10 stock returns calculated over a period of 252 trading days would render

$$\lambda_+ = \left(1 + \sqrt{\frac{10}{252}}\right)^2 \approx 1.43.$$

Thus, out of 10 eigenvalues of said correlation matrix, only the values higher than 1.43 would be considered significantly different from random.
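A short sketch reproducing the worked example above (the use of NumPy, the random seed, and the synthetic pure-noise "returns" are assumptions):

```python
import numpy as np

# Worked example: 10 assets over 252 trading days (sigma^2 = 1 for correlation matrices).
# The synthetic "returns" below are pure noise by construction.
m, n = 10, 252
lam_plus = (1 + np.sqrt(m / n))**2
print(lam_plus)                              # about 1.43, the threshold quoted above

rng = np.random.default_rng(3)
returns = rng.normal(size=(n, m))            # n observations of m uncorrelated series
corr = np.corrcoef(returns, rowvar=False)    # 10 x 10 sample correlation matrix
eigs = np.linalg.eigvalsh(corr)
print(eigs.max(), lam_plus)                  # noise eigenvalues concentrate below lambda_+
```

Eigenvalues of genuinely correlated returns that exceed $\lambda_+$ would then be interpreted as carrying signal rather than noise.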

References

  1. Bai & Silverstein 2010, Section 3.1.1.
  2. Bai & Silverstein 2010, Section 3.3.1.
  3. Tulino & Verdú 2004, Section 2.2.