1
Advances in Random Matrix Theory: Let there be tools
Alan Edelman, with Brian Sutton, Plamen Koev, Ioana Dumitriu, Raj Rao, and others. MIT: Department of Mathematics, Computer Science and AI Laboratories. World Congress, Bernoulli Society, Barcelona, Spain. Wednesday, July 28, 2004.
2
Tools: so many applications …
Random matrix theory: a catalyst for 21st-century special functions, analytical techniques, and statistical techniques.
In addition to mathematics and papers, we need tools for the novice! Tools for the engineers! Tools for the specialists!
3
Themes of this talk
Tools for general beta. What is beta? Think of it as a measure of (inverse) volatility in “classical” random matrices.
Tools for complicated derived random matrices.
Tools for numerical computation and simulation.
4
Wigner’s Semicircle: the classical & most famous random eigenvalue theorem
Let S = random symmetric Gaussian. MATLAB: A=randn(n); S=(A+A')/2;
S is known as the Gaussian Orthogonal Ensemble (GOE).
The normalized eigenvalue histogram is a semicircle. Precise statements require n → ∞, etc.
n=20; s=30000; d=.05;  % matrix size, samples, bin spacing
e=[];                  % gather up eigenvalues
im=1;                  % imaginary (1) or real (0)
for i=1:s
  a=randn(n)+im*sqrt(-1)*randn(n);
  a=(a+a')/(2*sqrt(2*n*(im+1)));
  v=eig(a)';
  e=[e v];
end
hold off;
[m x]=hist(e,-1.5:d:1.5);
bar(x,m*pi/(2*d*n*s));
axis('square');
hold on;
t=-1:.01:1; plot(t,sqrt(1-t.^2),'r');
5
Reducing a symmetric Gaussian matrix to tridiagonal form: Gaussians G on the diagonal, χ_6, χ_5, χ_4, χ_3, χ_2, χ_1 on the off-diagonal (n = 7 shown).
Same eigenvalue distribution as GOE: O(n) storage!! O(n^2) compute.
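A quick numerical check of this reduction, as a sketch in plain MATLAB (hess returns the Hessenberg form, which is tridiagonal for symmetric input; the size n is illustrative):
n = 6;
A = randn(n); S = (A + A')/2;        % GOE matrix
T = hess(S);                         % symmetric tridiagonal, same spectrum
norm(sort(eig(S)) - sort(eig(T)))    % ~ machine precision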
6
General beta: the same tridiagonal picture, with off-diagonal entries χ_{βk} (here χ_{6β}, χ_{5β}, …, χ_β).
β = 1: reals, β = 2: complexes, β = 4: quaternions. The bidiagonal version corresponds to the Wishart matrices of statistics.
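A minimal sketch of that bidiagonal construction in MATLAB (assuming chi2rnd from the Statistics Toolbox, as in the n = 10^9 code later in the talk; sizes and the degree-of-freedom pattern follow the Dumitriu–Edelman beta-Laguerre model). B*B' then has a general-beta Wishart (Laguerre) eigenvalue law using only O(m) random numbers:
beta = 4; m = 5; n = 8;                    % illustrative sizes
d = sqrt(chi2rnd(beta*(n:-1:n-m+1)))';     % diagonal:    chi_{beta*n}, ..., chi_{beta*(n-m+1)}
s = sqrt(chi2rnd(beta*((m-1):-1:1)))';     % subdiagonal: chi_{beta*(m-1)}, ..., chi_{beta}
B = diag(d) + diag(s,-1);                  % bidiagonal factor
W = B*B';                                  % general-beta Wishart (Laguerre) matrix
eig(W)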
7
Tools
Motivation: A condition number problem
Jack & Hypergeometric of Matrix Argument
MOPS: Ioana Dumitriu’s talk
The Polynomial Method
The tridiagonal numerical 10^9 trick
8
Tools
Motivation: A condition number problem
Jack & Hypergeometric of Matrix Argument
MOPS: Ioana Dumitriu’s talk
The Polynomial Method
The tridiagonal numerical 10^9 trick
9
Numerical Analysis: Condition Numbers
κ(A) = “condition number of A”.
If A = UΣV' is the SVD, then κ(A) = σ_max/σ_min.
Alternatively, κ(A) = (λ_max(A'A)/λ_min(A'A))^(1/2).
One number that measures digits lost in finite precision and general matrix “badness”. Small = good, large = bad.
The condition number of a random matrix???
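The definitions above, checked numerically (a plain-MATLAB sketch; the size is illustrative):
A = randn(5);
s = svd(A);
kappa1 = max(s)/min(s)                          % sigma_max / sigma_min
kappa2 = sqrt(max(eig(A'*A))/min(eig(A'*A)))    % same number via A'A
kappa3 = cond(A)                                % built-in 2-norm condition number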
10
Von Neumann & co. estimates (1947-1951)
“For a ‘random matrix’ of order n the expectation value has been shown to be about n” — Goldstine, von Neumann
“… we choose two different values of κ, namely n and √10·n” — Bargmann, Montgomery, von Neumann
“With a probability ~1, κ < 10n”
In fact: P(κ < n) ≈ 0.02, P(κ < √10·n) ≈ 0.44, P(κ < 10n) ≈ 0.80.
11
Random condition numbers, n → ∞
Distribution of κ/n. Experiment with n = 200.
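A sketch of that experiment in MATLAB (the number of trials is illustrative):
n = 200; trials = 500;
k = zeros(trials,1);
for t = 1:trials
  k(t) = cond(randn(n))/n;     % condition number scaled by n
end
hist(k, 50)                    % heavy right tail, roughly ~ 2/x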
12
Finite n: distributions of κ/n for several matrix sizes, including n = 25 and n = 100, compared with the n → ∞ limit.
13
Condition Number Distributions
Real n × n, n → ∞:     P(κ/n > x) ≈ 2/x
Complex n × n, n → ∞:  P(κ/n > x) ≈ 4/x^2
Generalizations: β (1 = real, 2 = complex); finite matrices; rectangular m × n.
14
Condition Number Distributions
Real n × n, n → ∞: P(κ/n > x) ≈ 2/x.  Complex n × n, n → ∞: P(κ/n > x) ≈ 4/x^2.
Square, n → ∞ (all betas!!): P(κ/n > x) ≈ (2^β β^(β/2−1)/Γ(β)) / x^β.
General formula: P(κ > x) ~ C·µ/x^(β(n−m+1)), where µ is the (β(n−m+1)/2)-th moment of the largest eigenvalue of W_(m−1,n+1)(β) and C is a known geometrical constant.
The density of the largest eigenvalue of W is known in terms of 1F1((β/2)(n+1); (β/2)(n+m−1); −(x/2)I_(m−1)), from which µ is available.
The Tracy–Widom law probably applies for all β for large m, n; Johnstone shows this at least for β = 1, 2.
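A quick Monte Carlo check of the complex tail P(κ/n > x) ≈ 4/x^2 (a sketch; the size, trial count, and threshold are illustrative):
n = 100; trials = 1000; x = 10;
k = zeros(trials,1);
for t = 1:trials
  A = (randn(n) + 1i*randn(n))/sqrt(2);   % complex Gaussian matrix
  k(t) = cond(A)/n;
end
[mean(k > x), 4/x^2]                      % empirical tail vs. predicted 4/x^2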
15
Tools
Motivation: A condition number problem
Jack & Hypergeometric of Matrix Argument
MOPS: Ioana Dumitriu’s talk
The Polynomial Method
The tridiagonal numerical 10^9 trick
16
Multivariate Orthogonal Polynomials & Hypergeometrics of Matrix Argument (Ioana Dumitriu’s talk)
The important special functions of the 21st century.
Begin with a weight w(x) on I:
∫ p_κ(x) p_λ(x) Δ(x)^β ∏_i w(x_i) dx_i = δ_κλ
Jack polynomials: orthogonal for w = 1 on the unit circle. Analogs of x^m.
17
Multivariate Hypergeometric Functions
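For reference, a sketch of the standard definition used in MOPS, written in terms of the Jack polynomials C_κ^(β) and the generalized Pochhammer symbols (a)_κ^(β) (this notation is assumed, not read off the slide image):
pFq^(β)(a_1, …, a_p; b_1, …, b_q; X) = Σ_{k ≥ 0} Σ_{κ ⊢ k} [ (a_1)_κ^(β) ⋯ (a_p)_κ^(β) ] / [ k! (b_1)_κ^(β) ⋯ (b_q)_κ^(β) ] · C_κ^(β)(X)
where the C_κ^(β) are normalized so that Σ_{κ ⊢ k} C_κ^(β)(X) = (trace X)^k.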
18
Multivariate Hypergeometric Functions
19
Wishart (Laguerre) Matrices!
20
Plamen’s clever idea
21
Tools
Motivation: A condition number problem
Jack & Hypergeometric of Matrix Argument
MOPS: Ioana Dumitriu’s talk
The Polynomial Method
The tridiagonal numerical 10^9 trick
22
MOPS (Dumitriu et al.): symbolic
23
Symbolic MOPS applications
A=randn(n); S=(A+A')/2;
trace(S^4)
det(S^3)
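MOPS evaluates expectations of such quantities symbolically; here is a quick Monte Carlo sanity check of the same moments in plain MATLAB (a sketch; the size and trial count are illustrative):
n = 4; trials = 1e5;
t4 = zeros(trials,1); d3 = zeros(trials,1);
for t = 1:trials
  A = randn(n); S = (A + A')/2;     % GOE matrix
  t4(t) = trace(S^4);
  d3(t) = det(S^3);
end
[mean(t4) mean(d3)]                 % compare with the symbolic values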
24
Symbolic MOPS applications
β=3; hist(eig(S))
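For β = 3 there is no full real/complex/quaternion matrix model, so one way to produce such an S numerically is the general-β tridiagonal ensemble described earlier. A plain-MATLAB sketch (assuming chi2rnd from the Statistics Toolbox; sizes are illustrative):
beta = 3; n = 100; trials = 200; e = [];
for t = 1:trials
  d  = sqrt(2)*randn(n,1);                   % diagonal ~ N(0,2)
  od = sqrt(chi2rnd(beta*((n-1):-1:1)))';    % off-diagonal ~ chi_{beta*k}
  H  = (diag(d) + diag(od,1) + diag(od,-1))/sqrt(2);
  e  = [e; eig(H)/sqrt(2*beta*n)];           % rescale so the spectrum fills roughly [-1,1]
end
hist(e, 40)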
25
Smallest eigenvalue statistics
A=randn(m,n); hist(min(svd(A).^2))
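Spelled out over many samples (a sketch; m, n, and the trial count are illustrative):
m = 10; n = 10; trials = 5000;
s = zeros(trials,1);
for t = 1:trials
  A = randn(m,n);
  s(t) = min(svd(A))^2;     % smallest eigenvalue of A'*A
end
hist(s, 50)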
26
Tools
Motivation: A condition number problem
Jack & Hypergeometric of Matrix Argument
MOPS: Ioana Dumitriu’s talk
The Polynomial Method -- Raj!
The tridiagonal numerical 10^9 trick
27
RM Tool – Raj! Courtesy of the Polynomial Method
28
29
Tools
Motivation: A condition number problem
Jack & Hypergeometric of Matrix Argument
MOPS: Ioana Dumitriu’s talk
The Polynomial Method
The tridiagonal numerical 10^9 trick
30
Everyone’s Favorite Tridiagonal
n^2 · tridiag(1, −2, 1)  ≈  d^2/dx^2
(the second-difference matrix with entries 1, −2, 1, scaled by n^2)
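In MATLAB (a sketch): this is the standard second-difference approximation of d^2/dx^2 on [0,1] with zero boundary values, so its eigenvalues approach −(kπ)^2:
n = 200; e = ones(n,1);
L = n^2 * spdiags([e -2*e e], -1:1, n, n);   % n^2 * tridiag(1,-2,1)
lam = sort(eig(full(L)), 'descend');
lam(1:4)                                     % approx -(pi)^2, -(2*pi)^2, -(3*pi)^2, -(4*pi)^2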
31
Everyone’s Favorite Tridiagonal
n^2 · tridiag(1, −2, 1) + (βn)^(1/2) · diag(G)  ≈  d^2/dx^2 + β^(1/2) dW
(add a diagonal of independent Gaussians G, scaled by (βn)^(1/2))
32
Stochastic Operator Limit
H_n ~ the β-Hermite tridiagonal matrix: N(0,2) on the diagonal, χ_(β(n−1)), χ_(β(n−2)), …, χ_β off the diagonal, suitably normalized.
Recentered and rescaled at the spectral edge, H_n converges to the stochastic operator
d^2/dx^2 − x + (2/√β) dW
33
Largest Eigenvalue Plots
34
MATLAB:
beta=1; n=1e9;
opts.disp=0; opts.issym=1;
alpha=10; k=round(alpha*n^(1/3));               % cutoff parameters
d=sqrt(chi2rnd(beta*(n:-1:(n-k-1))))';          % off-diagonal chi entries
H=spdiags(d,1,k,k)+spdiags(randn(k,1),0,k,k);   % top k-by-k block
H=(H+H')/sqrt(4*n*beta);                        % symmetrize and normalize
eigs(H,1,1,opts)                                % largest eigenvalue (near 1)
35
Tricks to get O(n^9) speedup
Sparse matrix storage (only O(n) storage is used).
Tridiagonal ensemble formulas (any beta is available due to the tridiagonal ensemble).
The Lanczos algorithm for eigenvalue computation (this allows the computation of the extreme eigenvalue faster than typical general-purpose eigensolvers).
The shift-and-invert accelerator to Lanczos and Arnoldi (since we know the eigenvalues are near 1, we can accelerate the convergence to the largest eigenvalue).
The ARPACK software package as made available seamlessly in MATLAB (the Arnoldi package contains state-of-the-art data structures and numerical choices).
The observation that if k = 10·n^(1/3), then the largest eigenvalue is determined numerically by the top k × k segment of the n × n matrix (an interesting mathematical statement related to the decay of the Airy function).
36
Random matrix tools!