Random matrix products: Universality and least singular values

Sean O’Rourke

University of Colorado Boulder

August 27, 2018

Supported in part by NSF grants ECCS-1610003 and DMS-1810500.

Joint work with...

Natalie Coston (University of Colorado Boulder)

Phil Kopel (University of Colorado Boulder)

Van Vu (Yale University)

Introduction

Consider G drawn from the complex Ginibre ensemble:

- G_ij are independent and identically distributed (iid) random variables,

- each G_ij has the standard complex Gaussian distribution.

G has density

X \mapsto \frac{1}{\pi^{n^2}} e^{-\operatorname{tr}(XX^*)}.

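As a concrete reference point, here is a minimal numerical sketch (my own illustration, not part of the talk) of sampling from the complex Ginibre ensemble; the helper name complex_ginibre and the size n = 500 are arbitrary illustrative choices.

import numpy as np

def complex_ginibre(n, rng=None):
    """n x n matrix of iid standard complex Gaussian entries: Re, Im ~ N(0, 1/2), independent."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)

G = complex_ginibre(500)
print(np.mean(np.abs(G) ** 2))              # ~ 1: each entry has unit variance
eigs = np.linalg.eigvals(G)                 # the eigenvalues fill the disk of radius ~ sqrt(n)
print(np.max(np.abs(eigs)) / np.sqrt(500))  # ~ 1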

Eigenvalues of the Ginibre ensemble

The eigenvalues λ_1(G), ..., λ_n(G) ∈ C of G have joint density

\varphi_n(z_1, \dots, z_n) = \frac{1}{Z_n} \exp\Bigl(-\sum_{k=1}^{n} |z_k|^2\Bigr) \prod_{1 \le i < j \le n} |z_i - z_j|^2.

The corresponding k-point correlation function

\varphi_{n,k}(z_1, \dots, z_k) = \int_{\mathbb{C}^{n-k}} \varphi_n(z_1, \dots, z_n)\, d^2 z_{k+1} \cdots d^2 z_n

satisfies

\varphi_{n,k}(z_1, \dots, z_k) = \frac{(n-k)!}{n!}\, \gamma(z_1) \cdots \gamma(z_k)\, \det\bigl[K(z_i, z_j)\bigr]_{1 \le i, j \le k},

where

K(z, w) := \sum_{l=0}^{n-1} \frac{(z \bar{w})^l}{l!}, \qquad \gamma(z) := \frac{1}{\pi} e^{-|z|^2}.

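On the diagonal, γ(z) K(z, z) = n φ_{n,1}(z) is the expected number of eigenvalues per unit area. A quick numerical illustration (my own, with n = 100 an arbitrary choice): this intensity is ≈ 1/π inside the disk of radius √n and essentially 0 outside.

import numpy as np

def eigenvalue_intensity(z, n):
    """gamma(z) * K(z, z) = (1/pi) e^{-|z|^2} sum_{l=0}^{n-1} |z|^(2l) / l!"""
    x = abs(z) ** 2
    total, term = 0.0, 1.0                 # term = x^l / l!, starting at l = 0
    for l in range(n):
        total += term
        term *= x / (l + 1)
    return np.exp(-x) * total / np.pi

n = 100
for r in (0.5, 0.9, 1.2):
    z = r * np.sqrt(n)
    print(f"|z| = {z:6.2f}   intensity = {eigenvalue_intensity(z, n):.4f}   1/pi = {1 / np.pi:.4f}")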

Products of Ginibre matrices

Consider the product G_1 ⋯ G_M of M independent Ginibre matrices. The eigenvalues again form a determinantal point process with kernel

K(z, w) := \sum_{l=0}^{n-1} \frac{(z \bar{w})^l}{(l!)^M}

and weight function w_M(z), which depends only on |z| and can be expressed in terms of the Meijer G-function [G. Akemann, Z. Burda, 2012].

iid random matrices

Definition (iid matrix)

Let ξ be a subgaussian random variable with mean zero, unit variance, and independent real and imaginary parts. An n × n matrix X is an iid matrix with atom variable ξ if the entries of X are iid copies of ξ.

Example

The Bernoulli iid matrix is defined as an iid matrix with atom variable

\xi = \begin{cases} 1 & \text{with probability } 1/2, \\ -1 & \text{with probability } 1/2. \end{cases}

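For concreteness, a minimal sketch (my own, not from the slides) of a Bernoulli iid matrix; the helper name bernoulli_iid_matrix and the size are illustrative.

import numpy as np

def bernoulli_iid_matrix(n, rng=None):
    """n x n matrix with iid entries equal to +1 or -1, each with probability 1/2."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.choice([-1.0, 1.0], size=(n, n))

X = bernoulli_iid_matrix(1000)
print(X.mean(), (X ** 2).mean())   # ~ 0 and exactly 1: mean zero, unit variance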

Eigenvalues of products of iid random matrices

Theorem (M-fold circular law)

The empirical spectral distribution of the product P = n^{-M/2} X_1 ⋯ X_M of M independent n × n iid matrices converges almost surely to the nonrandom measure supported on the unit disk |z| < 1 with density

z \mapsto \frac{1}{M\pi} |z|^{2/M - 2}.

[Figure: scatter plot of the eigenvalues of such a product, filling the unit disk; both axes run from −1 to 1.]

F. Götze, A. Tikhomirov

O., A. Soshnikov

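A minimal simulation sketch (my own illustration, with n, M, and the Bernoulli atom chosen arbitrarily): under the M-fold circular law the radial CDF of the limit is F(r) = r^{2/M} on [0, 1], so |λ|^{2/M} should be approximately uniform on [0, 1].

import numpy as np

rng = np.random.default_rng(0)
n, M = 400, 3
X = [rng.choice([-1.0, 1.0], size=(n, n)) for _ in range(M)]   # Bernoulli iid matrices

P = np.linalg.multi_dot(X) / n ** (M / 2)
lam = np.linalg.eigvals(P)

# |lambda|^(2/M) ~ Uniform(0, 1) under the limit law.
u = np.sort(np.abs(lam) ** (2 / M))
print(np.max(np.abs(u - (np.arange(1, n + 1) - 0.5) / n)))     # small KS-type discrepancy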

Eigenvalues of products of iid random matrices

Theorem (Local law; Nemish)

Let P = n^{-M/2} X_1 ⋯ X_M be the product of M independent n × n iid matrices. Let f : C → C be a fixed smooth function with compact support. Then for any ε > 0, d ∈ (0, 1/2], and any z_0 ∈ C with |z_0| ≥ ε and |1 − |z_0|| ≥ ε,

\left| \frac{1}{n} \sum_{j=1}^{n} f_{z_0}(\lambda_j(P)) - \frac{1}{M\pi} \int_{|z|<1} f_{z_0}(z)\, |z|^{2/M-2}\, d^2 z \right| \le n^{-1+2d+\varepsilon}\, \|\Delta f\|_{L^1}

with overwhelming probability, where f_{z_0} is the n^{-d}-rescaling of f around z_0:

f_{z_0}(z) = n^{2d} f\bigl(n^d (z - z_0)\bigr).

This result generalizes work of Bourgade, Yau, and Yin for M = 1. It has recently been generalized by F. Götze, A. A. Naumov, A. N. Tikhomirov.

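A rough numerical illustration (my own, and cruder than the theorem: it counts eigenvalues in a small disk instead of integrating a smooth f): the number of eigenvalues of P within distance n^{-d} of a bulk point z_0 should be close to n^{1-2d} |z_0|^{2/M-2} / M. All parameter choices below are illustrative.

import numpy as np

rng = np.random.default_rng(1)
n, M, d = 500, 2, 0.25
z0 = 0.6 + 0.0j

X = [(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
     for _ in range(M)]
P = np.linalg.multi_dot(X) / n ** (M / 2)
lam = np.linalg.eigvals(P)

radius = n ** (-d)
observed = np.sum(np.abs(lam - z0) < radius)
predicted = n * (1 / (M * np.pi)) * abs(z0) ** (2 / M - 2) * np.pi * radius ** 2
print(observed, predicted)      # comparable, up to local fluctuations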

Main results

With Coston, Kopel, and Vu, we consider the linear statistics and local correlation functions for the eigenvalues of the product P = n^{-M/2} X_1 ⋯ X_M.

For the linear statistics, one considers the fluctuations of

\sum_{j=1}^{n} f(\lambda_j(P)).

In this case,

\operatorname{Var}\Bigl( \sum_{j=1}^{n} f(\lambda_j(P)) \Bigr) = O(1).

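A small Monte Carlo sketch (my own) of the O(1) variance: the sample variance of Σ_j f(λ_j(P)) stays bounded as n grows. Here f(z) = Re(z^2), complex Gaussian entries, and M = 2 are illustrative choices.

import numpy as np

def linear_statistic_variance(n, M, trials, rng):
    vals = []
    for _ in range(trials):
        X = [(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
             for _ in range(M)]
        P = np.linalg.multi_dot(X) / n ** (M / 2)
        lam = np.linalg.eigvals(P)
        vals.append(np.sum(np.real(lam ** 2)))          # f(z) = Re(z^2)
    return np.var(vals)

rng = np.random.default_rng(2)
for n in (100, 200, 300):
    print(n, linear_statistic_variance(n, M=2, trials=100, rng=rng))   # does not grow with n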

Linear statistics: A generalization of Rider and Virág

Theorem (Kopel, O., Vu)

Let f : C → R be a function with continuous partial derivatives and at most polynomial growth at infinity. Let P = n^{-M/2} G_1 ⋯ G_M be the product of M independent Ginibre matrices. Then

\sum_{j=1}^{n} f(\lambda_j(P)) - \mathbb{E} \sum_{j=1}^{n} f(\lambda_j(P))

converges in distribution to the mean zero normal distribution with variance

\frac{1}{4\pi} \int_{|z|<1} |\nabla f(z)|^2\, d^2 z + \frac{1}{2} \|f\|_{H^{1/2}}^2,

where

\|f\|_{H^{1/2}}^2 = \sum_{k \in \mathbb{Z}} |k|\, |\hat{f}(k)|^2, \qquad \hat{f}(k) = \frac{1}{2\pi} \int_0^{2\pi} f(e^{i\theta}) e^{-ik\theta}\, d\theta.
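As a sanity check on this formula (my own sketch, not from the talk): for f(z) = Re(z^2) the gradient term is 1/2 and the H^{1/2} term is 1/2, so the limiting variance is 1. The code recovers both terms numerically with a grid integral and an FFT of boundary samples; grid sizes are arbitrary.

import numpy as np

f = lambda z: np.real(z ** 2)

# Gradient term: (1/4 pi) * integral over |z| < 1 of |grad f|^2, on a midpoint grid.
h = 2.0 / 1000
x = np.arange(-1, 1, h) + h / 2
X, Y = np.meshgrid(x, x)
Z = X + 1j * Y
inside = np.abs(Z) < 1
grad_sq = 4.0 * np.abs(Z) ** 2                          # |grad Re(z^2)|^2 = 4 |z|^2
term1 = np.sum(grad_sq[inside]) * h * h / (4 * np.pi)

# H^{1/2} term: (1/2) * sum over k of |k| |fhat(k)|^2, from samples of f on the unit circle.
m = 4096
theta = 2 * np.pi * np.arange(m) / m
fhat = np.fft.fft(f(np.exp(1j * theta))) / m            # Fourier coefficients fhat(k)
k = np.fft.fftfreq(m, d=1.0 / m)                        # signed integer frequencies
term2 = 0.5 * np.sum(np.abs(k) * np.abs(fhat) ** 2)

print(term1, term2, term1 + term2)                      # ~ 0.5, 0.5, 1.0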

Universality

The ubiquitous universality phenomenon in random matrix theory asserts that we should expect the same result for the product of iid matrices.

Definition (Moment matching)

We say two n × n iid matrices X and X′ match moments to order k if, for all 1 ≤ i, j ≤ n and all a, b ≥ 0 with a + b ≤ k,

\mathbb{E}\bigl[\operatorname{Re}(X_{ij})^a \operatorname{Im}(X_{ij})^b\bigr] = \mathbb{E}\bigl[\operatorname{Re}(X'_{ij})^a \operatorname{Im}(X'_{ij})^b\bigr].

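A small helper (my own, purely illustrative) that checks the moment-matching condition for two candidate atom variables by Monte Carlo. Here it compares the standard complex Gaussian with the symmetrized complex Bernoulli atom (±1 ± i)/√2, which matches it to third order but not to fourth.

import numpy as np

def mixed_moments(samples, k):
    """Monte Carlo estimates of E[Re(xi)^a Im(xi)^b] for all a + b <= k."""
    re, im = samples.real, samples.imag
    return {(a, b): np.mean(re ** a * im ** b)
            for a in range(k + 1) for b in range(k + 1 - a)}

rng = np.random.default_rng(3)
N = 2_000_000
gauss = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
bern = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)

mg, mb = mixed_moments(gauss, 4), mixed_moments(bern, 4)
for key in sorted(mg):
    print(key, round(mg[key], 3), round(mb[key], 3))   # agree up to order 3, differ at (4,0) and (0,4)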

Linear statistics

Theorem (Kopel, O., Vu)

Let f : C → R be a function with at least two continuous derivatives, supported in the spectral bulk {z ∈ C : ε < |z| < 1 − ε} for some fixed ε > 0. Let P = n^{-M/2} X_1 ⋯ X_M be the product of M independent n × n iid matrices which match moments to fourth order with the complex Ginibre ensemble. Then

\sum_{j=1}^{n} f(\lambda_j(P)) - \mathbb{E} \sum_{j=1}^{n} f(\lambda_j(P))

converges in distribution to the mean zero normal distribution with variance

\frac{1}{4\pi} \int_{|z|<1} |\nabla f(z)|^2\, d^2 z + \frac{1}{2} \|f\|_{H^{1/2}}^2.

Linear statistics

Theorem (Coston, O.)

Let f be analytic in some neighborhood containing the disk {z ∈ C : |z| ≤ 1 + δ} and bounded otherwise. Let P = n^{-M/2} X_1 ⋯ X_M be the product of M independent n × n iid matrices which match moments to second order with the complex Ginibre ensemble. Then

\sum_{j=1}^{n} f(\lambda_j(P)) - \mathbb{E} \sum_{j=1}^{n} f(\lambda_j(P))

converges to a mean-zero Gaussian random variable F(f) with

\mathbb{E}\bigl[F(f)\,\overline{F(f)}\bigr] = \frac{1}{\pi} \int_{|z|<1} \left| \frac{d}{dz} f(z) \right|^2 d^2 z.

Correlation functions

Let λ_1, ..., λ_N be random points in the complex plane. Define the k-point correlation function φ_{N,k} by

\mathbb{E} \sum_{i_1, \dots, i_k \ \text{distinct}} g(\lambda_{i_1}, \dots, \lambda_{i_k}) = \int_{\mathbb{C}^k} g(z_1, \dots, z_k)\, \varphi_{N,k}(z_1, \dots, z_k)\, d^2 z_1 \cdots d^2 z_k

for any continuous, compactly supported test function g : C^k → C.

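A brute-force illustration of this definition (my own sketch): estimate the left-hand side for k = 2 by averaging the sum over distinct pairs across Monte Carlo samples of a rescaled Ginibre spectrum. The test function g and the sizes are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(4)
N, trials = 60, 300
g = lambda z, w: np.exp(-np.abs(z) ** 2 - np.abs(w) ** 2)     # illustrative test function

total = 0.0
for _ in range(trials):
    G = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
    lam = np.linalg.eigvals(G) / np.sqrt(N)                   # rescaled to the unit disk
    vals = g(lam[:, None], lam[None, :])                      # all pairs g(lambda_i, lambda_j)
    total += vals.sum() - np.trace(vals)                      # keep only distinct pairs i != j
print(total / trials)      # Monte Carlo estimate of E sum_{i != j} g(lambda_i, lambda_j)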

Correlation functions

For the eigenvalues of products of independent Ginibre matrices, the microscopic correlation functions, away from the origin, converge to the same limits as those for a single Ginibre matrix (M = 1) [G. Akemann, Z. Burda, 2012].

In the spectral bulk, away from the origin, the asymptotic kernel is given by

K_{\mathrm{bulk}}(z, w) := \frac{1}{\pi} \left( \frac{z \bar{w}}{|z w|} \right)^{(1-M)/2} \exp\Bigl( -\tfrac{1}{2}\bigl( |z|^2 + |w|^2 \bigr) + z \bar{w} \Bigr).

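A quick evaluation of this kernel (my own sketch, with M = 3 and the base point chosen arbitrarily): the two-point correlation det[K_bulk(z_i, z_j)] vanishes as the two points approach each other, exhibiting Ginibre-type repulsion; the phase prefactor cancels in the determinant.

import numpy as np

def K_bulk(z, w, M):
    phase = (z * np.conj(w) / abs(z * w)) ** ((1 - M) / 2)
    return phase * np.exp(-0.5 * (abs(z) ** 2 + abs(w) ** 2) + z * np.conj(w)) / np.pi

M, z = 3, 1.0 + 0.5j
for eps in (1.0, 0.3, 0.1, 0.01):
    w = z + eps
    rho2 = np.real(K_bulk(z, z, M) * K_bulk(w, w, M) - K_bulk(z, w, M) * K_bulk(w, z, M))
    print(eps, rho2)    # tends to 0 as eps -> 0; ~ (1/pi)^2 for well-separated points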

M-th root eigenvalue process

Let P = X_1 ⋯ X_M be the product of M independent n × n iid matrices. We define the M-th root eigenvalue process associated to P as the point process on C consisting of the Mn roots (counted with algebraic multiplicity) of the polynomial z ↦ det(z^M I − P).
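Concretely (my own sketch, with illustrative sizes), the M-th root eigenvalue process can be generated directly from the eigenvalues of P: each eigenvalue λ contributes its M complex M-th roots.

import numpy as np

def mth_root_process(P, M):
    """All Mn roots of z^M = lambda, over the eigenvalues lambda of P."""
    lam = np.linalg.eigvals(P).astype(complex)
    base = lam ** (1.0 / M)                                   # principal M-th root of each eigenvalue
    twists = np.exp(2j * np.pi * np.arange(M) / M)            # the M roots of unity
    return (base[:, None] * twists[None, :]).ravel()

rng = np.random.default_rng(5)
n, M = 200, 3
P = np.linalg.multi_dot([rng.standard_normal((n, n)) for _ in range(M)])
print(mth_root_process(P, M).shape)       # (Mn,) = (600,)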

Universality

Theorem (Kopel, O., Vu)

Let P = X_1 ⋯ X_M and Q = Y_1 ⋯ Y_M be products of M independent n × n iid matrices, and assume that X_i and Y_i match moments to fourth order. Let z_1, ..., z_k ∈ C with ε < n^{-1/2} |z_i| < 1 − ε for some fixed ε > 0. Let φ_{P,k}, φ_{Q,k} be the k-point correlation functions of the M-th root eigenvalue processes associated with P, Q. Then for any smooth, compactly supported g : C^k → C,

\left| \int_{\mathbb{C}^k} g(w_1, \dots, w_k)\, (\varphi_{P,k} - \varphi_{Q,k})(z_1 + w_1, \dots, z_k + w_k)\, d^2 w_1 \cdots d^2 w_k \right| \le C_{\varepsilon, k, g}\, n^{-c_0}

for some c_0 > 0.

Idea of the proof: Step 1: Linearization

Consider the Mn × Mn block matrix

\mathcal{Y} := \frac{1}{\sqrt{n}} \begin{pmatrix} 0 & X_1 & 0 & \cdots & 0 \\ 0 & 0 & X_2 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & X_{M-1} \\ X_M & 0 & \cdots & 0 & 0 \end{pmatrix}.

Then the eigenvalues of \mathcal{Y}^M are the same as the eigenvalues of the product P = n^{-M/2} X_1 ⋯ X_M (up to multiplicity).

Suffices to study

\sum_{j=1}^{Mn} f(\lambda_j(\mathcal{Y})).

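A direct numerical check of this linearization (my own sketch, with small illustrative sizes): build the block matrix Y and verify that the eigenvalues of Y^M are the eigenvalues of P, each repeated M times.

import numpy as np

rng = np.random.default_rng(6)
n, M = 50, 3
X = [rng.standard_normal((n, n)) for _ in range(M)]

# Block matrix Y: X_1, ..., X_{M-1} on the block superdiagonal, X_M in the bottom-left corner.
Y = np.zeros((M * n, M * n))
for i in range(M - 1):
    Y[i * n:(i + 1) * n, (i + 1) * n:(i + 2) * n] = X[i]
Y[(M - 1) * n:, :n] = X[M - 1]
Y /= np.sqrt(n)

P = np.linalg.multi_dot(X) / n ** (M / 2)

eigY_M = np.sort_complex(np.linalg.eigvals(Y) ** M)
eigP = np.sort_complex(np.tile(np.linalg.eigvals(P), M))      # each eigenvalue of P appears M times
print(np.max(np.abs(eigY_M - eigP)))                          # ~ 0 up to floating point error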

Step 2: Hermitianization

We start with the identity

f(\lambda) = \frac{1}{2\pi} \int_{\mathbb{C}} \Delta f(z) \log|\lambda - z|\, d^2 z

to get

\sum_{j=1}^{Mn} f(\lambda_j(\mathcal{Y})) = \frac{1}{2\pi} \int_{\mathbb{C}} \Delta f(z) \log\bigl|\det(\mathcal{Y} - zI)\bigr|\, d^2 z

= \frac{1}{4\pi} \int_{\mathbb{C}} \Delta f(z) \log\bigl|\det\bigl((\mathcal{Y} - zI)(\mathcal{Y}^* - \bar{z} I)\bigr)\bigr|\, d^2 z

= \frac{1}{4\pi} \int_{\mathbb{C}} \Delta f(z) \sum_{j=1}^{Mn} \log\bigl[\lambda_j\bigl((\mathcal{Y} - zI)(\mathcal{Y}^* - \bar{z} I)\bigr)\bigr]\, d^2 z.

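Numerically, this step rests on the identity log|det(Y − zI)| = ½ Σ_j log λ_j((Y − zI)(Y − zI)^*), i.e. the sum of the log singular values. A quick check (my own sketch; the identity holds for any square matrix, so a random matrix stands in for Y, and z is an arbitrary choice):

import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((120, 120)) / np.sqrt(120)     # stands in for the block matrix Y
z = 0.4 + 0.3j
B = A - z * np.eye(120)

sign, logabsdet = np.linalg.slogdet(B)                 # log|det(Y - zI)|, computed stably
half_log_eigs = 0.5 * np.sum(np.log(np.linalg.eigvalsh(B @ B.conj().T)))
print(logabsdet, half_log_eigs)                        # the two agree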

Step 3: Comparison

We then need to compare

\sum_{j=1}^{Mn} \log\bigl[\lambda_j\bigl((\mathcal{Y} - zI)(\mathcal{Y}^* - \bar{z} I)\bigr)\bigr] \approx \sum_{j=1}^{Mn} \log\bigl[\lambda_j\bigl((\mathcal{G} - zI)(\mathcal{G}^* - \bar{z} I)\bigr)\bigr],

where

\mathcal{G} := \frac{1}{\sqrt{n}} \begin{pmatrix} 0 & G_1 & 0 & \cdots & 0 \\ 0 & 0 & G_2 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & G_{M-1} \\ G_M & 0 & \cdots & 0 & 0 \end{pmatrix}.

Step 4: Least singular value bound

We need to bound the smallest singular value σ_min(z) of

\begin{pmatrix} -zI & X_1 & 0 & \cdots & 0 \\ 0 & -zI & X_2 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & -zI & X_{M-1} \\ X_M & 0 & \cdots & 0 & -zI \end{pmatrix}.

Step 4: Least singular value bound

Theorem (Kopel, O., Vu)

For any sufficiently small A > 0,

\mathbb{P}\bigl( \sigma_{\min}(z) \le n^{-1/2 - A} \bigr) \le C n^{-KA}.

This generalizes previous work of Tao-Vu and Rudelson-Vershynin for a single iid matrix, as well as results of Götze-Naumov-Tikhomirov and O.-Soshnikov.

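A Monte Carlo look at this bound (my own sketch; the matrix is taken exactly as displayed two slides above, so any question about an overall 1/√n normalization is left aside, and the choices of n, M, A, z are purely illustrative): estimate the typical size of σ_min(z) and how often it falls below n^{-1/2-A}.

import numpy as np

def sigma_min(z, n, M, rng):
    """Smallest singular value of the displayed block matrix with -zI on the diagonal blocks."""
    B = -z * np.eye(M * n, dtype=complex)
    for i in range(M - 1):
        B[i * n:(i + 1) * n, (i + 1) * n:(i + 2) * n] = rng.choice([-1.0, 1.0], size=(n, n))
    B[(M - 1) * n:, :n] = rng.choice([-1.0, 1.0], size=(n, n))
    return np.linalg.svd(B, compute_uv=False)[-1]

rng = np.random.default_rng(8)
n, M, A, z = 100, 2, 0.1, 0.5 + 0.2j
vals = np.array([sigma_min(z, n, M, rng) for _ in range(200)])
print(np.median(vals) * np.sqrt(n))                       # typical size of sqrt(n) * sigma_min(z)
print(np.mean(vals <= n ** (-0.5 - A)))                   # empirical frequency of the rare event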

Thank you!