1
Bridging the Gaps: Free Probability and Channel Capacity. Antonia Tulino, Università degli Studi di Napoli. Workshop on Random Matrix Theory and Wireless Communications, Chautauqua Park, Boulder, Colorado, July 17, 2008.
2
Linear Vector Channel: K-dimensional input, N-dimensional output, N × K channel matrix, noise = AWGN + interference. A variety of communication problems are obtained by simply reinterpreting K, N, and H: fading, wideband, multiuser, multiantenna.
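In standard notation (a reconstruction, assuming the symbols y, x, n for output, input, and noise), the model on this slide reads
\[ \mathbf{y} = \mathbf{H}\,\mathbf{x} + \mathbf{n}, \]
with x the K-dimensional input, H the N × K channel matrix, and n the noise (AWGN plus interference).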
3
Role of the Singular Values. The mutual information, in both the ergodic and the non-ergodic case, depends on the channel only through the singular values of H.
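A hedged reconstruction of the usual expression (assuming i.i.d. Gaussian inputs and the normalization per receive dimension):
\[ I(\mathrm{SNR}) = \frac{1}{N}\log\det\!\left(\mathbf{I}_N + \mathrm{SNR}\,\mathbf{H}\mathbf{H}^{\dagger}\right) = \frac{1}{N}\sum_{i=1}^{N}\log\!\left(1+\mathrm{SNR}\,\lambda_i(\mathbf{H}\mathbf{H}^{\dagger})\right). \]
In the ergodic case the expectation over H is taken; in the non-ergodic case this quantity is a random variable and outage is the relevant figure of merit.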
4
Role of the Singular Values. The minimum mean-square error (MMSE) likewise depends on the channel only through its singular values.
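A sketch of the corresponding MMSE expression, assuming the model y = sqrt(SNR) H x + n with i.i.d. unit-variance Gaussian input and noise:
\[ \mathrm{MMSE} = \frac{1}{K}\,\mathrm{tr}\!\left\{\left(\mathbf{I}_K + \mathrm{SNR}\,\mathbf{H}^{\dagger}\mathbf{H}\right)^{-1}\right\} = \frac{1}{K}\sum_{i=1}^{K}\frac{1}{1+\mathrm{SNR}\,\lambda_i(\mathbf{H}^{\dagger}\mathbf{H})}. \]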
5
H-Model: (i) independent and identically distributed entries; (ii) separable correlation model; (iii) UIU model, with independent, arbitrarily distributed entries and unitary factors uniformly distributed over the manifold of complex matrices U such that U†U = I.
6
ISI Channels with Fading: fading channel. Motivation: fading and/or jamming (wireless); cellular systems with unreliable wired infrastructure; impulse noise (power lines and DSL); faulty transducers (sensor networks); randomly spread multicarrier CDMA channels with fading.
7
ISI Channels with Fading: fading channel model.
8
Flat Fading & Deterministic ISI: Gaussian Erasure Channels. Random erasure mechanisms: link congestion/failure (networks); cellular systems with unreliable wired infrastructure; impulse noise (DSL); faulty transducers (sensor networks).
9
d-Fold Vandermonde Matrix: generated by points drawn i.i.d. with uniform distribution in [0,1]^d. Applications: sensor networks; multiantenna multiuser communications; detection of distributed targets.
10
Flat Fading & Deterministic ISI: the fading coefficients form an i.i.d. sequence.
11
Formulation: the deterministic ISI corresponds to an asymptotically circulant matrix, and a stationary input with a given PSD likewise has an asymptotically circulant covariance matrix; the Grenander-Szegő theorem characterizes the eigenvalues of such matrices.
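As a reminder, the Grenander-Szegő theorem being invoked states (assuming S(f) denotes the bounded real symbol, i.e. the PSD, generating the Toeplitz sequence, and g is continuous)
\[ \lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n} g\bigl(\lambda_i(\mathbf{T}_n)\bigr) = \int_0^1 g\bigl(S(f)\bigr)\,df, \]
where T_n is the n × n Toeplitz (asymptotically circulant) matrix generated by S(f).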
13
Deterministic ISI & |A_i| = 1: the optimal input is the waterfilling power spectral density, with the water level chosen to meet the power constraint. Key tool: the Grenander-Szegő theorem on the distribution of the eigenvalues of large Toeplitz matrices.
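A hedged reconstruction of the classical waterfilling solution referred to here (assuming |H(f)|^2 denotes the ISI frequency response and unit input power):
\[ S_{\mathrm{wf}}(f) = \left(\nu - \frac{1}{\mathrm{SNR}\,|H(f)|^2}\right)^{\!+}, \qquad \int_0^1 S_{\mathrm{wf}}(f)\,df = 1, \]
with the resulting rate \( \int_0^1 \log\bigl(1+\mathrm{SNR}\,S_{\mathrm{wf}}(f)\,|H(f)|^2\bigr)\,df \).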
14
Deterministic ISI & Flat Fading: S = asymptotically circulant matrix, A = random diagonal fading matrix. Key question: the distribution of the eigenvalues of the large-dimensional random matrix A S A.
15
Special case A_i = e_i ∈ {0,1}: S = asymptotically circulant matrix, E = random 0-1 diagonal matrix. Key question: the distribution of the eigenvalues of the large-dimensional random matrix E S E.
16
RANDOM MATRIX THEORY: η- & Shannon-Transform. The η- and Shannon-transforms of a nonnegative definite random matrix are defined through its asymptotic ESD, with X a nonnegative random variable whose distribution is the asymptotic ESD and γ a nonnegative real number. A. M. Tulino and S. Verdú, "Random Matrix Theory and Wireless Communications," Foundations and Trends in Communications and Information Theory, vol. 1, no. 1, June 2004.
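Recalled from the cited monograph (the symbol γ for the nonnegative argument is assumed here; the base of the logarithm follows the monograph's convention):
\[ \eta_X(\gamma) = \mathbb{E}\!\left[\frac{1}{1+\gamma X}\right], \qquad \mathcal{V}_X(\gamma) = \mathbb{E}\!\left[\log\bigl(1+\gamma X\bigr)\right]. \]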
17
RANDOM MATRIX THEORY: Shannon-Transform. Let A be a nonnegative definite random matrix. Theorem: the Shannon transform and the η-transform of A are related through an expression whose auxiliary variable is defined by a fixed-point equation.
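The basic relation between the two transforms (recalled from the monograph, with the Shannon transform in nats) is
\[ \gamma\,\frac{d}{d\gamma}\,\mathcal{V}_X(\gamma) = 1 - \eta_X(\gamma), \]
which is what lets the fixed-point characterizations of the η-transform translate into Shannon-transform, i.e. mutual-information, formulas.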
18
Property: the auxiliary variable, defined as the solution to the fixed-point equation, is monotonically increasing with the SNR and monotonically decreasing with y.
19
Theorem (η-Transform): the η-transform of the matrix of interest is given in terms of the solution to a fixed-point equation.
20
Theorem (Shannon-Transform): the Shannon transform of the matrix of interest is given in terms of two auxiliary quantities that are the solutions to a pair of coupled equations.
22
Flat Fading & Deterministic ISI. Theorem: the mutual information achieved by stationary Gaussian inputs with a given power spectral density is characterized in closed form via the Shannon transform.
25
Special Case: No Fading
26
Special Case: Memoryless Channels
27
Special Case: Gaussian Erasure Channels. Theorem: the mutual information achieved by stationary Gaussian inputs with a given power spectral density is characterized in closed form.
28
Flat Fading & Deterministic ISI. Theorem: the mutual information for stationary Gaussian inputs with a given power spectral density, expressed in terms of the auxiliary quantity defined above.
29
Example: n = 200.
30
Example: n = 1000.
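Finite-n curves of this kind can be reproduced numerically. Below is a minimal Monte Carlo sketch (not the authors' code) under the assumption that the finite-n mutual information of the Gaussian erasure channel is (1/n) log2 det(I + SNR E S E), with S an n × n circulant surrogate for the Toeplitz ISI Gram matrix; the two-tap channel, SNR, and erasure probability are illustrative placeholders.

import numpy as np

def erasure_mutual_info(n=200, snr_db=10.0, erasure_prob=0.3, trials=50, seed=0):
    # Monte Carlo estimate of (1/n) E[ log2 det(I + SNR * E S E) ]:
    # S is a circulant surrogate for the Toeplitz ISI Gram matrix,
    # E a random 0-1 diagonal erasure matrix.
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    f = np.arange(n) / n
    psd = np.abs(1.0 + 0.5 * np.exp(-2j * np.pi * f)) ** 2   # illustrative 2-tap ISI response
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)                    # unitary DFT matrix
    S = (F.conj().T @ np.diag(psd) @ F).real                  # circulant with eigenvalues psd
    rates = []
    for _ in range(trials):
        kept = rng.random(n) < (1.0 - erasure_prob)           # True = symbol observed, False = erased
        Sk = S[np.ix_(kept, kept)]                            # same nonzero spectrum as E S E
        _, logdet = np.linalg.slogdet(np.eye(kept.sum()) + snr * Sk)
        rates.append(logdet / np.log(2.0) / n)                # nats -> bits, normalized by n
    return float(np.mean(rates))

print(erasure_mutual_info(n=200), erasure_mutual_info(n=1000, trials=10))

As n grows, the Monte Carlo average should concentrate around the asymptotic Shannon-transform expression, which is presumably what the n = 200 versus n = 1000 plots illustrate.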
31
RANDOM MATRIX THEORY: asymptotic ESD. The empirical spectral distribution (ESD) of an n × n Hermitian random matrix is the fraction of its eigenvalues not exceeding a given value. If the ESD converges almost surely as n → ∞, the corresponding limit is called the asymptotic ESD.
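In symbols (the standard definition, with λ_i(A) the eigenvalues of A):
\[ F^{n}_{\mathbf{A}}(x) = \frac{1}{n}\sum_{i=1}^{n} 1\{\lambda_i(\mathbf{A}) \le x\}. \]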
32
Key Ingredients. Theorem: let X be a nonnegative random variable and let a second random variable be uniformly distributed on [0,1]; the statement is given in terms of the solution to a fixed-point equation.
33
Application to the ISI Channel: let X be a nonnegative random variable and let a second random variable be uniformly distributed on [0,1].
34
RANDOM MATRIX THEORY: Shannon-Transform. Let A be a nonnegative definite random matrix. Theorem: the Shannon transform and the η-transform of A are related through an expression whose auxiliary variable is defined by a fixed-point equation.
35
Property: the auxiliary variable, defined as the solution to the fixed-point equation, is monotonically increasing with the SNR and monotonically decreasing with y.
36
Input Optimization. Theorem: let H be a random matrix such that the stated conditions hold, where h_i denotes the i-th column of H. A. M. Tulino, A. Lozano and S. Verdú, "Capacity-Achieving Input Covariance for Single-User Multi-Antenna Channels," IEEE Trans. on Wireless Communications, 2006.
37
Input Optimization. Theorem: the capacity-achieving input power spectral density is a waterfilling solution, with the water level chosen to satisfy the power constraint.
38
Input Optimization. Corollary: the effect of fading on the capacity-achieving input power spectral density is an SNR penalty: a factor smaller than 1 regulates the amount of water admitted on each frequency, tailoring the no-fading waterfilling solution, with its fading-free water level, to fading channels.
39
Input Optimization: Proof.
40
Observations: behavior in the limit as the relevant parameter tends to 0, expressed in terms of the set of frequencies satisfying the stated condition.
41
Theorem (η-Transform): the η-transform of the matrix of interest is given in terms of the solution to a fixed-point equation.
42
Proof: Key Ingredient. We can replace the Toeplitz matrix by its asymptotically equivalent circulant counterpart, which diagonalizes as F Λ F† with F the Fourier matrix. Let Q = EF and denote by q_i the i-th column of Q.
43
Proof: apply the matrix inversion lemma.
44
Proof:
45
Lemma:
46
Theorem (Shannon-Transform): the Shannon transform of the matrix of interest is given in terms of two auxiliary quantities that are the solutions to a pair of coupled equations.
47
Asymptotics: low-power (SNR → 0) and high-power (SNR → ∞) regimes.
48
Asymptotics: Low-SNR. At low SNR we can closely approximate the capacity linearly; we need the minimum energy per bit and the wideband slope S_0. Note: the 1st-order approximation to C(E_b/N_0) captures the 2nd-order behavior of C(SNR).
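For reference, the generic wideband-regime quantities being invoked (recalled from Verdú's wideband framework; C(SNR) measured in nats, with Ċ(0) and C̈(0) its first two derivatives at SNR = 0) are
\[ \frac{E_b}{N_0}\bigg|_{\min} = \frac{\log_e 2}{\dot{C}(0)}, \qquad S_0 = \frac{2\,[\dot{C}(0)]^2}{-\ddot{C}(0)}. \]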
49
Asymptotics: Low-SNR. Theorem: the minimum energy per bit and the wideband slope S_0 of the spectral efficiency of a channel with fading & deterministic ISI admit closed-form expressions.
50
Asymptotics: High-SNR. At large SNR we can closely approximate the capacity as an affine function of SNR in dB; we need the high-SNR slope and the high-SNR dB offset.
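The generic high-SNR quantities being referred to (recalled from the standard high-SNR framework; the symbols S_∞ and L_∞ are assumed here) are
\[ S_\infty = \lim_{\mathrm{SNR}\to\infty}\frac{C(\mathrm{SNR})}{\log_2 \mathrm{SNR}}, \qquad \mathcal{L}_\infty = \lim_{\mathrm{SNR}\to\infty}\left(\log_2 \mathrm{SNR} - \frac{C(\mathrm{SNR})}{S_\infty}\right), \]
so that C(SNR) = S_∞ (log_2 SNR − L_∞) + o(1).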
51
Asymptotics: High-SNR. Theorem: in terms of the generalized bandwidth, the high-SNR slope and dB offset are characterized as stated.
52
Asymptotics: sporadic erasures (e → 0) and sporadic non-erasures (e → 1).
53
Asymptotics: Sporadic Erasures (e → 0). Theorem: for sporadic erasures and any power spectral density, the mutual information behaves, at both high and low SNR, like that of a memoryless noisy erasure channel, with the water level of the capacity-achieving PSD entering the expression.
54
Asymptotics: Sporadic Non-Erasures (e → 1). Theorem: optimizing over the input power spectral density, the result is expressed in terms of the maximum channel gain.
55
Bounds. Theorem: the mutual information rate is lower bounded, with equality when S(f) = 1.
56
Bounds. Theorem: the mutual information rate is upper bounded by a corresponding expression.
57
d-Fold Vandermonde Matrix: combined with diagonal matrices (either random or deterministic) with compactly supported measure.
58
d-Fold Vandermonde Matrix. Theorem: the η-transform and the Shannon transform of the associated matrix model are given in closed form.
59
d-Fold Vandermonde Matrix. Theorem: the p-th moment of the associated matrix model is given in closed form.
60
Summary. Asymptotic distribution of A S A: a new result at the intersection of the asymptotic eigenvalue distribution of Toeplitz matrices and of random matrices. The mutual information of a channel with ISI and fading. Optimality of waterfilling in the presence of fading known at the receiver. Easily computable asymptotic expressions in various regimes (low and high SNR). New results for d-fold Vandermonde matrices and their products with diagonal matrices.