EECS 290S: Network Information Flow
Anant Sahai, David Tse
Logistics
Anant Sahai: 267 Cory (office hours in 258 Cory). Office hours: Mon 4-5pm and Tue 2:30-3:30pm.
David Tse: 257 Cory (enter through 253 Cory). Office hours: Tue 10-11am and Wed 9:30-10:30am.
Prerequisite: some background in information theory, particularly for the second half of the course.
Evaluation:
Two problem sets (10%)
Take-home midterm (15%)
In-class participation and a lecture (25%)
Term paper/project (50%)
Logistics
Text: Raymond Yeung, Information Theory and Network Coding (preprint available online).
References: papers; T. Cover and J. Thomas, Elements of Information Theory, 2nd edition.
Classical Information Theory
(Shannon, A Mathematical Theory of Communication, 1948)
The source has entropy rate H bits/sample. The channel has capacity C bits/sample. Reliable communication is possible if and only if H < C. Information is like fluid passing through a pipe. What about networks?
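Stated as a formula (the standard source-channel separation result; the notation below is the usual one and is not from the slides):

H = \lim_{n \to \infty} \frac{1}{n} H(S_1, \dots, S_n), \qquad C = \max_{p(x)} I(X; Y),

and reliable communication of the source over the channel is possible if and only if H < C.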
General Problem
Each source is observed by some nodes and needs to be sent to other nodes.
Question: Under what conditions can the sources be reliably sent to their intended nodes?
Simplest Case
Single source, single destination (a single unicast flow). All links are orthogonal and non-interfering (wireline).
The max-flow min-cut theorem (Ford-Fulkerson 1956) gives the answer. It applies to commodities or information, and it applies even if each link is noisy. The fluid-through-pipes analogy still holds.
[Figure: network from source S to destination D.]
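A minimal sketch of computing that min-cut/max-flow value (the Edmonds-Karp variant of Ford-Fulkerson; the graph and capacities below are illustrative, not from the slides):

from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp: push flow along shortest augmenting paths until none remain."""
    n = len(capacity)
    residual = [row[:] for row in capacity]   # residual capacities
    flow = 0
    while True:
        # BFS for an augmenting path from s to t in the residual graph
        parent = [-1] * n
        parent[s] = s
        queue = deque([s])
        while queue and parent[t] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and residual[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[t] == -1:
            return flow                       # no path left: flow = min-cut value
        # bottleneck capacity along the path found
        bottleneck, v = float('inf'), t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, residual[u][v])
            v = u
        # push the bottleneck amount, updating residual capacities
        v = t
        while v != s:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        flow += bottleneck

# Example: node 0 = S, node 3 = D, with illustrative link capacities
cap = [[0, 2, 1, 0],
       [0, 0, 1, 1],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3))   # prints 3, the min-cut value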
Extensions
More complex traffic patterns.
More complex signal interactions.
Multicast
[Figure: butterfly network with source s, intermediate nodes t, u, w, x, and destinations y, z; source bits b1, b2.]
A single source needs to send the same information to multiple destinations. What choices can we make at node w? One slave cannot serve two masters. Or can it?
Multicast
Forwarding a single bit at node w does not achieve the min-cut bound of both destinations.
[Figure: butterfly network in which w forwards only b1; destination z receives b2 and b1, but destination y receives b1 twice.]
Network coding
Node w needs to combine the bits and forward an equation, b1 + b2. Each destination collects all the equations it receives and solves for the unknown bits. This achieves the min-cut bound simultaneously for both destinations.
[Figure: butterfly network in which w forwards b1 + b2; y receives b1 and b1 + b2, z receives b2 and b1 + b2.]
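A minimal sketch of the butterfly example in code (node and bit names follow the figure; realizing "b1 + b2" as XOR over GF(2) is the standard choice):

# Butterfly network: s -> {t, u}, t -> {w, y}, u -> {w, z},
# w -> x is the bottleneck edge, x -> {y, z}.
b1, b2 = 1, 0            # the two source bits at s

# Network coding: w forwards the single equation b1 + b2 (XOR)
x_bit = b1 ^ b2          # the one bit crossing the bottleneck

# Destination y hears b1 (from t) and b1 + b2 (from x); it solves for b2:
b2_at_y = b1 ^ x_bit
# Destination z hears b2 (from u) and b1 + b2 (from x); it solves for b1:
b1_at_z = b2 ^ x_bit

assert (b2_at_y, b1_at_z) == (b2, b1)   # both destinations recover both bits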
Other traffic patterns
Multiple sources send independent information to the same destination.
A single source sends independent information to several destinations.
Multiple sources each send information to their respective destinations.
The last two problems are not easy, due to interference.
Complex Signal Interactions: Wireless Networks
[Figure: wireless network with relays between the source and destination D.]
Key properties of the wireless medium: broadcast and superposition. Signals interact in a complex way. The standard physical-layer model is a linear model with additive Gaussian noise.
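In symbols, that standard model (usual notation, assumed here rather than taken from the slides) has node j receive

y_j = \sum_i h_{ij} x_i + z_j, \qquad z_j \sim \mathcal{N}(0, \sigma^2),

where the sum over all transmitters i captures superposition, the fact that every node within range hears each x_i captures broadcast, and h_{ij} is the channel gain from node i to node j.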
Gaussian Network Capacity: Known Results
Point-to-point (Shannon '48).
Broadcast (Cover, Bergmans '70s; Weingarten et al. '05).
Multiple-access (Ahlswede, Liao '70s).
[Figure: point-to-point link Tx-Rx; broadcast channel Tx to Rx 1 and Rx 2; multiple-access channel Tx 1 and Tx 2 to Rx.]
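For reference, the corresponding point-to-point expression (a standard result, stated in the usual notation rather than taken from the slides) is

C = \frac{1}{2} \log_2(1 + \mathsf{SNR}) \text{ bits per channel use},

and the two-user Gaussian multiple-access region is the set of rate pairs (R_1, R_2) with

R_k \le \frac{1}{2} \log_2(1 + \mathsf{SNR}_k), \ k = 1, 2, \qquad R_1 + R_2 \le \frac{1}{2} \log_2(1 + \mathsf{SNR}_1 + \mathsf{SNR}_2).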
What We Don't Know
Unfortunately, we don't know the capacity of most other Gaussian networks.
Interference channel (Tx 1, Tx 2 to Rx 1, Rx 2): best known achievable region due to Han & Kobayashi '81.
Relay channel (S to D with a relay): best known achievable region due to Cover & El Gamal '79.
Bridging between Wireline and Wireless Models
There is a huge gap between the wireline and Gaussian channel models: signal interactions and noise.
Approach: deterministic channel models that bridge the gap by focusing on the signal interactions and forgoing the noise.
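One concrete instance of such a model (the linear deterministic model of Avestimehr, Diggavi and Tse, stated here as background; the slides themselves do not define it) represents a point-to-point link of gain roughly log SNR = n as

y = S^{q-n} x,

where x is a length-q binary vector of transmitted bits and S is the q x q down-shift matrix, so only the n most significant bits survive. Superposition becomes bitwise addition over F_2, and noise is replaced by the deterministic loss of the bits below the noise floor.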
Two-way relay example
Routing: A and B exchange bits b1 and b2 through the relay R in four time slots:
1) A sends b1 to R.
2) R forwards b1 to B.
3) B sends b2 to R.
4) R forwards b2 to A.
This gives RAB = RBA = 1/4.
Network coding exploits broadcast medium
1) A sends b1 to R.
2) B sends b2 to R.
3) R broadcasts b1 + b2; A recovers b2 and B recovers b1 by subtracting off its own bit.
Three time slots instead of four, so RAB = RBA = 1/3.
Nature does the coding via superposition
1) A and B transmit b1 and b2 simultaneously; R receives the superposition and decodes b1 + b2 directly.
2) R broadcasts b1 + b2; A recovers b2 and B recovers b1.
Two time slots, so RAB = RBA = 1/2. But what happens when the signal strengths of the two links are different?
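A minimal numerical sketch of why the relay can read off b1 + b2 from the superposition when the two links have equal strength (BPSK over a noiseless adder channel is my own illustrative simplification, not the slides' model):

def bpsk(bit):
    # BPSK map: bit 0 -> +1, bit 1 -> -1
    return 1 - 2 * bit

def relay_decode_xor(b1, b2):
    """The relay observes the sum of two equal-strength BPSK signals."""
    received = bpsk(b1) + bpsk(b2)   # superposition: one of {-2, 0, +2}
    # |received| == 2  => the bits agree  => XOR = 0
    # received == 0    => the bits differ => XOR = 1
    return 0 if abs(received) == 2 else 1

for b1 in (0, 1):
    for b2 in (0, 1):
        assert relay_decode_xor(b1, b2) == b1 ^ b2

# The relay never learns b1 or b2 individually (received == 0 is ambiguous),
# but b1 + b2 is exactly what it needs to broadcast in the second slot.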