Agenda On Quiz 2 Network Information Theory


Agenda
- On Quiz 2
- Network Information Theory
- Typical Networked Communication Scenarios
- Gaussian Networked Communication Channels
- Joint Typical Sequences

Network Communication Theory
- Point-to-point information theory
  - Lossless data compression
  - Channel capacity
  - Lossy data compression: rate-distortion theory
- Information theory for communication networks
  - Multipoint communication
  - Data flow in the network

Typical Communication Topologies
- Multiple-access communication system: transmitter 1 and transmitter 2 send to a single receiver
- Broadcast communication system: a single transmitter sends to receiver 1 and receiver 2
- Relay communication system: a transmitter reaches the receiver both directly and through a relay

Typical Communication Topologies
- Communication network: Max-Flow Min-Cut Theorem
- Example (assuming the diamond network s→a with capacity C1, s→b with C2, a→b with C3, a→t with C4, b→t with C5): enumerating the four s–t cuts gives
  C = min{C1 + C2, C2 + C3 + C4, C1 + C5, C4 + C5}
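The min-cut can be checked by brute force on a small network: enumerate every vertex set that contains the source but not the sink, and sum the capacities of edges leaving it. The diamond layout and the capacity values below are illustrative assumptions, not taken from the slide.

```python
from itertools import combinations

# Assumed diamond network: s->a (C1), s->b (C2), a->b (C3), a->t (C4), b->t (C5)
cap = {('s', 'a'): 3, ('s', 'b'): 2, ('a', 'b'): 1, ('a', 't'): 2, ('b', 't'): 4}

def min_cut(cap, s='s', t='t'):
    """Brute-force minimum s-t cut: try every s-side vertex set."""
    nodes = {u for edge in cap for u in edge}
    others = nodes - {s, t}
    best = float('inf')
    for r in range(len(others) + 1):
        for subset in combinations(others, r):
            side = {s, *subset}
            # Capacity of the cut: edges leaving the s-side.
            cut = sum(c for (u, v), c in cap.items() if u in side and v not in side)
            best = min(best, cut)
    return best

print(min_cut(cap))  # → 5, achieved by the cut {s} (C1 + C2 = 3 + 2)
```

Enumeration is exponential in the number of vertices, which is fine for a five-edge example; a real solver would use max-flow (e.g. Edmonds–Karp) instead.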

Gaussian Multi-user Channels
- Gaussian single-user channel: Y = X + Z, Z ~ Gaussian(0, N); capacity C = (1/2) log(1 + P/N)
- Gaussian multiple-access channel with M users: Y = Σ_{1≤m≤M} X_m + Z, Z ~ Gaussian(0, N)
- Capacity region: for every subset S of the users, Σ_{m∈S} R_m ≤ (1/2) log(1 + |S|·P/N)
- What does the region look like for two users? For three users?
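The single-user formula and the multiple-access constraints can be evaluated numerically; the power and noise values below are illustrative, not from the slides.

```python
import math

def gaussian_capacity(snr):
    """C = (1/2) log2(1 + SNR), in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

P, N = 10.0, 1.0   # illustrative per-user power and noise power
M = 2              # two equal-power users

# Single-user capacity:
c_single = gaussian_capacity(P / N)

# MAC capacity region: for every subset S of users,
#   sum_{m in S} R_m <= (1/2) log2(1 + |S| * P / N)
per_user_bound = gaussian_capacity(P / N)       # |S| = 1
sum_rate_bound = gaussian_capacity(M * P / N)   # |S| = 2

print(c_single, per_user_bound, sum_rate_bound)
```

Note that the sum-rate bound exceeds the per-user bound but is less than twice it: for two users the region is a pentagon, and for three users a polytope with one constraint per nonempty subset (7 in total).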

Gaussian Multi-user Channels
- Gaussian broadcast channel (assume N1 < N2, so receiver 1 has the better channel):
  Y1 = X + Z1, Z1 ~ Gaussian(0, N1)
  Y2 = X + Z2, Z2 ~ Gaussian(0, N2)
- Achievable rate region: (R1, R2), for any power split 0 ≤ α ≤ 1,
  R1 ≤ C(αP/N1), R2 ≤ C((1−α)P/(αP + N2)), where C(x) = (1/2) log(1 + x)
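Sweeping the power split α traces out the boundary of the region: the weaker receiver treats the αP of power devoted to user 1 as noise, while the stronger receiver decodes and subtracts user 2's message first. The P, N1, N2 values are illustrative assumptions.

```python
import math

def C(x):
    """Gaussian capacity function, bits per channel use."""
    return 0.5 * math.log2(1 + x)

P, N1, N2 = 10.0, 1.0, 4.0   # illustrative: N1 < N2, receiver 1 is the better one

def rate_pair(alpha):
    """Superposition coding: power alpha*P for user 1, (1-alpha)*P for user 2."""
    r1 = C(alpha * P / N1)                    # user 1 after cancelling user 2
    r2 = C((1 - alpha) * P / (alpha * P + N2))  # user 2 treats user 1 as noise
    return r1, r2

for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(alpha, rate_pair(alpha))
```

The two endpoints are the single-user corner points: α = 1 gives everything to the strong user (R2 = 0) and α = 0 gives everything to the weak user (R1 = 0).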

Gaussian Interference Channel
Y1 = X1 + a·X2 + Z1, Z1 ~ Gaussian(0, N)
Y2 = X2 + a·X1 + Z2, Z2 ~ Gaussian(0, N)
Strong interference is as good as no interference: when a >> 1, each receiver can decode the interfering message first, subtract it, and is left with an interference-free channel.
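A small numerical sketch contrasts the naive strategy (treat interference as noise) with the decode-and-subtract strategy available when interference is very strong. The P, N, and a values are illustrative assumptions.

```python
import math

def C(x):
    """Gaussian capacity function, bits per channel use."""
    return 0.5 * math.log2(1 + x)

P, N = 10.0, 1.0   # illustrative symmetric power and noise values

for a in (0.1, 0.5, 3.0):
    # Receiver 1 treats a*X2 as additional Gaussian noise:
    r_noise = C(P / (a * a * P + N))
    print(f"a={a}: treat-as-noise rate {r_noise:.3f} bits")

# When a is large enough (very strong interference), receiver 1 can first
# decode X2 from the dominant a*X2 component, subtract it, and then face a
# clean point-to-point channel, recovering the interference-free rate:
print(f"interference-free rate {C(P / N):.3f} bits")
```

Note how the treat-as-noise rate collapses as a grows, while the decode-and-subtract rate is independent of a, which is why strong interference ends up as harmless as none.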

Joint Typical Sequences
- Communication network signal vector (X1, X2, …, XN)
- Let S denote any ordered subset of (X1, X2, …, XN), e.g. S = (X1, X2), S = (X1, XN), S = (X1, X3, XN−1); there are 2^N − 1 nonempty choices of S
- Let s1, s2, …, sn be n i.i.d. realizations of S; then −(1/n) log p(s1, s2, …, sn) → H(S)
- Typicality: |−(1/n) log p(s1, s2, …, sn) − H(S)| < ε
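The convergence of −(1/n) log p(s1, …, sn) to H(S) is the AEP and is easy to check empirically for a single binary source; the bias p and block length n below are illustrative choices.

```python
import math
import random

def entropy(p):
    """Binary entropy H(p) in bits, for 0 < p < 1."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def empirical_rate(seq, p):
    """-(1/n) log2 p(x^n) for an i.i.d. Bernoulli(p) sequence."""
    n = len(seq)
    log_p = sum(math.log2(p if x else 1 - p) for x in seq)
    return -log_p / n

random.seed(0)           # reproducible illustration
p, n = 0.3, 100_000
seq = [random.random() < p for _ in range(n)]

print(entropy(p), empirical_rate(seq, p))  # the two values should be close
```

By the law of large numbers the empirical value concentrates around H(p) ≈ 0.881 bits, which is exactly why the typical set, despite containing only about 2^{nH} of the 2^n sequences, carries almost all the probability.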

Joint Typical Sequences
- A_ε^(n)(S): the set of ε-typical n-sequences w.r.t. S
- For sufficiently large n, Pr(A_ε^(n)(S)) > 1 − ε
- For s ∈ A_ε^(n)(S), p(s) ≈ 2^(−nH(S)), and |A_ε^(n)(S)| ≈ 2^(nH(S))
- For (s1, s2) jointly typical w.r.t. (S1, S2), p(s1 | s2) ≈ 2^(−nH(S1|S2))

Joint Typical Sequences
- A_ε^(n)(S1 | s2): the set of S1-sequences jointly typical with a given sequence s2
- For sufficiently large n:
  |A_ε^(n)(S1 | s2)| ≤ 2^(n(H(S1|S2) + 2ε))
  (1 − ε) 2^(n(H(S1|S2) − 2ε)) ≤ Σ_{s2} p(s2) |A_ε^(n)(S1 | s2)|

Homework 15.2, 15.4, 15.6, 15.23, 15.25, 15.33