
1 Cooperative Communication in Sensor Networks: Relay Channels with Correlated Sources
Brian Smith and Sriram Vishwanath, University of Texas at Austin
October 1, 2004, Allerton Conference on Communication, Control, and Computing

2 Overview
- The Sensor Network Problem
- The Relay Channel
- Block-Markov Coding
- Correlated Side Information and the Relay Channel
- Communicating Two Sources
- Example: A MIMO Gaussian Relay Channel
- Conclusion

3 Sensor Networks
- Many distributed nodes can make measurements and cooperate to propagate information through the system, perhaps toward a single endpoint.
- Key observation: nodes located near each other may have correlated information. What effect does this have on information flow?
(Figure: terminal node, sensor nodes, and example relay configurations)

4 The Relay Channel
Introduced by van der Meulen, "Three-terminal communication channels," Adv. Appl. Prob., vol. 3, pp. 120-154, 1971.
A discrete memoryless relay channel consists of:
- an input X,
- a relay output Y_1,
- a relay sender X_1, which can depend upon previously received Y_1,
- a channel output Y,
- a conditional probability distribution p(y, y_1 | x, x_1).
(Figure: transmitter with input X; relay receives Y_1 and inputs X_1; receiver receives Y)
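The phrase "can depend upon previously received Y_1" has a standard formal statement; a (2^{nR}, n) relay code consists of the following maps:

```latex
x^n : \{1,\dots,2^{nR}\} \to \mathcal{X}^n
      % transmitter encoder
x_{1i} = f_i(y_{11}, y_{12}, \dots, y_{1,i-1}), \quad 1 \le i \le n
      % causal relay functions: each relay input symbol depends only on past relay outputs
g : \mathcal{Y}^n \to \{1,\dots,2^{nR}\}
      % decoder at the receiver
```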

5 The Relay Channel in Sensor Networks
- Multi-hop strategies are thought to be "good" methods for conveying information through wireless networks:
  - G. Kramer, M. Gastpar, and P. Gupta, "Capacity Theorems for Wireless Relay Channels," Proc. 41st Allerton Conf. on Comm., Control, and Computing, Oct. 1-3, 2003.
  - P. Gupta and P. R. Kumar, "The Capacity of Wireless Networks," IEEE Trans. Inform. Theory, vol. 46, no. 2, pp. 388-404, Mar. 2000.
- A relay channel embedded in a sensor network differs from the previous definition in that both the sender and the relay have access to sources of information.
- If those sources are correlated, the relay's source can be exploited as side information.
- In some instances, it is desirable to transmit the relay's information to the receiver as well as the transmitter's.

6 The Relay Channel
- The capacity of the general relay channel is unknown. An upper bound follows from a cut-set argument.
- Capacity is known for special cases, e.g. the degraded relay channel: T. M. Cover and A. A. El Gamal, "Capacity theorems for the relay channel," IEEE Trans. Information Theory, vol. IT-25, no. 5, pp. 572-584, Sep. 1979.
- The degraded-case rate is also a lower bound on the capacity of the general relay channel, and is often the best known lower bound.
- This rate is achieved by block-Markov coding, which introduces correlation between the transmitter input and the relay input to aid decoding at the receiver. The relay completely decodes the message meant for the receiver in the block-Markov scheme.
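For reference, the two bounds named on this slide, both from the Cover and El Gamal paper, can be written out:

```latex
% Cut-set upper bound for the general relay channel:
C \;\le\; \max_{p(x,x_1)} \min\bigl\{\, I(X,X_1;Y),\; I(X;Y,Y_1 \mid X_1) \,\bigr\}
% Block-Markov (decode-and-forward) lower bound:
C \;\ge\; \max_{p(x,x_1)} \min\bigl\{\, I(X,X_1;Y),\; I(X;Y_1 \mid X_1) \,\bigr\}
```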

7 Block-Markov Coding
Overview of block-Markov coding for the classic relay channel:
- The relay terminal completely decodes a message index w from the set {1, ..., 2^{nR}} sent by the transmitter.
- The relay sends the bin index of the message it received to aid the receiver in decoding.
- This is helpful because the transmitter's codeword depends on the bin index that the relay is transmitting in the same block.
Codebook generation (a sketch in code follows this slide):
- Fix any p(x, x_1).
- Generate 2^{nR_0} codewords x_1^n, each symbol drawn i.i.d. according to p(x_1); index them as x_1^n(s).
- For each of these x_1^n codewords, generate 2^{nR} codewords x^n, each symbol drawn according to p(x | x_1i(s)); index these as x^n(w|s).
- Independently bin the messages w into the 2^{nR_0} bins s.
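A minimal sketch of this random codebook construction, assuming toy binary alphabets and an arbitrary example p(x, x_1); the block length, rates, and distribution here are hypothetical illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (hypothetical): block length n, rates R and R0 chosen
# small so the codebooks stay tiny.
n, R, R0 = 8, 0.5, 0.25
num_msgs = 2 ** int(n * R)    # 2^{nR} messages w
num_bins = 2 ** int(n * R0)   # 2^{nR0} bin indices s

# An example joint distribution p(x, x1) on {0,1}^2.
p_x1 = np.array([0.5, 0.5])             # marginal p(x1)
p_x_given_x1 = np.array([[0.8, 0.2],    # p(x | x1 = 0)
                         [0.3, 0.7]])   # p(x | x1 = 1)

# Relay codebook: 2^{nR0} codewords x1^n(s), symbols i.i.d. from p(x1).
x1_cb = rng.choice(2, size=(num_bins, n), p=p_x1)

# Transmitter codebook: for each s, 2^{nR} codewords x^n(w|s), each symbol
# drawn from p(x | x1i(s)) -- superimposed on the relay codeword.
x_cb = np.empty((num_bins, num_msgs, n), dtype=int)
for s in range(num_bins):
    for w in range(num_msgs):
        for i in range(n):
            x_cb[s, w, i] = rng.choice(2, p=p_x_given_x1[x1_cb[s, i]])

# Independent binning: each message w lands in a uniformly random bin s.
bin_of = rng.integers(num_bins, size=num_msgs)
print(f"{num_msgs} messages, {num_bins} bins; message 0 is in bin {bin_of[0]}")
```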

8 Block-Markov Coding
Encoding: messages are sent in a total of B blocks.
- In the first block:
  - the relay sends the codeword x_1^n corresponding to a pre-determined null message, say x_1^n(φ);
  - the transmitter sends the codeword x^n that depends on the first message w_1 and the null message, say x^n(w_1|φ).
- In block b:
  - assuming the relay correctly decoded the message w_{b-1} sent in the previous block, the relay sends the codeword x_1^n(s_{b-1});
  - the transmitter sends x^n(w_b|s_{b-1}), using the same bin index s that the relay is simultaneously sending, shifted by one block to allow the relay to decode the current message (see the schedule sketch below and the table on the next slide).
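A few lines that only print this schedule make the one-block shift concrete; the map s(·) is a hypothetical stand-in for the bin index of the message the relay decoded:

```python
B = 4  # number of blocks (illustrative)

def s(b):
    # Stand-in for the random binning: the bin index of block b's message.
    return f"s{b}"

prev = "phi"  # null message index used in the first block
for b in range(1, B + 1):
    print(f"block {b}: transmitter sends x^n(w{b} | {prev}); "
          f"relay sends x1^n({prev})")
    prev = s(b)  # relay decodes w_b this block and forwards its bin next block
```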

9 Block-Markov Encoding
Block:              b=1          b=2            ...  b
Message:            w_1          w_2            ...  w_b
Transmitter sends:  x^n(w_1|φ)   x^n(w_2|s_1)   ...  x^n(w_b|s_{b-1})
Relay sends:        x_1^n(φ)     x_1^n(s_1)     ...  x_1^n(s_{b-1})
At the end of each block:
Relay decodes:      w_1          w_2            ...  w_b
Receiver decodes:   --           s_1, w_1       ...  s_{b-1}, w_{b-1}

10 Block-Markov Coding
Decoding:
- At the relay: w_b can be determined correctly with high probability if R < I(X;Y_1|X_1).
- At the receiver:
  - s_{b-1} can be determined correctly with high probability if R_0 < I(X_1;Y).
  - Make a list of all messages w_b for which the codewords x^n(w_b|s_{b-1}), x_1^n(s_{b-1}), and y^n are jointly typical; put the list aside until the end of the next block.
  - In block b+1, determine s_b as above, and declare w_b to be the message sent in block b if it is the only message on the list in bin s_b. This is done correctly with high probability when R - R_0 < I(X;Y|X_1). Intuitively, there are about 2^{n(R-R_0)} messages in the bin, and only one should be jointly typical.
- If both constraints on R are fulfilled, the rate is achievable (the receiver's two conditions combine as shown below).
Next, apply this to our sensor-network (correlated) configuration.
(Figure: X → relay (Y_1 : X_1) → Y, with source U at the transmitter)
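Since R_0 can be chosen freely, the receiver's two conditions collapse into a single constraint by the chain rule, recovering the familiar decode-and-forward rate:

```latex
% R_0 < I(X_1;Y) and R - R_0 < I(X;Y \mid X_1) together give
R \;<\; I(X_1;Y) + I(X;Y \mid X_1) \;=\; I(X,X_1;Y).
% Combined with the relay's condition:
R \;<\; \min\bigl\{\, I(X;Y_1 \mid X_1),\; I(X,X_1;Y) \,\bigr\}.
```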

11 Relay Channel with Correlated Side Information
Key addition to the coding strategy: Slepian-Wolf coding.
- With no side information at the relay, the block-Markov strategy requires that all of H(U) be transmitted to the relay.
- However, if the relay has access to a source V correlated with U, only H(U|V) bits of information need to be pushed across that link.
- We will show that this relay channel can reliably communicate the source U whenever
  H(U) < min{ I(X;Y_1|X_1) + I(U;V), I(X,X_1;Y) }.
- This implies that if, for the p* which maximizes I(X,X_1;Y), the first term is greater than the second, then we have found capacity.
(Figure: X → relay (Y_1 : X_1) → Y, with correlated sources U at the transmitter and V at the relay)
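The gain at the relay link is exactly Slepian-Wolf binning; the relay's constraint relaxes by I(U;V):

```latex
% Without side information the relay link needs  H(U) < I(X;Y_1 \mid X_1).
% With side information V, correlation supplies I(U;V) bits:
H(U \mid V) \;=\; H(U) - I(U;V) \;<\; I(X;Y_1 \mid X_1)
\quad\Longleftrightarrow\quad
H(U) \;<\; I(X;Y_1 \mid X_1) + I(U;V).
```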

12 Strategy for Relay Channel with Correlated Side Information
- Codebook generation: identical to block-Markov.
- Encoding: for the source U, generate 2^{nR} sequences U^n and index them by w. The rest of the encoding is identical to block-Markov.
- Decoding at the relay: form two lists of message indices w:
  - those whose codewords x^n(w_b|s_{b-1}) are jointly typical with the y_1^n received in block b and with x_1^n(s_{b-1});
  - those whose sequences U^n are jointly typical with the V^n at the relay.
  Choose the unique index that appears in both lists.
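A schematic of this two-list decoder in Python; the predicates `channel_typical` and `source_typical` are hypothetical stand-ins for the joint-typicality tests:

```python
# Sketch of the relay's two-list decoder for the correlated-sources scheme.

def decode_at_relay(y1, v, s_prev, x_cb, x1_cb, U_seqs,
                    channel_typical, source_typical):
    # List 1: messages whose codewords look right on the channel.
    L1 = {w for w in range(len(U_seqs))
          if channel_typical(x_cb[s_prev][w], x1_cb[s_prev], y1)}
    # List 2: messages whose source sequences agree with the side information.
    L2 = {w for w in range(len(U_seqs)) if source_typical(U_seqs[w], v)}
    both = L1 & L2
    # Decoding succeeds when exactly one index survives both tests.
    return both.pop() if len(both) == 1 else None
```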

13 Strategy for Relay Channel with Correlated Side Information
Decoding at the relay: choose the unique index that appears in both lists. This is correct with high probability if R < I(X;Y_1|X_1) + I(U;V):
- the probability that an incorrect codeword is jointly typical with y_1^n is about 2^{-nI(X;Y_1|X_1)};
- the probability that an incorrect sequence is jointly typical with V^n is about 2^{-nI(U;V)};
- these two events are independent and both errors must occur, so the error probability is the product.
Decoding at the receiver is identical to block-Markov, leading to the same constraint.
(Figure: X → relay (Y_1 : X_1) → Y, with source U)
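The union bound over the 2^{nR} wrong indices makes this precise:

```latex
P(\text{err at relay}) \;\lesssim\;
2^{nR} \cdot 2^{-n I(X;Y_1 \mid X_1)} \cdot 2^{-n I(U;V)}
\;\longrightarrow\; 0
\quad\text{if}\quad R \;<\; I(X;Y_1 \mid X_1) + I(U;V).
```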

14 Summary for Relay Channel with Correlated Side Information
- Achievable rates: H(U) < min{ I(X;Y_1|X_1) + I(U;V), I(X,X_1;Y) }.
- This is not a closed form, because I(U;V) and H(U) are related.
- If the correlation is great enough, capacity is found: specifically, if I(X;Y_1|X_1) + I(U;V) ≥ I(X,X_1;Y) under the p* which maximizes the cooperative rate I(X,X_1;Y).
- A MIMO example with two receive antennas follows.
- The strategy utilizes joint source-channel coding.

15 Relay Channel with Two Correlated Sources
- We now desire to send both the source U at the transmitter and the source V located at the relay.
- This resembles a multiple-access channel where one source can send some information to the other.
- Achievable rates follow from the decoding constraints collected on slides 19 and 20.
- Interpretation: in the previous case, the codeword x_1^n carried information about the bin index s only; now it must also describe the sequence V^n.

16 Two Sources Block-Markov Coding
Overview:
- We have data sequences U^n and V^n.
- As before, indices w in {1, ..., 2^{nR_u}} refer to typical U^n sequences.
- Indices s in {1, ..., 2^{nR_0}} correspond to bins of U^n sequences.
- New indices k in {1, ..., 2^{nR_1}} refer to bins of V^n sequences.

17 Two Sources Block-Markov Coding
Codebook generation:
- Fix a distribution of the form p(z) p(x|z) p(x_1|z).
- Generate 2^{nR_0} sequences z^n, symbols i.i.d. according to p(z); index them as z^n(s).
- For each sequence z^n(s), generate 2^{nR_u} codewords x^n, symbols drawn according to p(x | z_i(s)); index them as x^n(w|s).
- For each sequence z^n(s), generate 2^{nR_1} codewords x_1^n, symbols drawn according to p(x_1 | z_i(s)); index them as x_1^n(k,s).
- Independently bin the messages w into the 2^{nR_0} bins s.
- Generate 2^{nR_u} sequences U^n and 2^{nR_v} sequences V^n.
- Randomly place the V^n sequences into the 2^{nR_1} bins k.

18 Encoding Graphic
(Figure: the 2^{nR_u} U^n sequences, indexed by w, are thrown into 2^{nR_0} bins indexed by s, so roughly 2^{n(R_u - R_0)} U^n sequences land in each s bin; the 2^{nR_v} V^n sequences are thrown into 2^{nR_1} bins indexed by k, so roughly 2^{n(R_v - R_1)} V^n sequences land in each k bin.)
Codewords for each block:
- z^n depends on s.
- x^n(w_b|s_{b-1}) depends on the current w and the previous s (through z).
- x_1^n(k_b, s_{b-1}) depends on the current k and the previous s (through z).
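A tiny sketch of the two independent binnings, with hypothetical toy counts in place of 2^{nR_u}, 2^{nR_0}, 2^{nR_v}, 2^{nR_1}:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy counts (hypothetical): 64 U-sequences into 8 s-bins,
# 32 V-sequences into 4 k-bins.
num_u, num_s_bins = 64, 8
num_v, num_k_bins = 32, 4

s_bin_of_w = rng.integers(num_s_bins, size=num_u)  # bin index s for each w
k_bin_of_v = rng.integers(num_k_bins, size=num_v)  # bin index k for each V^n

# On average ~2^{n(R_u-R_0)} = 8 U-sequences per s bin and
# ~2^{n(R_v-R_1)} = 8 V-sequences per k bin.
print(np.bincount(s_bin_of_w, minlength=num_s_bins))
print(np.bincount(k_bin_of_v, minlength=num_k_bins))
```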

19 Two Sources Block-Markov Decoding
- At the relay: decodes w_b, using y_1^n and the side information V^n as before.
- At the receiver, in each block:
  - decodes s_{b-1} (from z^n),
  - decodes w_{b-1} (resolving the previous block's list within bin s_{b-1}),
  - decodes k_b (from x_1^n),
  - and chooses V^n_{b-1} if it is the only sequence in bin k_{b-1} jointly typical with U^n_{b-1}; this choice is correct with high probability if R_v - R_1 < I(U;V).
(Figure: transmitter sends x^n(w_b, s_{b-1}); relay sends x_1^n(s_{b-1}, k_b); both are superimposed on z^n(s_{b-1}); sources U and V are correlated)

20 Two Sources Summary
- The constraints taken together give the achievable rates.
- Codebook generation is not over an arbitrary p(x, x_1): only distributions of the form p(z) p(x|z) p(x_1|z) are used, so we cannot say that this is capacity.
- Correlation helps improve the rate along two links in the channel.
- Independent compression and channel coding would limit the achievable rates more strictly.

21 Example: MIMO Channel
An example where capacity is found. Description of the example MIMO system:
- The transmitter-to-relay link is point-to-point with gain h.
- The relay and transmitter each have a single antenna, with power constraints P and P_1 respectively.
- The receiver has two antennas and matrix gain H.
- Noise is independent Gaussian, modeled with unit power.
- We desire to transmit the single source U.
Channel equations: Y_1 = hX + η_1, Y = H[X, X_1]^T + [η_{Y1}, η_{Y2}]^T.

22 Example: MIMO Channel
Achievable rate:
- Treat (X, X_1) → Y as a MIMO point-to-point channel.
- Solve for the input covariance Q, with P and P_1 on its diagonal, that maximizes the cooperative rate I(X,X_1;Y) = (1/2) log det(I + H Q H^T) (unit noise power).
- Given a covariance of this form, the jointly Gaussian p*(x, x_1) is uniquely defined; from it, calculate I(X;Y_1|X_1).
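A numerical sketch of this optimization, assuming an example gain matrix H (the actual H used in the talk is not recorded in the transcript) and taking h = 3, P = P_1 = 2 from the next slide:

```python
import numpy as np

# Hypothetical example values; the talk's H matrix is not in the transcript.
H = np.array([[1.0, 0.5],
              [0.5, 1.0]])
h, P, P1 = 3.0, 2.0, 2.0

def rates(rho):
    # Input covariance with powers P, P1 on the diagonal and correlation rho.
    Q = np.array([[P, rho * np.sqrt(P * P1)],
                  [rho * np.sqrt(P * P1), P1]])
    # Cooperative MIMO rate I(X,X1;Y), unit noise power.
    coop = 0.5 * np.log2(np.linalg.det(np.eye(2) + H @ Q @ H.T))
    # Relay-link rate I(X;Y1|X1): given X1, X has variance P(1 - rho^2).
    relay = 0.5 * np.log2(1.0 + h**2 * P * (1.0 - rho**2))
    return coop, relay

# Sweep rho to find the correlation maximizing the cooperative rate.
best = max(np.linspace(0.0, 1.0, 1001), key=lambda r: rates(r)[0])
coop, relay = rates(best)
print(f"rho* = {best:.3f}: I(X,X1;Y) = {coop:.3f}, I(X;Y1|X1) = {relay:.3f}")
```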

23 Example: MIMO Channel
- Achievable rate: knowing two of the mutual-information terms tells how much correlation there must be between U and V for the cooperative rate to be capacity.
- Numerical example: power constraints P = 2, P_1 = 2, and relay-link gain h = 3, so Y_1 = 3X + η_1 and Y = H[X, X_1]^T + [η_{Y1}, η_{Y2}]^T.

24 Example: MIMO Channel
- Maximizing the correlation of the input distribution leads to H(U) = I(X,X_1;Y) = 2.569 and I(X;Y_1|X_1) = 1.971.
- So if I(U;V) > 0.598, the cooperative rate is capacity.
- Assume a model for correlation such that I(U;V) = βH(U), where β = 0 for totally independent U and V, and β = 1 for U = V.
- For β > 0.233, the cooperative rate is capacity.
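The threshold on β is the ratio of the rate gap to the source entropy, a one-line check:

```latex
I(U;V) \;>\; I(X,X_1;Y) - I(X;Y_1 \mid X_1) \;=\; 2.569 - 1.971 \;=\; 0.598,
\qquad
\beta \;=\; \frac{I(U;V)}{H(U)} \;>\; \frac{0.598}{2.569} \;\approx\; 0.233.
```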

25 Summary
- We started with the observation that nodes in a sensor network may have correlated data.
- We showed that this side information can be used to increase rate (or decrease power usage).
- Jointly coding over both source and channel can be more powerful than doing each separately.
- For some relay channels with correlated sources, capacity can be found.

