Presentation transcript: "Multi-Terminal Information Theory Problems in Sensor Networks" — Gregory J. Pottie, UCLA Electrical Engineering Department

1 Multi-Terminal Information Theory Problems in Sensor Networks Gregory J Pottie UCLA Electrical Engineering Department pottie@icsl.ucla.edu

2 Outline
General issues
Basic tools of information theory
Multi-terminal information theory
Applications
– Data fusion
– Cooperative communication
My pet ideas

3 Sensor Network Operation
Data fusion
Cooperative communication
Routing
Basic goal: reliable detection/identification of sources, and timely notification of the end user

4 Basic Information Theoretic Concepts
Typical sets (of sufficiently long sequences of i.i.d. variables):
– have probability nearly 1
– have elements that are nearly equally probable
– contain nearly 2^{nH} elements
[Block diagram: W → source/channel encoder → X^n → channel p(y|x) → Y^n → decoder]
Aims of a communication system:
– Minimize errors due to noise in the channel
– Maximize data rate
– Minimize bandwidth and power (the resources)
Shannon capacity establishes the fundamental limits
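The typical-set counting on this slide can be checked numerically. A minimal sketch for a Bernoulli(p) source (function names are illustrative): the entropy H(p) determines the approximate typical-set size 2^{nH}, which for p ≠ 1/2 is far smaller than the 2^n sequences the alphabet would allow.

```python
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def typical_set_size(n, p):
    """Approximate size 2^(nH) of the typical set for n i.i.d. draws."""
    return 2 ** (n * binary_entropy(p))
```

For p = 0.5 the typical set is essentially all 2^n sequences; for a biased source (say p = 0.1, H ≈ 0.47) it is roughly 2^{0.47 n}, an exponentially small fraction of them.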

5 Basic Information Theoretic Concepts
[Block diagram: W → source/channel encoder → X^n → channel p(y|x) → Y^n → decoder]
Capacity C is the maximum mutual information I(X;Y) over p(x); that is, choose the input distribution leading to the largest mutual information. C is the largest rate at which information can be transmitted reliably (with arbitrarily small error probability).
Jointly typical set: from among the typical input and output sequences, choose those for which −(1/n) log p(x^n, y^n) is close to H(X,Y).
The size of the jointly typical set is about 2^{nI(X;Y)}, so there are about this many distinguishable signals (codewords) in X^n.
These codewords necessarily contain redundancy: the size of the set is smaller than the alphabet would imply. Properly chosen sequences provide better performance than isolated symbols.

6 Jointly Typical Sequences
[Diagram: input sequences x_1^n, x_2^n in X^n mapping to overlapping output images in Y^n]
The output set is in general larger due to additive noise; the output images of inputs may overlap due to noise.

7 Gaussian Channel Capacity
Discrete-time inputs to the channel; the channel adds noise with a Gaussian distribution (zero mean, variance N).
Input sequence (codeword) power is set to P.
Capacity is the maximum of I(X;Y) over p(x) such that E[X²] satisfies the power constraint:
C = 1/2 log2(1 + P/N) bits per transmission.
The more usual form considers a channel of bandwidth W and noise power spectral density N0; then
C = W log2(1 + P/(N0 W)) bits per second.
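Both capacity formulas on this slide are easy to evaluate directly; a minimal sketch (function names are illustrative):

```python
import math

def capacity_per_use(P, N):
    """C = 1/2 log2(1 + P/N) bits per channel use (discrete-time AWGN)."""
    return 0.5 * math.log2(1 + P / N)

def capacity_bps(P, W, N0):
    """C = W log2(1 + P/(N0*W)) bits per second (bandlimited AWGN)."""
    return W * math.log2(1 + P / (N0 * W))
```

For example, P/N = 3 gives exactly 1 bit per use, and a 1 MHz channel with P/(N0 W) = 1 gives 1 Mbit/s.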

8 Capacity and Coding
Shannon capacity is the maximum rate at which information may be reliably sent over a channel (arbitrarily small decoding error probability).
Given bandwidth and bit rate, we can compute the SNR at which capacity is achieved.
Practical channel codes seek to reduce the SNR to the minimum required for transmission at some finite bit error rate, with reasonable decoding complexity.
[Plot: log P(e) vs. SNR (dB), comparing uncoded digital modulation with coded modulation approaching the capacity limit C]

9 Parallel Channels
The water-filling power distribution maximizes the capacity of parallel Gaussian channels with noise variances N_i.
Capacity is the sum of the capacities of the subchannels to which power is allocated.
This extends to continuously variable channels (in some combination of time and frequency), such as radio channels experiencing multipath fading.
Practical algorithms must cope with channel dynamics; channel state must be conveyed to the transmitter to approach capacity.
[Figure: water-filling of power over noise levels N_1…N_5; only subchannels below the water level receive power]
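The water-filling allocation described above can be sketched with a simple bisection on the water level (the implementation details, such as the bisection bounds, are my own choices, not from the slide):

```python
import math

def water_filling(noise, total_power, tol=1e-9):
    """Allocate total_power over parallel Gaussian channels with the given
    noise variances, maximizing sum capacity (water-filling).

    Bisect on the water level mu: each channel gets max(0, mu - N_i)."""
    lo, hi = min(noise), min(noise) + total_power + max(noise)
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in noise)
        if used > total_power:
            hi = mu
        else:
            lo = mu
    mu = (lo + hi) / 2
    powers = [max(0.0, mu - n) for n in noise]
    # Sum capacity over the subchannels that actually received power
    cap = sum(0.5 * math.log2(1 + p / n) for p, n in zip(powers, noise))
    return powers, cap
```

With noise variances (1, 2, 3) and total power 3, the water level settles at 3: the cleanest channel gets power 2, the next gets 1, and the noisiest gets none.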

10 Multi-Terminal Information Theory
The preceding discussion assumed a single transmitter and receiver. Multi-terminal information theory considers maximization of mutual information for:
– Multiple senders and one receiver (the multiple access channel)
– One sender and multiple receivers (the broadcast channel)
– One sender and one receiver, with intervening transducers that can assist (the relay channel)
– Composite combinations of these basic types
Estimation theory also aims to maximize mutual information, except that the senders do not cooperate and there is usually a fidelity constraint:
– One sender and multiple receivers (the data fusion problem)
– Multiple senders and receivers (the source separation problem)
Delay and resource usage may also be included.

11 Gaussian Multiple Access Channel
m transmitters, each with power P, share the same noisy channel.
C(P/N) = 1/2 log2(1 + P/N) bits per channel use for an isolated sender; the achievable rate region then requires, for every subset S of senders,
Σ_{i∈S} R_i ≤ C(|S| P/N).
The last inequality (the sum-rate bound Σ R_i ≤ C(mP/N)) dominates when the rates are equal.
Capacity increases with more users (there is more total power).
The result is dual to Slepian-Wolf encoding of correlated sources.
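The symmetric operating point of this rate region can be checked numerically; a minimal sketch (function names are illustrative):

```python
import math

def awgn_capacity(snr):
    """C(snr) = 1/2 log2(1 + snr) bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def mac_sum_rate(m, P, N):
    """Sum-rate bound for m equal-power users: C(m*P/N)."""
    return awgn_capacity(m * P / N)

def mac_equal_rate(m, P, N):
    """Symmetric per-user rate when the sum-rate constraint dominates."""
    return mac_sum_rate(m, P, N) / m
```

The sum rate grows with m (more total power), while each user's share of it is less than the rate it would get with the channel to itself.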

12 Gaussian Broadcast Channel
One sender of power P and two receivers, one with noise N_1 and one with noise N_2, N_1 < N_2.
Superposition coding achieves, for a power split α ∈ [0,1],
R_1 ≤ C(αP/N_1), R_2 ≤ C((1−α)P/(αP + N_2)).
The two codebooks are coordinated to exploit commonality of the information transmitted; otherwise capacity does not exceed simple multiplexing.
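A sketch of the superposition-coding rate pair for the degraded Gaussian broadcast channel, as given in standard texts (the slide's own equations were not legible in the transcript; function names are illustrative):

```python
import math

def awgn_c(snr):
    """C(snr) = 1/2 log2(1 + snr) bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def gaussian_bc_rates(alpha, P, N1, N2):
    """Superposition-coding rate pair for a degraded Gaussian BC (N1 < N2).

    A fraction alpha of the power carries the strong receiver's message;
    the weak receiver decodes its message treating alpha*P as extra noise."""
    R1 = awgn_c(alpha * P / N1)
    R2 = awgn_c((1 - alpha) * P / (alpha * P + N2))
    return R1, R2
```

Sweeping alpha from 0 to 1 traces the boundary of the capacity region between the two single-user endpoints.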

13 Relay Channel
One sender, one relay, and one receiver; the relay transmits X_1 based only on its observations Y_1.
[Diagram: X → (Y_1 : X_1) → Y]
The relay channel combines a broadcast channel and a multiple access channel.
Networks are composed of multiple relay channels that may further induce delay.

14 General Multi-Terminal Networks
m nodes, with node j having an associated transmit variable X^(j) and receive variable Y^(j).
Node 1 transmits to node m; what is the maximum achievable rate?
[Diagram: nodes (X_1, Y_1) … (X_m, Y_m)]
Bounds derived from information flow across multiple cut sets are generally not achievable.
The source-channel coding separation theorem fails because the capacity of multiple access channels increases with correlation, while source encoding eliminates correlation.

15 Now let it move…
Nodes move within a bounded region according to some random distribution; what is the capacity subject to an energy constraint on messages?
[Diagram: Node 1 and Node m at Time 1 and Time 2]
The answer depends on the delay constraint; eventually the nodes will collide, implying near-zero path loss and thus unbounded capacity.
Other questions:
– Probability that the nodes have a connecting path of the required rate
– Probability of a message arriving within the required delay

16 Some Recent Research Data fusion in sensor networks Cooperative communications in sensor networks Wild speculation on signal locality, scaling, and hierarchy

17 But first, a Rate Distortion Primer
The rate distortion function R(D) can be interpreted as:
– The minimum rate at which a source can be represented subject to a distortion D = d(X,Y)
– The minimum distortion that can be achieved given a maximum rate constraint R
Interesting results dual to capacity; here we get to determine the distortion.
Applies to compression of real-valued sequences.
[Plot: R vs. D, with the achievable region above the R(D) curve]
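For the Gaussian source the rate-distortion trade-off has a closed form, R(D) = max(0, 1/2 log2(σ²/D)), which makes the duality on this slide concrete; a minimal sketch (function names are illustrative):

```python
import math

def gaussian_rd(sigma2, D):
    """R(D) = max(0, 1/2 log2(sigma^2 / D)) bits/sample, Gaussian source."""
    if D >= sigma2:
        return 0.0
    return 0.5 * math.log2(sigma2 / D)

def gaussian_dr(sigma2, R):
    """Inverse form: achievable distortion D(R) = sigma^2 * 2^(-2R)."""
    return sigma2 * 2 ** (-2 * R)
```

Each extra bit per sample cuts the achievable distortion by a factor of 4; at D ≥ σ² no bits are needed (just reproduce the mean).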

18 Rate Distortion and Data Fusion
We can identify resource use (energy/number of bits transmitted) with rate, and decision reliability (false alarm rate, missed detection probability) with distortion.
Operate at different points on the rate distortion curve depending on the values of the cost function.
The location of the fusion center, numerical resolution, number of sensors, length of records, routing, and distribution of processing all affect R(D).

19 A Simple Algorithm
Nodes are activated to send requests for information from other nodes based on SNR:
– If above a threshold T, the decision is reliable; suppress activity by neighbors.
– Otherwise, increase the likelihood of requesting help based on proximity to T.
In likelihood, higher-SNR nodes form the cluster.
Bits of resolution are related to SNR (e.g., for use in maximal ratio combining).
[Diagram: node 1 (high SNR) initiates; node 2 is activated and requests further information; nodes labeled 3 have SNR too low to respond]

20 The n-helper Gaussian Scenario
Multiple sensors observe an event and generate correlated Gaussian data. One node (X) is the main data source (e.g., closest to the phenomenon), and the n additional nodes (Y_1 … Y_n) are the 'helpers'.
The problem: what codes and data rates allow the gateway/data-fusion center to reproduce the data from the main node, using the remaining nodes as sources of partial side information, subject to some distortion criterion?
[Diagram: helpers Y_1 … Y_n and main source X feeding the gateway/fusion center]

21 Main Result
We do not care about reproduction of the Y variables; rather, they act as helpers in reproducing X.
This problem was previously solved for the 2-node case.
Our solution: for an admissible rate tuple (R_X, R_1, …, R_n), and for some D_i > 0, the n-helper system data rates can be fused to yield an effective data rate (with respect to source X) satisfying a rate-distortion bound in which σ² is the variance and ρ the correlation.

22 Comments
Other source distributions are analytically difficult, but many are likely to yield convex optimizations.
A generalization would consider instances of relay/broadcast channels in conveying information to the fusion center with minimum energy.
Sensor network detection problems are inherently local: even though the expression may be complicated, the number of helpers will usually be small due to the decay of signals as a power of distance.

23 Problem Definition of Cooperative Communication
Many low-power, low-cost wireless sensors cooperate with each other to achieve more reliable and higher-rate communications.
The dominant constraint is peak power; bandwidth is not the main concern.
Multiplexing (FDMA, TDMA, CDMA, OFDM) is the standard approach, in which each sensor has a unique channel.
We focus on schemes where multiple sensors occupy the same channel.

24 Example: Space-Time Coding
N transmit antennas and N receive antennas.
The channel transition matrix has independent Rayleigh (complex Gaussian) fading in each component.
With properly designed codes, capacity is N times that of a single Rayleigh channel.
Note this implicitly assumes synchronization among Tx and Rx array elements, which requires special effort in sensor networks.
This is a coordinated transmission, not a multiple access situation.
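The linear capacity scaling claimed on this slide follows from the MIMO capacity expression C = log2 det(I + (SNR/Nt) H Hᵀ); a minimal sketch for real-valued channel matrices (the pure-Python determinant is only meant for small examples):

```python
import math

def det(M):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += ((-1) ** j) * M[0][j] * det(minor)
    return total

def mimo_capacity(H, snr):
    """C = log2 det(I + (snr/Nt) H H^T) bits/use, real channel matrix H."""
    nr, nt = len(H), len(H[0])
    HHt = [[sum(H[i][k] * H[j][k] for k in range(nt)) for j in range(nr)]
           for i in range(nr)]
    M = [[(1.0 if i == j else 0.0) + (snr / nt) * HHt[i][j]
          for j in range(nr)] for i in range(nr)]
    return math.log2(det(M))
```

With a well-conditioned 2x2 channel (e.g. the identity) the capacity is exactly twice that of a 1x1 channel at the same per-stream SNR, illustrating the N-fold scaling.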

25 Context
The cooperative reception problem is very similar to the multi-node fusion problem; the same initiation procedure is required to create the cluster, but here we can choose the channel code.
Cooperative transmission and reception are similar to multi-target multi-node fusion, but more can be done: beacons, space-time coding.
Use it to overcome gaps in the network, or to communicate with devices outside of the sensor network (e.g., a UAV).

26 Channel Capacity
Channel state information: known at the transmitter side, and known at both sides.
If channel state information is known at the transmit side, RF synchronization can be achieved.
Channels: AWGN and fading channels with unequal path loss.
[General capacity formula shown as an equation on the slide]

27 Channel Capacity (cont'd)
[Capacity formulas shown as equations on the slide for: receive diversity, transmit diversity, combined transmit-receive diversity, and RF synchronization]
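The slide's equations did not survive the transcript, but the standard coherent-combining capacities convey the idea; a sketch assuming unit-gain AWGN branches and full channel state information (these are textbook expressions and may not match the slide's exact formulas):

```python
import math

def rx_diversity_capacity(m, snr):
    """Coherent receive combining (MRC) over m branches: log2(1 + m*snr)."""
    return math.log2(1 + m * snr)

def tx_diversity_capacity(m, snr):
    """Transmit beamforming with full CSI, total power fixed: same m-fold
    array gain as receive combining under these assumptions."""
    return math.log2(1 + m * snr)

def combined_capacity(mt, mr, snr):
    """Phase-synchronized transmit and receive clusters: gain mt*mr."""
    return math.log2(1 + mt * mr * snr)
```

The multiplicative mt*mr array gain in the combined case is why phase synchronization across clusters matters so much in the comments that follow.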

28 Comments
Capacity is much higher if phase synchronization within the transmitter and receiver clusters can be achieved.
We have investigated practical methods for satellite/ground sensor synchronization.
Beacons (e.g., GPS) can greatly simplify the synchronization problem for ground-to-ground cooperative communications.
Recent network capacity results do not take into account the possibilities for cooperation by nodes as transmitter/receiver clusters.

29 Implications of Signal Locality
Signals decay severely with distance (second to fourth power), so mutual information to a source is dominated by a small set of nodes, and cooperative communication clusters for ground-to-ground transmission will likely be small.
Implications:
– Local processing is nearly optimal; we do not need to convey raw data over long distances very frequently.
– Consequently, the lowest layers of processing/network formation are the most important, since they are the most frequently invoked ("typical").
– Practical example: specialized local transmission schemes (e.g., for forming ad hoc clusters), while long range might use conventional methods such as TCP/IP.

30 Hierarchy
For dealing with the network as a whole, the number of topology variations is immense.
– Distributed algorithms exploit the locality of events.
– Ensembles can be used for deriving bounds.
– In between, consider layers of hierarchy, each of which may be amenable to a conventional optimization technique.

31 Information Processing Hierarchy
[Diagram: a low-power sensor tier (high false alarm rate, high duty cycle) performs beamforming, transmits decisions, and queries for more information; a cue activates the base station (high power, low false alarm rate, low duty cycle, high-resolution processing), which reports to the human observer]

32 Information Theory Challenges
– Minimal energy to obtain a reliable decision in a distributed network
– Minimal energy to relay a decision across a distributed network (including gaps)
– Minimal (average) delay in conveying information through the network
– Role of hierarchy: how much leads to what kinds of changes in information-theoretically optimal behavior?
– At small scale we can use brute force, at large scale ensembles; what can we do in between?
– Limits of distributed vs. centralized approaches

33 References
T. Cover and J. Thomas, Elements of Information Theory. Wiley, 1991.
G. Pottie and W. Kaiser, "Wireless Integrated Network Sensors," Commun. ACM, May 2000.
M. Ahmed, Y.-S. Tu, and G. Pottie, "Cooperative detection and communication in wireless sensor networks," 38th Allerton Conf. on Comm., Control and Computing, Oct. 2000.

