1 Communication Under Normed Uncertainties (Robust Capacity of White Gaussian Noise Channels with Uncertainty). S. Z. Denic, School of Information Technology and Engineering, University of Ottawa, Ottawa, Canada; C. D. Charalambous, Department of Electrical and Computer Engineering, University of Cyprus, Nicosia, Cyprus; S. M. Djouadi, Department of Electrical and Computer Engineering, University of Tennessee, Knoxville, USA. December 9, 2004.

2 Overview: Importance of Uncertainty in Communications; Shannon's Definition of Capacity; Review of Maximin Capacity; Paper Contributions

3 Overview (continued): Main Results; Examples; Conclusion and Future Work

4 Importance of Communication Subject to Uncertainties
 Channel measurement errors
 Network operating conditions
 Channel modeling
 Communication in the presence of jamming
 Sensor networks
 Teleoperations

5 Shannon's Definition of Capacity

6 Shannon's Definition of Capacity. Model of the communication system (block diagram: Source, Encoder, Channel with additive noise, Decoder, Sink).

7 Shannon's Definition of Capacity. A rate R is achievable if there exists a sequence of (M = 2^nR, n) codes such that the probability of error tends to zero as n tends to infinity. The operational capacity is the supremum of all achievable rates.

8 Shannon's Definition of Capacity. Discrete memoryless channel: the channel capacity depends on the channel transition matrix Q(y|x), which is assumed known.
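The capacity expression itself does not survive in the transcript; the standard formula for a DMC with known transition matrix Q(y|x) is

    C = \max_{p(x)} I(X;Y) = \max_{p(x)} \sum_{x,y} p(x)\,Q(y|x)\,\log \frac{Q(y|x)}{\sum_{x'} p(x')\,Q(y|x')}

so the capacity is a functional of Q alone; the next slides ask what happens when Q is not known exactly.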

9 Shannon's Definition of Capacity. What if Q(y|x) is unknown? Example: a compound BSC whose crossover probability is only known to lie in an interval (figure: BSC transition diagram with inputs and outputs 0 and 1, crossover probability and its complement). What is the channel capacity?
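The answer is not reproduced on the slide; assuming the crossover probability is only known to lie in an interval [\delta_1, \delta_2] with \delta_2 \le 1/2, the classical compound-channel result gives

    C = 1 - h(\delta_2), \qquad h(\delta) = -\delta \log_2 \delta - (1-\delta)\log_2(1-\delta),

i.e. the code must be designed for the worst crossover probability in the interval; the uniform input distribution is simultaneously optimal for every channel in the family.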

10 Shannon's Definition of Capacity. Additive Gaussian channels: random variable case.
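The slide's formula is not in the transcript; for the scalar model y = x + n with n ~ N(0, \sigma^2) and power constraint E[x^2] \le P, the familiar result is

    C = \tfrac{1}{2}\log_2\!\left(1 + \frac{P}{\sigma^2}\right) \ \text{bits per channel use.}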

11 Shannon's Definition of Capacity. What is the capacity if the noise is unknown? Gaussian AVC: the statistics of the noise are unknown.

12 Shannon's Definition of Capacity. Additive Gaussian channels: random process case (block diagram: y = x + n).

13 Shannon's Definition of Capacity. Derivation for the random process case.

14 Shannon's Definition of Capacity. Capacity of the continuous-time additive Gaussian channel.
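The capacity formula on the slide is not reproduced here; a standard statement, consistent with the water-filling slide that follows (written with one-sided PSDs, which is an assumption about the convention used), is

    C = \sup_{S_x \ge 0,\ \int S_x(f)\,df \le P} \int \log_2\!\left(1 + \frac{|H(f)|^2 S_x(f)}{S_n(f)}\right) df \ \ \text{bits/s},

where H(f) is the channel frequency response, S_n(f) the noise PSD, S_x(f) the transmit PSD, and P the power budget.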

15 Shannon's Definition of Capacity. Range of integration. What is the capacity if the frequency response of the channel H, or the PSD of the noise S_n, belongs to a given set?

16 Shannon's Definition of Capacity. Water-filling.
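The water-filling figure does not survive in the transcript. The optimizing transmit PSD in the expression above has the standard form S_x*(f) = max(0, \nu - S_n(f)/|H(f)|^2), with the water level \nu set by the power budget. The sketch below is illustrative only: the frequency grid, the noise/gain profile, and the power budget are hypothetical and not taken from the talk.

```python
import numpy as np

def water_filling(noise_over_gain, total_power, df, tol=1e-9):
    """Water-filling over discretized frequency bins of width df.

    noise_over_gain[k] = S_n(f_k) / |H(f_k)|^2, the effective noise level in bin k.
    Returns S_x(f_k) = max(nu - noise_over_gain[k], 0), with the water level nu
    chosen so that sum_k S_x(f_k) * df equals total_power.
    """
    lo, hi = 0.0, float(np.max(noise_over_gain)) + total_power / df
    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise_over_gain, 0.0).sum() * df > total_power:
            hi = nu   # too much power allocated: lower the water level
        else:
            lo = nu   # budget not used up: raise the water level
    return np.maximum(0.5 * (lo + hi) - noise_over_gain, 0.0)

# Hypothetical example: channel gain rolls off with frequency, flat noise floor.
f = np.linspace(0.0, 200.0, 512)                      # Hz (one-sided)
df = f[1] - f[0]
noise_over_gain = 1e-8 * (1.0 + (f / 80.0) ** 4)      # S_n(f) / |H(f)|^2
Sx = water_filling(noise_over_gain, total_power=1e-2, df=df)
C = np.sum(np.log2(1.0 + Sx / noise_over_gain)) * df  # bits/s under these assumptions
print(f"capacity of the hypothetical channel: {C:.1f} bit/s")
```

Bisection on the water level works because the total allocated power is monotonically increasing in \nu.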

17 Review of Maximin Capacity

18 Review of Minimax Capacity. Example: compound DMC. This result is due to Blackwell et al. [6]; see also Csiszar [8] and Wolfowitz [21]. Blachman [5] and Dobrushin [12] were the first to apply a game-theoretic approach to computing channel capacity, with mutual information as the pay-off function, for discrete channels.
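The compound-channel capacity formula the slide alludes to (its equation is not in the transcript) is, in the standard form,

    C = \max_{p(x)} \ \inf_{\theta \in \Theta} I(p, Q_\theta),

where \Theta indexes the family of admissible transition matrices: a single input distribution must be chosen that performs acceptably against every channel in the family.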

19 Review of Minimax Capacity. Does a saddle point exist? For further references see Lapidoth and Narayan [18].

20 Review of Minimax Capacity. Gaussian AVC (GAVC) channels
 Hughes and Narayan [16] determined the ε-capacity of the discrete-time GAVC under average and peak power constraints on the channel and on the transmitted sequence, for random codes
 Hughes and Narayan [17] determined the capacity of the vector discrete-time GAVC together with a water-filling equation, and proved the saddle point
 Csiszar and Narayan [9] determined the capacity of the GAVC for deterministic codes
 Ahlswede [1] computed the channel capacity of the GAVC when the noise variance varies but does not exceed a certain bound

21 Review of Minimax Capacity. Gaussian AVC (GAVC) channels
 McEliece [22] was the first to apply a game-theoretic approach to continuous channels with mutual information as the pay-off function
 Basar and Wu [3] considered jamming of Gaussian channels with a mean-square error pay-off function
 Diggavi and Cover [23] used the game-theoretic approach to find the worst-case noise under a covariance matrix constraint on the noise
 Baker [2] computed the channel capacity of M-dimensional continuous-time Gaussian channels with energy constraints on the transmitted signal and on the jammer
 Root and Varaiya [20] considered the capacity of the white-noise Gaussian continuous-time compound channel

22 Review of Minimax Capacity. Gaussian AVC (GAVC) channels
 Vishwanath et al. [24] computed the worst-case capacity of Gaussian MIMO channels by applying a duality principle

23 Paper Contributions

24 Paper Contributions
 Modeling of uncertainties in the normed linear spaces H∞ and L1
 Explicit channel capacity formulas for SISO communication channels that depend on the sizes of the uncertainty sets, for an uncertain channel, uncertain noise, and jointly uncertain channel and noise
 Explicit water-filling formulas describing the optimal transmitted power for each derived capacity formula, as a function of the size of the uncertainty sets

25 Paper Contributions
 Our computation does not require a saddle point, because it relies on the work of Root and Varaiya [20] and Gallager [15]
 Our approach handles two types of uncertainty at the same time (uncertainty in the frequency response of the channel and in the noise), which, to the best of our knowledge, had not been done before

26 Paper Contributions
 Our approach solves the jamming problem for continuous-time channels and gives the optimal transmitter and jammer strategies in terms of their optimal PSDs; we show that the optimal PSD of the signal is proportional to the optimal PSD of the noise
 We also show the existence of a saddle point

27 Thesis Contributions
 The optimal encoding and decoding techniques for a non-stationary Gaussian source with feedback over a wireless flat-fading channel are derived
 It is shown that there is an analogy between the optimal encoding/decoding problem in communications and the optimal tracking problem in control
 The condition for perfect tracking over a wireless channel is derived

28 Main Results

29 Communication system model (block diagram: y = x + n).

30 Communication system model. Both the channel and the noise could be unknown. An unknown transfer function can be modeled using additive uncertainty; other models, such as multiplicative uncertainty, are also possible.
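The modeling equations are not reproduced on the slide; the standard H∞ descriptions, consistent with the frequency-domain ball described on slide 33, are (with H_nom the nominal frequency response, W_1, W_2 known weighting functions, and \Delta an unknown perturbation; these symbol names are assumptions, not taken from the slides):

    \text{additive:} \quad H(f) = H_{\mathrm{nom}}(f) + \Delta(f)\,W_1(f), \qquad \|\Delta\|_\infty \le 1,
    \text{multiplicative:} \quad H(f) = H_{\mathrm{nom}}(f)\,\bigl(1 + \Delta(f)\,W_2(f)\bigr), \qquad \|\Delta\|_\infty \le 1.

In the additive case the uncertainty set is exactly the ball \{H : |H(f) - H_{\mathrm{nom}}(f)| \le |W_1(f)| \text{ for all } f\}.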

31 Communication system model. Uncertainty models: additive and multiplicative (block diagrams of the two configurations).

32 Communication system model. Example (figure: uncertainty region in the complex plane, Re and Im axes, with radii scaled by the factors (1 - ·) and (1 + ·)).

33 Communication system model. The uncertainty set is described by a ball in the frequency domain, centered at the nominal frequency response and with a radius given by the uncertainty weighting.

34 Channel capacity with uncertainty. Define four sets.

35 Channel capacity with uncertainty. The overall PSD of the noise is obtained by filtering, and the noise uncertainty is modeled either through uncertainty in that filter or directly by the set A_4.

36 Channel capacity with uncertainty. The mutual information rate is the pay-off function.
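Neither the pay-off nor the resulting optimization is written out in the transcript; a formulation consistent with the slides (the normalization is an assumption) is

    I(S_x; H, S_n) = \int \log\!\left(1 + \frac{|H(f)|^2 S_x(f)}{S_n(f)}\right) df,
    \qquad C = \sup_{S_x} \ \inf_{(H, S_n)} \ I(S_x; H, S_n),

where the supremum is over transmit PSDs meeting the power budget and the infimum is over the relevant uncertainty set (noise only, channel only, or both).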

37 Channel capacity with uncertainty I. Three problems can be defined:  noise uncertainty  channel uncertainty

38 Channel capacity with uncertainty I.  channel-noise uncertainty

39 Channel capacity with uncertainty I. Channel capacity with channel-noise uncertainty. Theorem 1. Assume that … is bounded and integrable.

40 Channel capacity with uncertainty I. The channel capacity is given parametrically.

41 Channel capacity with uncertainty I. Such that …, where * is related to the Lagrange multiplier and is obtained from the constraint equation. The infimum over the noise uncertainty is achieved at …

42 Channel capacity with uncertainty I. The mutual information rate after the minimization is given as …

43 Channel capacity with uncertainty I. The infimum over the channel uncertainty is achieved at …

44 Channel capacity with uncertainty I. The mutual information rate after the second minimization is given as …

45 Channel capacity with uncertainty I. The maximization gives the water-filling equation.

46 Channel capacity with uncertainty I. Water-filling.
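The equations on slides 39 through 46 are not reproduced in the transcript. A hedged reconstruction of their structure (worst-case noise first, then worst-case channel, then water-filling against the degraded pair) follows from the additive uncertainty balls of the modeling slides: pointwise in frequency, the infimizing noise plausibly sits at the outer edge of its ball and the infimizing channel at the inner edge,

    S_n^{\mathrm{wc}}(f) = \bigl(|G_{\mathrm{nom}}(f)| + |W_2(f)|\bigr)^2, \qquad
    |H^{\mathrm{wc}}(f)| = \max\bigl(|H_{\mathrm{nom}}(f)| - |W_1(f)|,\ 0\bigr),

where G_nom denotes the nominal noise-shaping filter and W_1, W_2 the channel and noise uncertainty weights (these names are assumptions, not taken from the slides). Ordinary water-filling against this degraded pair then gives the robust allocation

    S_x^{*}(f) = \max\!\left(0,\ \nu - \frac{S_n^{\mathrm{wc}}(f)}{|H^{\mathrm{wc}}(f)|^2}\right), \qquad \int S_x^{*}(f)\,df = P,

so the capacity shrinks as the uncertainty radii |W_1|, |W_2| grow, which is the qualitative message of the examples later in the talk.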

47 Channel capacity with uncertainty II. Jamming:  noise uncertainty  channel-noise uncertainty

48 Channel capacity with uncertainty II. The capacity for the noise-uncertainty case is obtained for … Theorem 2. Assume that … is bounded and integrable. Define the sets …

49 Channel capacity with uncertainty II. The lower value C- of the pay-off function is defined as … and is given by Theorem 2. The upper value C+ is defined by …

50 Channel capacity with uncertainty II. The channel capacity is given as …, where … are Lagrange multipliers.

51 Channel capacity with uncertainty II. The lower value C- is equal to the upper value C+, implying that a saddle point exists. The optimal PSD of the noise is proportional to the optimal PSD of the transmitter, which can be interpreted as the two players in the game trying to match each other.
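A short consequence of the proportionality claim (a hedged sketch; the constant and the band are not taken from the slides): if the jammer's optimal PSD satisfies S_J*(f) = c · S_x*(f) on the band where the transmitter puts power, and both the transmit budget P and the jamming budget N are spent entirely on that band while any fixed background noise is ignored, then

    \frac{S_x^{*}(f)}{S_J^{*}(f)} = \frac{1}{c} = \frac{P}{N} \quad \text{on the transmission band},

so the signal-to-jamming ratio, and hence the pay-off integrand, is flat across the band, consistent with the interpretation that the two players try to match each other.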

52 Channel coding theorem. Define the frequency response of the equivalent channel with impulse response …, and ten sets.

53 Channel coding theorem

54 Channel coding theorem. Assumptions: 1) … has finite duration; 2) …; 3) the sets K_i are conditionally compact.

55 Channel coding theorem. A positive number R_i is called an attainable rate for the set of channels K_i if there exists a sequence of codes such that, as n tends to infinity, the probability of error tends to zero uniformly over the set K_i. Theorem 1. The operational capacities C_i (the suprema of all attainable rates R_i) for the sets of communication channels with uncertainties K_i are given by the corresponding computed capacity formulas. Proof: follows from [15] and [20] (see [11]).

56 Wireless channel with feedback. SNR and mutual information.

57 Wireless channel with feedback. SNR and mutual information.

58 Wireless channel with feedback. Wireless system model.

59 Wireless channel with feedback. Optimal encoding and decoding.
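For background (this is the classical baseline, not the slide's result): over an AWGN channel with noiseless feedback, the optimal scheme for conveying a single Gaussian sample of variance \sigma^2 is linear; at each use the transmitter sends the scaled estimation error of the receiver, and the mean-square error contracts by a factor 1/(1 + \mathrm{SNR}) per use,

    D_n = \sigma^2 \left(1 + \frac{P}{\sigma_n^2}\right)^{-n}.

The slides extend this idea to a non-stationary Gaussian source over a flat-fading wireless channel.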

60 Wireless channel with feedback. Analogy between optimal encoding and tracking (block diagram: y = x + n, with the decoder output fed back and subtracted).

61 Wireless channel with feedback. Necessary condition for zero asymptotic error.
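The condition itself is not reproduced in the transcript; in the related literature on tracking and control over communication channels, conditions of this type require the channel to supply information at least as fast as the source generates it. For a scalar source with an unstable pole a (|a| > 1), a typical necessary condition reads

    C \ \ge\ \log_2 |a| \quad \text{bits per source sample},

offered here only as context (the data-rate-theorem flavor of the statement), not as the slide's own formula.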

62 Example 1. Uncertain channel, white noise.  Transfer function

63 Example 1. Channel capacity.

64 Example 1. Parameters:  P = 10^-2 W, N_0 = 10^-8 W/Hz, … = 1000 rad/s.

65 Example 2. Uncertain noise.  Transfer function  Noise uncertainty description

66 Example 2

67 Example 2

68 Example 3. Uncertain channel, uncertain noise. The damping ratio is uncertain. The noise uncertainty is modelled as in Example 2.

69 Example 3

70 Example 3

71 Future work. Currently generalizing these results to uncertain MIMO channels.

72 References
[1] Ahlswede, R., "The capacity of a channel with arbitrary varying Gaussian channel probability functions", Trans. 6th Prague Conf. on Information Theory, Statistical Decision Functions, and Random Processes, pp. 13-31, Sept. 1971.
[2] Baker, C. R., Chao, I.-F., "Information capacity of channels with partially unknown noise. I. Finite dimensional channels", SIAM J. Appl. Math., vol. 56, no. 3, pp. 946-963, June 1996.

73 References
[4] Biglieri, E., Proakis, J., Shamai, S., "Fading channels: information-theoretic and communications aspects", IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2619-2692, October 1998.
[5] Blachman, N. M., "Communication as a game", IRE Wescon 1957 Conference Record, vol. 2, pp. 61-66, 1957.
[6] Blackwell, D., Breiman, L., Thomasian, A. J., "The capacity of a class of channels", Ann. Math. Stat., vol. 30, pp. 1229-1241, 1959.
[7] Charalambous, C. D., Denic, S. Z., Djouadi, S. M., "Robust Capacity of White Gaussian Noise Channels with Uncertainty", accepted for the 43rd IEEE Conference on Decision and Control.

74 References
[8] Csiszar, I., Korner, J., Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic Press, 1981.
[9] Csiszar, I., Narayan, P., "Capacity of the Gaussian arbitrarily varying channel", IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 18-26, Jan. 1991.
[10] Denic, S. Z., Charalambous, C. D., Djouadi, S. M., "Capacity of Gaussian channels with noise uncertainty", Proceedings of IEEE CCECE 2004, Canada.
[11] Denic, S. Z., Charalambous, C. D., Djouadi, S. M., "Robust capacity for additive colored Gaussian uncertain channels", preprint.

75 References
[12] Dobrushin, R. L., "Optimal information transmission through a channel with unknown parameters", Radiotekhnika i Electronika, vol. 4, pp. 1951-1956, 1959.
[13] Doyle, J. C., Francis, B. A., Tannenbaum, A. R., Feedback Control Theory, New York: Macmillan Publishing Company, 1992.
[14] Forys, L. J., Varaiya, P. P., "The ε-capacity of classes of unknown channels", Information and Control, vol. 44, pp. 376-406, 1969.
[15] Gallager, R. G., Information Theory and Reliable Communication. New York: Wiley, 1968.

76 References
[16] Hughes, B., Narayan, P., "Gaussian arbitrarily varying channels", IEEE Transactions on Information Theory, vol. 33, no. 2, pp. 267-284, Mar. 1987.
[17] Hughes, B., Narayan, P., "The capacity of a vector Gaussian arbitrarily varying channel", IEEE Transactions on Information Theory, vol. 34, no. 5, pp. 995-1003, Sep. 1988.
[18] Lapidoth, A., Narayan, P., "Reliable communication under channel uncertainty", IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2148-2177, October 1998.
[19] Medard, M., "Channel uncertainty in communications", IEEE Information Theory Society Newsletter, vol. 53, no. 2, p. 1, pp. 10-12, June 2003.

77 References
[20] Root, W. L., Varaiya, P. P., "Capacity of classes of Gaussian channels", SIAM J. Appl. Math., vol. 16, no. 6, pp. 1350-1353, November 1968.
[21] Wolfowitz, J., Coding Theorems of Information Theory, Springer-Verlag, Berlin Heidelberg, 1978.
[22] McEliece, R. J., "Communication in the presence of jamming: an information theoretic approach", in Secure Digital Communications, G. Longo, ed., Springer-Verlag, New York, 1983, pp. 127-166.
[23] Diggavi, S. N., Cover, T. M., "The worst additive noise under a covariance constraint", IEEE Transactions on Information Theory, vol. 47, no. 7, pp. 3072-3081, November 2001.

78 References
[24] Vishwanath, S., Boyd, S., Goldsmith, A., "Worst-case capacity of Gaussian vector channels", Proceedings of the 2003 Canadian Workshop on Information Theory.
[25] Shannon, C. E., "A mathematical theory of communication", Bell Syst. Tech. J., vol. 27, pp. 379-423 and 623-656, July and October 1948.

