Slide 1 (CWIT 2005): Robust Entropy Rate for Uncertain Sources: Applications to Communication and Control Systems

Charalambos D. Charalambous, Dept. of Electrical and Computer Engineering, University of Cyprus, Cyprus; also with the School of Information Technology and Engineering, University of Ottawa, Canada. Alireza Farhadi, School of Information Technology and Engineering, University of Ottawa, Canada.

Slide 2 (CWIT 2005): Overview

- Robust entropy: solution and relations to Rényi entropy
- Examples: uncertainty in distribution; uncertainty in frequency response
- Stabilization of communication-control systems

Slide 3 (CWIT 2005): Applications

Slide 4 (CWIT 2005): Applications: Robust Lossless Coding Theorem [1]

Source words of block length $n$ produced by a discrete uncertain memoryless source can be encoded into code words of block length $k$ from a coding alphabet of size $D$ with arbitrarily small probability of error provided the code rate $\frac{k}{n}\log D$ exceeds the Shannon entropy of the worst-case source in the uncertainty set, i.e. the robust entropy.

(Block diagram: uncertain source, encoder, decoder, destination; uniform source, encoder.)

Slide 5 (CWIT 2005): Applications: Observability and Stabilizability of Networked Control Systems [2]

Definition. The uncertain source is uniformly asymptotically observable in probability if there exist an encoder and decoder such that

\[ \limsup_{n\to\infty}\ \sup_{g_n\in A_n}\ \Pr\{\|X^n-\hat X^n\|>\delta\}\ \le\ \epsilon, \]

where $g_n$ is the joint density of $X^n=(X_1,\dots,X_n)$, $A_n$ is the source density uncertainty set, and $\delta>0$ and $\epsilon\ge 0$ are fixed.

Slide 6 (CWIT 2005): Applications: Observability and Stabilizability of Networked Control Systems [2]

Definition. The uncertain controlled source is uniformly asymptotically stabilizable in probability if there exist an encoder, decoder and controller such that the controlled process satisfies the same uniform error-probability bound.

For $\delta$ and $\epsilon$ small enough, a necessary condition for uniform observability and stabilizability in probability is that the channel capacity exceed the robust entropy rate of the uncertain source (slide 29).

Slide 7 (CWIT 2005): Problem Formulation, Solution and Relations

Slide 8 (CWIT 2005): Problem Formulation

- Let $\mathcal{D}$ be the space of density functions defined on $\mathbb{R}^d$, let $g\in\mathcal{D}$ be the true unknown source density, belonging to the uncertainty set $A\subseteq\mathcal{D}$, and let $f\in\mathcal{D}$ be the fixed nominal source density.
- Definition. The robust entropy of an observed process having density function $g\in A$ is defined by

  \[ H^*(A)\ \triangleq\ \sup_{g\in A} H_S(g), \]

  where $H_S(g)\triangleq -\int g(x)\ln g(x)\,dx$ is the Shannon entropy.

Slide 9 (CWIT 2005): Problem Formulation

Definition. In the previous definition, if $X^n=(X_1,\dots,X_n)$ represents a sequence of random variables of length $n$ of source symbols produced by the uncertain source with joint density $g_n\in A_n$, the robust entropy rate is defined by

\[ H_r^*(A)\ \triangleq\ \lim_{n\to\infty}\frac{1}{n}\sup_{g_n\in A_n} H_S(g_n), \]

provided the limit exists.

Slide 10 (CWIT 2005): Solution to the Robust Entropy

- Let the uncertainty set be defined by

  \[ A \triangleq \{\, g\in\mathcal{D} : D(g\|f)\le R \,\}, \]

  where $D(g\|f)\triangleq\int g\ln\frac{g}{f}\,dx$ is the relative entropy and $R\ge 0$ is fixed.
- Lemma. Given a fixed nominal density $f$ and the uncertainty set defined above, the solution to the robust entropy is given by

  \[ \sup_{g\in A} H_S(g)\ =\ \min_{s\ge 0}\Big\{\, sR + (1+s)\ln\int f(x)^{\frac{s}{1+s}}\,dx \Big\}, \]

  where the minimizing $s^*$ in the above is the unique solution of $D(g_{s}\|f)=R$, the extremal density being $g_s \propto f^{s/(1+s)}$.
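The discrete analogue of the lemma can be explored numerically. Below is a minimal Python sketch, assuming the dual form reconstructed above, $\min_{s\ge 0}\{sR + (1+s)\ln\sum_i f_i^{s/(1+s)}\}$; the function name `robust_entropy` and the example PMF are illustrative, not from [1].

```python
import numpy as np
from scipy.optimize import minimize_scalar

def robust_entropy(f, R):
    """Robust entropy sup{H(g) : D(g||f) <= R} for a nominal PMF f,
    computed via the dual  min_{s>=0} [ sR + (1+s) ln sum_i f_i^(s/(1+s)) ]."""
    f = np.asarray(f, dtype=float)
    def dual(s):
        return s * R + (1.0 + s) * np.log(np.sum(f ** (s / (1.0 + s))))
    # dual(s) -> ln(support size) as s -> 0 and -> sR + H(f) as s -> infinity,
    # so a bounded scalar search over a wide interval suffices here.
    res = minimize_scalar(dual, bounds=(1e-6, 1e6), method="bounded")
    return res.fun

f = np.array([0.7, 0.2, 0.1])      # illustrative nominal PMF, H(f) ~ 0.802 nats
print(robust_entropy(f, R=0.05))   # slightly above H(f); grows with R
```

As $R\to 0$ the value approaches the nominal Shannon entropy; as $R$ grows it climbs toward $\ln M$, the entropy of the uniform PMF on the same alphabet.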

Slide 11 (CWIT 2005): Solution to the Robust Entropy

Corollary [1]. When $g$ and consequently $f$ correspond to PMFs on a finite alphabet of size $M$, the previous solution reduces to

\[ \sup_{g\in A} H_S(g)\ =\ \min_{s\ge 0}\Big\{\, sR + (1+s)\ln\sum_{i=1}^{M} f_i^{\frac{s}{1+s}} \Big\}. \]

In particular, if $f$ is the uniform PMF, $f_i = 1/M$, then $(1+s)\ln\sum_i f_i^{s/(1+s)} = \ln M$ for every $s$, and the robust entropy equals $\ln M$.
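As a quick usage check of the sketch above: for a uniform nominal PMF the dual objective is flat, $sR+\ln M$, so the computed value should stay at $\ln M$ for any radius $R$.

```python
import numpy as np
# Reuses robust_entropy() from the sketch after slide 10.
M = 4
f_unif = np.full(M, 1.0 / M)
for R in (0.0, 0.1, 1.0):
    print(R, robust_entropy(f_unif, R), np.log(M))   # value matches ln(M) for every R
```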

Slide 12 (CWIT 2005): Solution to the Robust Entropy Rate

Corollary. Let $X^n$ be a sequence of length $n$ of source symbols with uncertain joint density function $g_n$, and exchange $R$ with $nR$, i.e. take $A_n \triangleq \{ g_n : D(g_n\|f_n)\le nR \}$. Then the robust entropy rate is given by

\[ H_r^*(A)\ =\ \lim_{n\to\infty}\frac{1}{n}\min_{s\ge 0}\Big\{\, snR + (1+s)\ln\int f_n(x)^{\frac{s}{1+s}}\,dx \Big\}, \]

where the minimizing $s^*$ in the above is the unique solution of $\lim_{n\to\infty}\frac{1}{n}D(g_{n,s}\|f_n)=R$.

Slide 13 (CWIT 2005): Robust Entropy Rate of Uncertain Sources with Nominal Gaussian Source Density

Example. From the previous result it follows that if the nominal source density $f$ is a $d$-dimensional Gaussian density function with mean $\mu$ and covariance $\Sigma > 0$, then

\[ H^*(A)\ =\ s^*R + \tfrac{1}{2}\ln\big((2\pi)^d|\Sigma|\big) + \tfrac{d}{2}(1+s^*)\ln\tfrac{1+s^*}{s^*}, \]

where $s^* > 0$ is the unique solution of the following nonlinear equation:

\[ \frac{d}{2}\Big(\frac{1}{s}-\ln\frac{1+s}{s}\Big)\ =\ R. \]
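A minimal numerical sketch of this example, assuming the dual form and the stationarity condition reconstructed above; the helper name `robust_entropy_gaussian` and the test values are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def robust_entropy_gaussian(Sigma, R):
    """Robust entropy for a d-dimensional Gaussian nominal N(mu, Sigma):
    min over s > 0 of  sR + 0.5 ln((2 pi)^d |Sigma|) + (d/2)(1+s) ln((1+s)/s),
    where the minimizing s solves (d/2)(1/s - ln(1 + 1/s)) = R."""
    Sigma = np.atleast_2d(Sigma)
    d = Sigma.shape[0]
    g = lambda s: 0.5 * d * (1.0 / s - np.log1p(1.0 / s)) - R
    s = brentq(g, 1e-8, 1e8)   # g is strictly decreasing in s, so the root is unique
    H = (s * R
         + 0.5 * np.log((2 * np.pi) ** d * np.linalg.det(Sigma))
         + 0.5 * d * (1 + s) * np.log((1 + s) / s))
    return H, s

H, s = robust_entropy_gaussian(np.array([[1.0]]), R=0.1)
print(H, s)   # H exceeds the nominal entropy 0.5 ln(2 pi e) ~ 1.419 nats
```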

Slide 14 (CWIT 2005): Relation among Rényi, Tsallis and Shannon Entropies

Definition. For $\alpha>0$ and $\alpha\ne 1$, the Rényi entropy [3] is defined by

\[ H^R_\alpha(g)\ \triangleq\ \frac{1}{1-\alpha}\ln\int g(x)^\alpha\,dx. \]

Moreover, the Tsallis entropy [4] is defined by

\[ H^T_\alpha(g)\ \triangleq\ \frac{1}{\alpha-1}\Big(1-\int g(x)^\alpha\,dx\Big). \]

Both reduce to the Shannon entropy as $\alpha\to 1$, and we have the following relation among the Shannon, Rényi and Tsallis entropies.
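Both definitions are easy to evaluate numerically; the Python sketch below, with an illustrative PMF, checks the standard fact that both entropies approach the Shannon value as $\alpha\to 1$.

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):          # H_alpha^R = ln(sum p_i^alpha) / (1 - alpha)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, alpha):        # H_alpha^T = (1 - sum p_i^alpha) / (alpha - 1)
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

p = np.array([0.5, 0.3, 0.2])
print(shannon(p))                                # ~1.0297 nats
for a in (0.9, 0.99, 1.01, 1.1):
    print(a, renyi(p, a), tsallis(p, a))         # both tend to the Shannon value
```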

Slide 15 (CWIT 2005): Relation between Robust Entropy and Rényi Entropy

The robust entropy found previously is related to the Rényi entropy as follows. Let $\alpha = \frac{s}{1+s}$; then

\[ (1+s)\ln\int f(x)^{\frac{s}{1+s}}\,dx\ =\ H^R_\alpha(f). \]

Moreover,

\[ \sup_{g\in A} H_S(g)\ =\ \min_{s\ge 0}\Big\{\, sR + H^R_{\frac{s}{1+s}}(f)\Big\}. \]
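The substitution behind this relation, written out as a short math block (a reconstruction consistent with the lemma of slide 10, not text taken from the slides):

```latex
% With alpha = s/(1+s), one has 1 - alpha = 1/(1+s), hence
\[
(1+s)\ln\!\int f(x)^{\frac{s}{1+s}}\,dx
   \;=\; \frac{1}{1-\alpha}\,\ln\!\int f(x)^{\alpha}\,dx
   \;=\; H^{R}_{\alpha}(f),
\qquad \alpha=\tfrac{s}{1+s},
\]
% so the dual formula for the robust entropy becomes
\[
\sup_{g:\,D(g\|f)\le R} H_S(g)
   \;=\; \min_{s\ge 0}\Bigl\{\, sR + H^{R}_{\frac{s}{1+s}}(f) \Bigr\}.
\]
```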

Slide 16 (CWIT 2005): Examples

Slide 17 (CWIT 2005): Examples: Partially Observed Gauss-Markov Nominal Source

Assumptions. Let $g_n$ and $f_n$ be the probability density functions (PDFs) corresponding to a sequence of length $n$ of symbols produced by the uncertain and nominal sources, respectively. Let also the uncertain source and nominal source be related by

\[ D(g_n\|f_n)\ \le\ nR. \]

Slide 18 (CWIT 2005): Example: Partially Observed Gauss-Markov Nominal Source

Nominal Source. The nominal density is induced by a partially observed Gauss-Markov nominal source defined by

\[ x_{t+1} = A x_t + B w_t, \qquad y_t = C x_t + D v_t. \]

Assumptions: $x_t$ is the unobserved process and $y_t$ the observed process; $\{w_t\}$ is i.i.d. Gaussian, $\{v_t\}$ is i.i.d. Gaussian, and $x_0$, $\{w_t\}$, $\{v_t\}$ are mutually independent; $(C,A)$ is detectable and $(A,B)$ is stabilizable.

Slide 19 (CWIT 2005): Partially Observed Gauss-Markov Nominal Source: Lemmas

Lemma. Let $\{y_t\}$ be a stationary Gaussian process with power spectral density $\Phi_y(\omega)$, and assume $\int_{-\pi}^{\pi}\ln\Phi_y(\omega)\,d\omega$ exists. Then the Shannon entropy rate of $\{y_t\}$ is

\[ H_S(y)\ =\ \frac{1}{2}\ln(2\pi e) + \frac{1}{4\pi}\int_{-\pi}^{\pi}\ln\Phi_y(\omega)\,d\omega. \]
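This is the classical Kolmogorov-Szegő formula, and it can be checked numerically; the sketch below uses an AR(1) process, for which the log-PSD integral has a closed form.

```python
import numpy as np

# Kolmogorov-Szego: entropy rate = 0.5 ln(2 pi e) + (1/4pi) int ln Phi(w) dw.
# Check on AR(1): y_t = a y_{t-1} + w_t, w_t ~ N(0, sw2), |a| < 1, where
# Phi(w) = sw2 / |1 - a e^{-jw}|^2 and the integral term reduces to 0.5 ln(sw2).
a, sw2 = 0.8, 1.0
w = np.linspace(-np.pi, np.pi, 200001)
Phi = sw2 / np.abs(1.0 - a * np.exp(-1j * w)) ** 2
# (1/4pi) * integral over [-pi, pi] equals half the grid mean of ln Phi
rate_numeric = 0.5 * np.log(2 * np.pi * np.e) + 0.5 * np.mean(np.log(Phi))
rate_closed = 0.5 * np.log(2 * np.pi * np.e * sw2)
print(rate_numeric, rate_closed)   # both ~1.4189 nats per sample
```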

Slide 20 (CWIT 2005): Partially Observed Gauss-Markov Nominal Source: Lemmas

Lemma. For the partially observed Gauss-Markov nominal source, the Shannon entropy rate of the observed process is

\[ H_S(y)\ =\ \frac{1}{2}\ln\big((2\pi e)^{d}\det(CVC^T + DD^T)\big), \]

where $V$ is the unique positive semi-definite solution of the following algebraic Riccati equation:

\[ V = AVA^T - AVC^T\big(CVC^T + DD^T\big)^{-1}CVA^T + BB^T. \]
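A sketch of how this computation might look, assuming the innovations-form entropy rate and the filtering Riccati equation reconstructed above; the system matrices are illustrative, not from the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Partially observed Gauss-Markov source: x_{t+1} = A x_t + B w_t, y_t = C x_t + v_t.
A = np.array([[0.9]])
B = np.array([[1.0]])
C = np.array([[1.0]])
Q = B @ B.T                   # process noise covariance B B^T
Rv = np.array([[0.5]])        # observation noise covariance
# Filtering ARE solved through duality with the control ARE (A -> A^T, B -> C^T).
V = solve_discrete_are(A.T, C.T, Q, Rv)
innov = C @ V @ C.T + Rv      # steady-state innovations covariance
d = C.shape[0]
rate = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(innov))
print(rate)                   # entropy rate of the observed process, in nats
```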

Slide 21 (CWIT 2005): Partially Observed Gauss-Markov Nominal Source: Robust Entropy Rate

Proposition. The robust entropy rate of an uncertain source with corresponding partially observed Gauss-Markov nominal source is

\[ H_r^*(A)\ =\ s^*R + H_S(y) + \frac{d}{2}\Big((1+s^*)\ln\frac{1+s^*}{s^*} - 1\Big), \]

where $H_S(y)$ is the Shannon entropy rate of the nominal source given in the previously mentioned lemma, and $s^*$ is the unique solution of

\[ \frac{d}{2}\Big(\frac{1}{s}-\ln\frac{1+s}{s}\Big)\ =\ R. \]

Slide 22 (CWIT 2005): Partially Observed Gauss-Markov Nominal Source: Remarks

Remark. For the scalar case with $d=1$, after solving the nonlinear equation for $s^*$, we obtain the robust entropy rate in explicit form.

Remark. The case $R=0$ corresponds to $s^*\to\infty$. Letting $R\to 0$, we have $H_r^*(A)\to H_S(y)$, the Shannon entropy rate of the nominal source.
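The $R\to 0$ remark can be seen numerically by reusing the Gaussian sketch from slide 13: as $R\to 0$, $s^*\to\infty$ and the robust value collapses to the nominal Shannon entropy.

```python
import numpy as np
# Reuses robust_entropy_gaussian() from the sketch after slide 13.
for R in (1.0, 0.1, 0.01, 0.001):
    H, s = robust_entropy_gaussian(np.array([[1.0]]), R)
    print(R, s, H, 0.5 * np.log(2 * np.pi * np.e))   # H -> 1.4189 as R -> 0
```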

Slide 23 (CWIT 2005): Example: Partially Observed Controlled Gauss-Markov Nominal Source

The nominal source is defined via a partially observed controlled Gauss-Markov system given by

\[ x_{t+1} = A x_t + B u_t + G w_t, \qquad y_t = C x_t + D v_t. \]

Assumptions. The uncertain source and nominal source are related by $D(g_n\|f_n)\le nR$; the control $u_t$ is given by a stabilizing feedback matrix; $x_t$ is the unobserved process and $y_t$ the observed process; $\{w_t\}$ and $\{v_t\}$ are i.i.d. and, together with $x_0$, mutually independent.

Slide 24 (CWIT 2005): Partially Observed Controlled Gauss-Markov Nominal Source: Robust Entropy Rate

Proposition. Using the Bode integral formula [5] (i.e., the relation between the sensitivity transfer function and the unstable eigenvalues of the system), the robust entropy rate of the family of uncertain sources with corresponding partially observed controlled Gauss-Markov nominal source is expressed through the term

\[ \sum_{\{i:\ |\lambda_i(A)|\ge 1\}} \ln|\lambda_i(A)|, \]

where $\lambda_i(A)$ are the eigenvalues of the system matrix $A$, and $s^*$ is the unique solution of the corresponding nonlinear equation.
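The Bode-integral contribution named in the proposition, the sum of log-magnitudes of the unstable eigenvalues of $A$, is straightforward to compute; a sketch with an illustrative system matrix:

```python
import numpy as np

A = np.array([[1.5, 1.0],
              [0.0, 0.4]])               # one unstable eigenvalue (1.5)
eig = np.linalg.eigvals(A)
bode_term = sum(np.log(abs(l)) for l in eig if abs(l) > 1)
print(bode_term)                         # ln(1.5) ~ 0.405 nats per time step
```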

Slide 25 (CWIT 2005): Example: Uncertain Sources in Frequency Domain

Defining Spaces. Let $\mathbb{D}\triangleq\{z\in\mathbb{C}:|z|<1\}$ and let $H^\infty$ be the space of scalar bounded, analytic functions of $z\in\mathbb{D}$. This space is equipped with the norm

\[ \|W\|_\infty\ \triangleq\ \sup_{|z|<1}|W(z)|. \]

Assumptions. Suppose the uncertain source is obtained by passing a stationary Gaussian random process, with known power spectral density $\Phi_x(\omega)$, through an uncertain linear filter $\tilde W$, where $\tilde W$ belongs to the following additive uncertainty model:

\[ \tilde W(z) = W(z) + \Delta(z), \qquad \Delta\in H^\infty,\ \ \|\Delta\|_\infty\le\delta. \]

Slide 26 (CWIT 2005): Uncertain Sources in Frequency Domain: Robust Entropy Rate

Results. The observed process is a Gaussian random process, and the Shannon entropy rate of the observed process is

\[ H_S(y)\ =\ \frac{1}{2}\ln(2\pi e) + \frac{1}{4\pi}\int_{-\pi}^{\pi}\ln\big(|\tilde W(e^{j\omega})|^2\,\Phi_x(\omega)\big)\,d\omega. \]

Subsequently, the robust entropy rate is defined via

\[ H_r^*\ \triangleq\ \sup_{\|\Delta\|_\infty\le\delta} H_S(y), \]

and the solution is given by [6]

\[ H_r^*\ =\ \frac{1}{2}\ln(2\pi e) + \frac{1}{4\pi}\int_{-\pi}^{\pi}\ln\Big(\big(|W(e^{j\omega})|+\delta\big)^2\,\Phi_x(\omega)\Big)\,d\omega. \]
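A numerical sketch of the worst case, assuming the pointwise bound $|\tilde W(e^{j\omega})|\le|W(e^{j\omega})|+\delta$ used in the solution above; the nominal filter, the input PSD and $\delta$ are illustrative.

```python
import numpy as np

delta = 0.2
w = np.linspace(-np.pi, np.pi, 100001)
W = 1.0 / (1.0 - 0.5 * np.exp(-1j * w))        # illustrative nominal filter
Phi_x = np.ones_like(w)                         # unit-variance white Gaussian input
Phi_worst = (np.abs(W) + delta) ** 2 * Phi_x    # worst-case output PSD
# (1/4pi) * integral over [-pi, pi] equals half the grid mean of the log-PSD
rate = 0.5 * np.log(2 * np.pi * np.e) + 0.5 * np.mean(np.log(Phi_worst))
print(rate)   # robust entropy rate in nats; exceeds 1.4189, the delta = 0 value
```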

Slide 27 (CWIT 2005): Applications in Stabilizability of Networked Control Systems

Slide 28 (CWIT 2005): Observability and Stabilizability of Networked Control Systems

Definition. The uncertain source is uniformly asymptotically observable in probability if there exist an encoder and decoder such that

\[ \limsup_{n\to\infty}\ \sup_{g_n\in A_n}\ \Pr\{\|X^n-\hat X^n\|>\delta\}\ \le\ \epsilon. \]

Moreover, a controlled uncertain source is uniformly asymptotically stabilizable in probability if there exist an encoder, decoder and controller such that the controlled process satisfies the same uniform bound.

Slide 29 (CWIT 2005): Necessary Condition for Observability and Stabilizability of Networked Control Systems [2]

Proposition. A necessary condition for uniform asymptotic observability and stabilizability in probability is

\[ C\ \ge\ H_r^*(A) - \frac{1}{2}\ln\big((2\pi e)^{d}\det\Gamma\big), \]

where $C$ is the channel capacity and $\Gamma$ is the covariance matrix of the Gaussian distribution that satisfies the $(\delta,\epsilon)$ accuracy constraint.

Slide 30 (CWIT 2005): References

[1] C. D. Charalambous and F. Rezaei, "Robust Coding for Uncertain Sources," submitted to IEEE Transactions on Information Theory.
[2] C. D. Charalambous and A. Farhadi, "Mathematical Theory for Robust Control and Communication Subject to Uncertainty," preprint.
[3] A. Rényi, "On Measures of Entropy and Information," in Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. I, Berkeley, CA, pp. 547-561, 1961.

Slide 31 (CWIT 2005): References

[4] C. Tsallis, "Possible Generalization of Boltzmann-Gibbs Statistics," Journal of Statistical Physics, vol. 52, pp. 479-487, 1988.
[5] B. F. Wu and E. A. Jonckheere, "A Simplified Approach to Bode's Theorem for Continuous-Time and Discrete-Time Systems," IEEE Transactions on Automatic Control, vol. 37, no. 11, pp. 1797-1802, Nov. 1992.
[6] C. D. Charalambous and A. Farhadi, "Robust Entropy and Robust Entropy Rate for Uncertain Sources: Applications in Networked Control Systems," preprint, 2005.