COMMUNICATION SYSTEMS


COMMUNICATION SYSTEMS By Mr. V. Sudheer Raja, M.Tech, Assistant Professor, Department of Electrical Engineering, Adama Science and Technology University. E-mail: Sudheerrajasanju@gmail.com

CHAPTER I: Review of Probability, Random Variables and Random Processes
Contents:
Introduction
Definition of Probability, Axioms of Probability
Sub-topics of Probability
Random Variables and their Advantages
Probability Mass Function and Probability Density Function
Conditional, Joint, Bernoulli and Binomial Probabilities
Random Processes
Mr V Sudheer Raja, ASTU

Introduction
Models of a Physical Situation: A model is an approximate representation.
-- Mathematical models
-- Simulation models
Deterministic versus stochastic/random: deterministic models offer repeatability of measurements (Ex: Ohm's law; models of a capacitor, inductor, or resistor), whereas stochastic models do not (Ex: processor caching, queuing, and estimation of task execution time).
The emphasis of this course is on stochastic modeling.

Probability
In any communication system the signals encountered are of two types: deterministic and random. Deterministic signals are the class of signals that may be predicted at any instant of time and can be modeled as completely specified functions of time. For a random signal it is not possible to predict its precise value in advance, but it is possible to describe the signal in terms of its statistical properties, such as the average power in the random signal or the spectral distribution of that power. The mathematical discipline that deals with the statistical characterization of random signals is probability theory.

Probability Theory
The phenomenon of statistical regularity is used to explain probability. Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment whose outcome is subject to chance. If the experiment is repeated, the outcome may differ because of an underlying random phenomenon or chance mechanism. Such an experiment is referred to as a random experiment. To use the idea of statistical regularity to explain the concept of probability, we proceed as follows:
1. We prescribe a basic experiment, which is random in nature and is repeated under identical conditions.
2. We specify all the possible outcomes of the experiment. Because the experiment is random, the outcome of any single trial is unpredictable: any of the prescribed outcomes may result.
3. For a large number of trials, the outcomes exhibit statistical regularity; that is, a definite average pattern of outcomes is observed when the experiment is repeated a large number of times.

Random Experiment
Random experiment: an experiment whose outcome varies in a random manner.
A possible outcome of an experiment is termed an event.
-- Ex: In the coin-tossing experiment, the occurrence of "head" or "tail" is treated as an event.
Sample space: the set of possible results or outcomes of an experiment, i.e. the totality of all events without repetition, is called the sample space and is denoted by S.
-- Ex: In the coin-tossing experiment the possible outcomes are "head" or "tail", so the sample space is S = {head, tail}.

Sample spaces for some example experiments:
1. Select a ball from an urn containing balls numbered 1 to 50.
2. Select a ball from an urn containing balls numbered 1 to 4. Balls 1 and 2 are black, and 3 and 4 are white. Note the number and the color of the ball.
3. Toss a coin three times and note the number of heads.
4. Pick a number at random between 0 and 1.
5. Measure the time between two message arrivals at a messaging center.
6. Pick 2 numbers at random between 0 and 1.
7. Pick a number X at random between 0 and 1, then pick a number Y at random between 0 and X.

Some events in the experiments on the previous slide:
1. An even-numbered ball is selected.
2. The ball is white and even-numbered.
3. Each toss has the same outcome.
4. The number selected is non-negative.
5. Less than 5 seconds elapse between message arrivals.
6. Both numbers are less than 0.5.
7. The two numbers differ by less than one-tenth.
Also of note: the null event, elementary events, and the certain event.

The urn experiment: Questions
Consider an urn with three balls in it, labeled 1, 2, 3.
What are the chances that a ball withdrawn at random from the urn is labeled "1"? How do we quantify this "chance"?
Is withdrawing any of the three balls equally likely (equiprobable), or is some ball more likely to be drawn than the others?
If "1" means "sure occurrence" and "0" means "no chance of occurrence", what number would you give to the chance of getting ball 1?
And how do you compare the chance of withdrawing an odd-numbered ball to that of withdrawing an even-numbered ball?
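The urn questions can be explored numerically. Below is a minimal simulation sketch, assuming (as the later slides conclude) that all three balls are equally likely to be drawn; the ball labels and trial count are illustrative choices, not part of the slides.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Hypothetical simulation of the urn experiment: three equally likely balls.
n = 100_000
draws = [random.choice([1, 2, 3]) for _ in range(n)]

p_ball1 = draws.count(1) / n                      # chance of withdrawing ball 1
p_odd = sum(1 for d in draws if d % 2 == 1) / n   # ball 1 or ball 3

print(p_ball1, p_odd)  # settle near 1/3 and 2/3 respectively
```

The relative frequencies answer the slide's questions: ball 1 comes up about one time in three, and an odd-numbered ball about twice as often as the even-numbered one.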


The number of times the kth outcome occurs in n iterations (trials) of the random experiment is denoted N_k(n). The ratio N_k(n)/n is called the relative frequency of the kth outcome. If the kth outcome occurs in none of the trials, then N_k(n) = 0 and N_k(n)/n = 0; if it occurs on every trial, then N_k(n) = n and N_k(n)/n = 1. Clearly the relative frequency is a nonnegative real number less than or equal to 1, i.e. 0 ≤ N_k(n)/n ≤ 1.
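The bounds on N_k(n)/n can be checked directly. A sketch using a fair six-sided die (an assumed example experiment, not one from the slides):

```python
import random

random.seed(1)

# Relative frequency N_k(n)/n of each outcome k for a fair die.
n = 60_000
counts = {k: 0 for k in range(1, 7)}   # N_k(n) for each outcome k
for _ in range(n):
    counts[random.randint(1, 6)] += 1

rel_freq = {k: counts[k] / n for k in counts}

# Every relative frequency lies in [0, 1], and together they sum to 1.
assert all(0 <= f <= 1 for f in rel_freq.values())
print(rel_freq)  # each value hovers near 1/6
```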


Inferences
Statistical regularity: long-term averages of repeated iterations of a random experiment tend to yield the same value. For the urn experiment, the relative frequency of each ball tends to 1/3; accordingly, the chance of withdrawing an odd-numbered ball (1 or 3) tends to 2/3, twice the chance of withdrawing the even-numbered ball.

Axioms of Probability
A probability system consists of the following triple:
1. A sample space S of elementary events (outcomes).
2. A class of events that are subsets of S.
3. A probability measure P(.) assigned to each event A in the class, with the following properties:
(i) P(S) = 1: the probability of the sure event is 1.
(ii) 0 ≤ P(A) ≤ 1: the probability of an event A is a nonnegative real number less than or equal to 1.
(iii) If A and B are two mutually exclusive events in the given class, then P(A+B) = P(A) + P(B).
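The triple can be made concrete with a small finite example. A sketch, assuming a fair die as the sample space (the events A and B are arbitrary illustrative choices):

```python
# A tiny probability system for a fair die.
S = {1, 2, 3, 4, 5, 6}
P_point = {s: 1 / 6 for s in S}        # probability of each elementary outcome

def P(event):
    """Probability measure: sum of elementary probabilities over the event."""
    return sum(P_point[s] for s in event)

A = {2, 4, 6}        # "even"
B = {1, 3}           # "odd and less than 5" -- mutually exclusive with A

# Axiom (i): P(S) = 1
assert abs(P(S) - 1.0) < 1e-12
# Axiom (ii): 0 <= P(A) <= 1
assert 0 <= P(A) <= 1
# Axiom (iii): additivity for mutually exclusive events
assert abs(P(A | B) - (P(A) + P(B))) < 1e-12
```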

The three axioms define probability and are also used to derive further properties of probability.
Property 1: P(Ā) = 1 - P(A), where Ā is the complement of event A.
Property 2: If M mutually exclusive events A1, A2, ..., AM have the exhaustive property A1 + A2 + ... + AM = S, then P(A1) + P(A2) + ... + P(AM) = 1.
Property 3: When events A and B are not mutually exclusive, the probability of the union event "A or B" equals P(A+B) = P(A) + P(B) - P(AB), where P(AB) is called the joint probability. The joint probability has the relative-frequency interpretation P(AB) ≈ N_AB(n)/n, where N_AB(n) is the number of trials in which A and B occur together.
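The three properties can be verified exactly on the fair-die example, using rational arithmetic so no floating-point tolerance is needed; the events chosen here are hypothetical illustrations:

```python
from fractions import Fraction

# Checking Properties 1-3 on a fair die with exact arithmetic.
S = set(range(1, 7))

def P(event):
    return Fraction(len(event), len(S))   # equally likely outcomes

A = {2, 4, 6}          # even
B = {4, 5, 6}          # greater than 3 -- NOT mutually exclusive with A

# Property 1: P(complement of A) = 1 - P(A)
assert P(S - A) == 1 - P(A)
# Property 2: exhaustive, mutually exclusive events sum to 1
assert P(A) + P(S - A) == 1
# Property 3: P(A + B) = P(A) + P(B) - P(AB)
assert P(A | B) == P(A) + P(B) - P(A & B)
```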

Conditional Probability
Suppose an experiment involves two events A and B. Let P(B/A) denote the probability of event B given that event A has occurred; P(B/A) is called the conditional probability of B given A. Assuming that A has nonzero probability, it is defined as
P(B/A) = P(AB)/P(A),
where P(AB) is the joint probability of A and B. In relative-frequency terms, P(B/A) ≈ N_AB(n)/N_A(n), the fraction of the trials on which A occurred in which B also occurred. The joint probability of two events may therefore be expressed as the product of the conditional probability of one event given the other and the elementary probability of the other; that is,
P(AB) = P(B/A)*P(A) = P(A/B)*P(B).
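A worked instance of the definition, on the same hypothetical fair-die events as before:

```python
from fractions import Fraction

# Conditional probability on a fair die: A = "even", B = "greater than 3".
S = set(range(1, 7))

def P(event):
    return Fraction(len(event), len(S))

A = {2, 4, 6}
B = {4, 5, 6}

P_B_given_A = P(A & B) / P(A)          # P(B/A) = P(AB) / P(A)
P_A_given_B = P(A & B) / P(B)

# The joint probability factors either way:
assert P(A & B) == P_B_given_A * P(A) == P_A_given_B * P(B)
print(P_B_given_A)  # 2/3: of the evens {2,4,6}, two ({4,6}) exceed 3
```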

Random Variables
A function whose domain is a sample space and whose range is a set of real numbers is called a random variable of the experiment. A random variable is denoted X(s), or simply X, where s is a sample point corresponding to an outcome in the sample space S. Random variables may be discrete or continuous.
The random variable X is a discrete random variable if X can take on only a finite number of values in any finite observation interval. Ex: X(k) = k, where k is the sample point associated with the event of showing k dots when a die is thrown; here there are only six possibilities, 1 through 6 dots.
If X can take on any value in a whole observation interval, X is called a continuous random variable. Ex: the random variable representing the amplitude of a noise voltage at a particular instant of time, which may take any value between minus and plus infinity.
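The "function on the sample space" view translates directly into code. A sketch, where `random.gauss` is used as an illustrative stand-in for a noise-voltage amplitude (an assumption, not the slides' model):

```python
import random

random.seed(2)

# A random variable as a function on the sample space.
sample_space = [1, 2, 3, 4, 5, 6]
X = lambda s: s                       # discrete RV: number of dots shown

values = {X(s) for s in sample_space}
assert values == {1, 2, 3, 4, 5, 6}   # finite set of values -> discrete

# A continuous RV can take any real value; here a Gaussian draw stands in
# for the amplitude of a noise voltage at one instant of time.
noise = random.gauss(0.0, 1.0)
print(noise)
```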

For a probabilistic description of a random variable, consider the random variable X and the probability of the event X ≤ x, denoted P(X ≤ x). This probability is a function of the dummy variable x:
F_X(x) = P(X ≤ x).
The function F_X(x) is called the cumulative distribution function, or simply distribution function, of the random variable X. It has the following properties:
1. The distribution function F_X(x) is bounded between zero and one.
2. The distribution function F_X(x) is a monotone non-decreasing function of x, i.e. F_X(x1) ≤ F_X(x2) if x1 < x2.

Probability Density Function
The derivative of the distribution function is called the probability density function:
f_X(x) = d F_X(x)/dx. -------Eq.(1)
The differentiation in Eq.(1) is with respect to the dummy variable x, and the name "density function" arises from the fact that the probability of the event x1 < X ≤ x2 equals
P(x1 < X ≤ x2) = P(X ≤ x2) - P(X ≤ x1) = F_X(x2) - F_X(x1) = ∫ from x1 to x2 of f_X(x) dx. -------Eq.(2)
Since F_X(∞) = 1, corresponding to the probability of the certain event, and F_X(-∞) = 0, corresponding to the probability of an impossible event, it follows immediately that
∫ from -∞ to ∞ of f_X(x) dx = 1,
i.e. the probability density function must always be a nonnegative function with a total area of one.
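Eq.(2) and the unit-area property can be checked numerically. A sketch for a uniform random variable on [0, 1] (an assumed example distribution), with the integral approximated by the midpoint rule:

```python
# For X uniform on [0, 1]: f_X(x) = 1 on [0, 1] and F_X(x) = x there.

def f(x):                     # probability density function
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def F(x):                     # cumulative distribution function
    return min(max(x, 0.0), 1.0)

# P(x1 < X <= x2) = F(x2) - F(x1) = integral of f from x1 to x2  (Eq. 2)
x1, x2, n = 0.2, 0.7, 10_000
h = (x2 - x1) / n
integral = sum(f(x1 + (i + 0.5) * h) for i in range(n)) * h  # midpoint rule

assert abs(integral - (F(x2) - F(x1))) < 1e-9
# Total area under the density is F(+inf) - F(-inf) = 1 - 0 = 1
assert F(10.0) - F(-10.0) == 1.0
```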

Statistical Averages
The mean value, or expected value, of a random variable X is defined as
m_X = E[X] = ∫ from -∞ to ∞ of x f_X(x) dx,
where E denotes the expectation operator. The mean value m_X locates the center of gravity of the area under the probability density function curve of the random variable X.

The variance of the random variable X is a measure of the variable's randomness; it describes the effective width of the probability density function f_X(x) about the mean m_X and is expressed as
σ_X² = E[(X - m_X)²] = ∫ from -∞ to ∞ of (x - m_X)² f_X(x) dx.
The variance of a random variable is normally denoted σ_X². The square root of the variance, σ_X, is called the standard deviation of the random variable X.
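For a discrete random variable the integrals become sums. A sketch computing the mean, variance, and standard deviation of the fair-die random variable X(k) = k from the earlier slide, in exact arithmetic:

```python
from fractions import Fraction

# m_X = sum of x * p(x); variance = sum of (x - m_X)^2 * p(x), for a fair die.
outcomes = range(1, 7)
p = Fraction(1, 6)

m_X = sum(x * p for x in outcomes)                    # expected value
var_X = sum((x - m_X) ** 2 * p for x in outcomes)     # variance
std_X = float(var_X) ** 0.5                           # standard deviation

print(m_X, var_X)  # 7/2 and 35/12
```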

Random Processes
Description of random processes
Stationarity and ergodicity
Autocorrelation of random processes
Cross-correlation of random processes

Random Processes
A RANDOM VARIABLE X is a rule for assigning to every outcome ω of an experiment a number X(ω). Note: X denotes a random variable and X(ω) denotes a particular value.
A RANDOM PROCESS X(t) is a rule for assigning to every ω a function X(t, ω). Note: for notational simplicity we often omit the dependence on ω.
Another way to look at it: RVs map events into constants, while RPs map events into functions of the parameter t. RPs can be described as an indexed set of RVs.
The set of all possible waveforms or outputs is called an ensemble. We will be interested in the behavior of the system across all waveforms and a wide range of time.

Ensemble of Sample Functions
The set of all possible sample functions is called the ENSEMBLE.

Random Processes
A general random or stochastic process can be described as:
-- a collection of time functions (signals) corresponding to various outcomes of random experiments;
-- a collection of random variables observed at different times (e.g. at t1 and t2).
Examples of random processes in communications: channel noise, information generated by a source, interference.



Collection of Time Functions
Consider the time-varying function v(t, ωi) representing a random process, where ωi represents an outcome of a random event. Example: a box has infinitely many resistors (i = 1, 2, ...) of the same resistance R. Let ωi be the event that the ith resistor has been picked from the box, and let v(t, ωi) represent the voltage of the thermal noise measured on this resistor.

Collection of Random Variables
For a particular time t = t0, the value x(t0, ωi) is a random variable. To describe a random process we can use the collection of random variables {x(t0, ω1), x(t0, ω2), x(t0, ω3), ...}.
Type: a random process can be either discrete-time or continuous-time.
Ex: the probability of obtaining a sample function of a RP that passes through a given set of windows is the probability of a joint event.

Description of Random Processes
Analytical description: X(t) = f(t, ω), where ω is an outcome of a random event.
Statistical description: for any integer N and any choice of (t1, t2, ..., tN), the joint pdf of {X(t1), X(t2), ..., X(tN)} is known. To describe the random process completely, this joint PDF is required.

Activity: Ensembles
Consider the random process x(t) = At + B. Draw ensembles of the waveforms for:
-- B constant, A uniformly distributed between [-1, 1] (random slope);
-- A constant, B uniformly distributed between [0, 2] (random intercept).
Does having an "ensemble" of waveforms give you a better picture of how the system performs?
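The two ensembles in the activity can be generated numerically instead of drawn. A sketch, assuming B = 1 in the first case and A = 1 in the second (the constants are arbitrary choices):

```python
import random

random.seed(3)

# Ensembles for x(t) = A*t + B, each sample function on a common time grid.
ts = [i / 10 for i in range(11)]          # t in [0, 1]

# Case 1: B constant (B = 1), A uniform on [-1, 1] -> random slope
ensemble1 = []
for _ in range(5):
    A = random.uniform(-1, 1)
    ensemble1.append([A * t + 1 for t in ts])

# Case 2: A constant (A = 1), B uniform on [0, 2] -> random intercept
ensemble2 = []
for _ in range(5):
    B = random.uniform(0, 2)
    ensemble2.append([t + B for t in ts])

# All case-1 waveforms agree at t = 0 (common intercept B = 1);
# all case-2 waveforms share the same slope.
assert all(abs(w[0] - 1.0) < 1e-12 for w in ensemble1)
```

Plotting the two ensembles side by side makes the point of the activity: in one the waveforms fan out from a common point, in the other they are parallel lines.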

Stationarity
Definition: A random process is STATIONARY to order N if, for any t1, t2, ..., tN and any t0,
fx{x(t1), x(t2), ..., x(tN)} = fx{x(t1+t0), x(t2+t0), ..., x(tN+t0)}.
This means that the process behaves similarly (follows the same PDF) regardless of when you measure it. A random process is said to be STRICTLY STATIONARY if it is stationary to the order N → ∞.
Is the random process from the coin-tossing experiment stationary?

Illustration of Stationarity
Time functions pass through the corresponding windows at different times with the same probability.

Example of First-Order Stationarity
Consider x(t) = A cos(ω0 t + θ0), where A and ω0 are constants, θ0 is a uniformly distributed RV on [0, 2π), and t is time. The first-order PDF of x(t) is
f_x(x) = 1 / (π sqrt(A² - x²)) for |x| < A,
and zero elsewhere. Note: there is NO dependence on time; the PDF is not a function of t, so the RP is STATIONARY (to first order). This result applies to problems in which θ0 is the random start-up phase of an unsynchronized oscillator.

Non-Stationary Example
Now assume that A, ω0 and θ0 are all constants, with t as time. The value of x(t) = A cos(ω0 t + θ0) is then known for any time with probability 1, so the first-order PDF of x(t) is a delta function:
f_x(x; t) = δ(x - A cos(ω0 t + θ0)).
Note: this PDF depends on time, so the process is NONSTATIONARY.

Ergodic Processes
Definition: A random process is ERGODIC if all time averages of any sample function are equal to the corresponding ensemble averages (expectations).
For example, for ergodic processes we can use ensemble statistics to compute DC values and RMS values.
Ergodic processes are always stationary; stationary processes are not necessarily ergodic.

Example: Ergodic Process
Let x(t) = A cos(ω0 t + θ0), where A and ω0 are constants and θ0 is a uniformly distributed RV on [0, 2π). Ensemble statistics:
Mean: E[x(t)] = (1/2π) ∫ from 0 to 2π of A cos(ω0 t + θ) dθ = 0.
Variance: E[x²(t)] = A²/2.

Example: Ergodic Process (continued)
Time averages over one sample function, for large T:
Mean: (1/T) ∫ over T of A cos(ω0 t + θ0) dt → 0.
Variance: (1/T) ∫ over T of A² cos²(ω0 t + θ0) dt → A²/2.
The ensemble and time averages are the same, so the process is ERGODIC.
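The agreement between the two kinds of average can be verified numerically. A sketch, with A = 2 and ω0 = 2π chosen arbitrarily; the time average uses one long realization, the ensemble average uses many independent phases at a fixed time:

```python
import math
import random

random.seed(4)

# Time average vs. ensemble average for x(t) = A*cos(w0*t + theta0),
# theta0 uniform on [0, 2*pi).
A, w0 = 2.0, 2 * math.pi

# Time averages over one realization (long window, fine grid)
theta0 = random.uniform(0, 2 * math.pi)
N, dt = 200_000, 0.001
xs = [A * math.cos(w0 * n * dt + theta0) for n in range(N)]
time_mean = sum(xs) / N
time_power = sum(x * x for x in xs) / N        # time average of x^2

# Ensemble average at a fixed time t = 0.3, over many realizations
M, t = 20_000, 0.3
ys = [A * math.cos(w0 * t + random.uniform(0, 2 * math.pi)) for _ in range(M)]
ens_mean = sum(ys) / M

print(time_mean, time_power, ens_mean)  # near 0, A^2/2 = 2, and 0
```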

Autocorrelation of a Random Process
The autocorrelation function of a real random process x(t) at two times t1 and t2 is
R_x(t1, t2) = E[x(t1) x(t2)].

Wide-Sense Stationarity
A random process that is stationary to order 2 or greater is wide-sense stationary. A random process is wide-sense stationary if its mean is constant, E[x(t)] = constant, and its autocorrelation depends only on the time difference:
R_x(t1, t2) = R_x(τ). Usually t1 = t and t2 = t + τ, so that t2 - t1 = τ.
A wide-sense stationary process does not DRIFT with time: the autocorrelation depends only on the time gap τ, not on where the time difference is taken. The autocorrelation gives an idea of the frequency response of the RP.
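The "depends only on the gap" claim can be tested on the random-phase sinusoid, whose theoretical autocorrelation is R_x(τ) = (A²/2) cos(ω0 τ). A Monte Carlo sketch (A, ω0, τ, and the sample count are illustrative choices):

```python
import math
import random

random.seed(5)

# Estimate R_x(t1, t2) = E[x(t1) x(t2)] for x(t) = A*cos(w0*t + theta0).
A, w0 = 1.0, 2 * math.pi

def R_est(t1, t2, M=100_000):
    total = 0.0
    for _ in range(M):
        th = random.uniform(0, 2 * math.pi)
        total += A * math.cos(w0 * t1 + th) * A * math.cos(w0 * t2 + th)
    return total / M

tau = 0.25
r_a = R_est(0.0, tau)        # same gap tau, different absolute times
r_b = R_est(1.3, 1.3 + tau)
r_theory = (A ** 2 / 2) * math.cos(w0 * tau)

print(r_a, r_b, r_theory)    # all close: R_x depends only on tau
```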

Cross-Correlation of Random Processes
The cross-correlation of two random processes x(t) and y(t) is defined similarly:
R_xy(t1, t2) = E[x(t1) y(t2)].
If x(t) and y(t) are jointly stationary processes, R_xy(t1, t2) = R_xy(τ), with τ = t2 - t1. If the RPs are jointly ERGODIC, the time cross-correlation of any pair of sample functions equals the ensemble cross-correlation.
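A classic use of cross-correlation is recovering a delay. The sketch below is a hypothetical example (not from the slides): y(t) = x(t - d) is a delayed copy of the random-phase sinusoid, so R_xy(τ) = (A²/2) cos(ω0 (τ - d)) peaks where the lag matches the delay d.

```python
import math
import random

random.seed(6)

A, w0, d = 1.0, 2 * math.pi, 0.2   # d is the delay to be recovered

def R_xy(tau, M=50_000):
    """Monte Carlo estimate of E[x(0) * y(tau)] with y(t) = x(t - d)."""
    total = 0.0
    for _ in range(M):
        th = random.uniform(0, 2 * math.pi)
        x0 = A * math.cos(th)                       # x at t = 0
        y_tau = A * math.cos(w0 * (tau - d) + th)   # y at t = tau
        total += x0 * y_tau
    return total / M

lags = [i / 20 for i in range(21)]                  # tau in [0, 1]
estimates = [R_xy(tau) for tau in lags]
best_lag = lags[estimates.index(max(estimates))]

print(best_lag)  # the peak lag recovers the delay d = 0.2
```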