Random Variables and Stochastic Processes

Random Variables and Stochastic Processes – 0903720
Dr. Ghazi Al Sukkar
Email: ghazi.alsukkar@ju.edu.jo
Office Hours: will be posted soon
Course Website: http://www2.ju.edu.jo/sites/academic/ghazi.alsukkar

Most commonly used RVs

Continuous-Type:
- Gaussian
- Log-Normal
- Exponential
- Gamma
- Erlang
- Chi-square
- Rayleigh
- Nakagami-m
- Uniform

Discrete-Type:
- Bernoulli
- Binomial
- Poisson
- Geometric
- Negative Binomial
- Discrete Uniform

Gaussian (or Normal) Random Variable
$f_X(x) = \dfrac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}$
This is a bell-shaped curve, symmetric around the parameter $\mu$, and its distribution function is given by
$F_X(x) = \displaystyle\int_{-\infty}^{x} \dfrac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(y-\mu)^2/(2\sigma^2)}\, dy \triangleq G\!\left(\dfrac{x-\mu}{\sigma}\right)$
$G(x) = \displaystyle\int_{-\infty}^{x} \dfrac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy$ (tabulated)
$Q(x) = \displaystyle\int_{x}^{\infty} \dfrac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy = 1 - G(x) = \dfrac{1}{2}\operatorname{erfc}\!\left(\dfrac{x}{\sqrt{2}}\right)$
Since $f_X(x)$ depends on the two parameters $\mu$ and $\sigma^2$, the notation $X \sim N(\mu, \sigma^2)$ is used to denote a Gaussian RV.
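The identity $Q(x) = 1 - G(x) = \frac{1}{2}\operatorname{erfc}(x/\sqrt{2})$ can be checked numerically; a minimal sketch using Python's standard `math.erf`/`math.erfc` (the names `G` and `Q` just mirror the slide's notation, not any library API):

```python
import math

def G(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def Q(x):
    """Tail probability Q(x) = (1/2) erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Q(x) = 1 - G(x), and G(0) = Q(0) = 1/2 by symmetry of the bell curve.
print(abs(Q(1.0) - (1.0 - G(1.0))) < 1e-12)
print(abs(G(0.0) - 0.5) < 1e-12)
```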

𝑋N(0,1): Standard Normal RV: zero mean and Unity variance. Most important and frequently encountered random variable in communications. Large 𝜎 2 Small 𝜎 2 𝜇 𝜇

Log-normal Distribution
If $Y$ is a random variable with a normal distribution, then $X = e^Y$ has a log-normal distribution. Likewise, if $X$ is log-normal, then $\ln X$ is normally distributed. Denoted $\ln\mathcal{N}(\mu, \sigma^2)$:
$f_X(x) = \dfrac{1}{x\sqrt{2\pi\sigma^2}}\, e^{-(\ln x - \mu)^2/(2\sigma^2)}, \quad x > 0$
$F_X(x) = G\!\left(\dfrac{\ln x - \mu}{\sigma}\right)$
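The defining relation $X = e^Y$ can be verified by simulation; a small sketch (the sample size and the parameters $\mu = 0$, $\sigma = 1$ are arbitrary illustrative choices, not from the slides):

```python
import math
import random

random.seed(0)
mu, sigma = 0.0, 1.0  # arbitrary illustrative parameters

def G(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# If Y ~ N(mu, sigma^2), then X = e^Y should have CDF G((ln x - mu) / sigma).
samples = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]
x = 2.0
empirical = sum(s <= x for s in samples) / len(samples)
theoretical = G((math.log(x) - mu) / sigma)
print(abs(empirical - theoretical) < 0.01)
```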

Exponential Distribution
The exponential distribution represents the probability distribution of the time intervals between successive Poisson arrivals. $X$ is exponential, written $X \sim \exp(\lambda)$, if
$f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
$F_X(x) = 1 - e^{-\lambda x}, \quad x \ge 0$
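The CDF $F_X(x) = 1 - e^{-\lambda x}$ and the mean $1/\lambda$ can be checked against samples drawn with the standard library's `random.expovariate`; a sketch with arbitrary illustrative values of $\lambda$ and $x$:

```python
import math
import random

random.seed(1)
lam = 2.0  # rate parameter λ (arbitrary illustrative value)

# Empirical CDF of exponential samples vs F_X(x) = 1 - e^{-λx}.
samples = [random.expovariate(lam) for _ in range(100_000)]
x = 0.7
empirical = sum(s <= x for s in samples) / len(samples)
theoretical = 1.0 - math.exp(-lam * x)
print(abs(empirical - theoretical) < 0.01)

# The mean of an exp(λ) variable is 1/λ.
sample_mean = sum(samples) / len(samples)
print(abs(sample_mean - 1.0 / lam) < 0.01)
```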

The Memoryless Property of the Exponential Distribution
The exponential distribution is without memory; indeed, it is the unique continuous memoryless distribution. Let $x, s \ge 0$:
$P(X > s + x \mid X > s) = \dfrac{P(\{X > s+x\} \cap \{X > s\})}{P(X > s)} = \dfrac{P(X > s+x)}{P(X > s)} = \dfrac{e^{-\lambda(s+x)}}{e^{-\lambda s}} = e^{-\lambda x} = P(X > x)$
Let $X$ represent the lifetime of a piece of equipment. If the equipment has been working for time $s$, the probability that it survives an additional time $x$ depends only on $x$, and is identical to the probability that new equipment survives for time $x$.
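The chain of equalities above can be reproduced numerically; a minimal sketch (the values of $\lambda$, $s$, $x$ are arbitrary illustrative choices):

```python
import math

lam, s, x = 1.5, 2.0, 0.8  # arbitrary illustrative values

def survival(t):
    """P(X > t) = e^{-λt} for X ~ exp(λ)."""
    return math.exp(-lam * t)

# P(X > s + x | X > s) = P(X > s + x) / P(X > s) = e^{-λx} = P(X > x).
conditional = survival(s + x) / survival(s)
print(abs(conditional - survival(x)) < 1e-12)
```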

Example
The amount of waiting time a customer spends at a restaurant has an exponential distribution with a mean value of 5 minutes. The probability that a customer spends more than 10 minutes in the restaurant is
$P(X > 10) = e^{-10/5} = e^{-2} \approx 0.1353$
The probability that the customer spends an additional 10 minutes in the restaurant, given that he has already been there for more than 10 minutes, is
$P(X > 10 + 10 \mid X > 10) = P(X > 10) = e^{-2} \approx 0.1353$
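The arithmetic of this example can be reproduced directly; a minimal sketch:

```python
import math

mean_wait = 5.0        # minutes, so the rate is λ = 1/5
lam = 1.0 / mean_wait

p_more_than_10 = math.exp(-lam * 10)  # P(X > 10) = e^{-2}
print(round(p_more_than_10, 4))       # 0.1353

# By memorylessness, P(X > 20 | X > 10) = P(X > 10).
p_conditional = math.exp(-lam * 20) / math.exp(-lam * 10)
print(abs(p_conditional - p_more_than_10) < 1e-12)
```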

Gamma (Erlang) Distribution
Denoted $G(\alpha, \beta)$, with $\alpha, \beta > 0$:
$f_X(x) = \begin{cases} \dfrac{x^{\alpha-1}}{\Gamma(\alpha)\,\beta^\alpha} e^{-x/\beta}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
$\Gamma(\alpha) = \displaystyle\int_0^\infty x^{\alpha-1} e^{-x}\, dx$ is the Gamma function, and
$\Gamma(n) = (n-1)\,\Gamma(n-1) = (n-1)!$ for integer $n$.
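The factorial identity $\Gamma(n) = (n-1)!$ can be spot-checked with the standard library's `math.gamma`; a minimal sketch:

```python
import math

# Γ(n) = (n-1)! for positive integers n (math.gamma implements Γ).
for n in range(1, 10):
    expected = math.factorial(n - 1)
    assert abs(math.gamma(n) - expected) <= 1e-9 * expected

# A classic non-integer value: Γ(1/2) = sqrt(π).
print(abs(math.gamma(0.5) - math.sqrt(math.pi)) < 1e-12)
```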

[Figure: Gamma density curves for several parameter values; in the figure's notation, $k \equiv \alpha$ and $\theta \equiv \beta$.]

Erlang Distribution
The Erlang distribution is the special case of the Gamma distribution in which the shape parameter $\alpha$ is an integer. Let $\alpha = n$ and $\beta = 1/\lambda$:
$f_X(x) = \begin{cases} \dfrac{\lambda^n x^{n-1}}{(n-1)!} e^{-\lambda x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
$F_X(x) = 1 - \displaystyle\sum_{k=0}^{n-1} \dfrac{(\lambda x)^k}{k!} e^{-\lambda x}$
Putting $\lambda = n\mu$ gives $G(n, \frac{1}{n\mu})$.
Application: the number of telephone calls that might be made at the same time to a switching center.
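An Erlang$(n, \lambda)$ variable is the sum of $n$ iid $\exp(\lambda)$ variables (it is the waiting time for the $n$th Poisson arrival), so the closed-form CDF above can be checked against simulation; a sketch with arbitrary illustrative parameters:

```python
import math
import random

random.seed(2)
n, lam = 3, 2.0  # arbitrary illustrative parameters

def erlang_cdf(x):
    """F_X(x) = 1 - sum_{k=0}^{n-1} (λx)^k / k! * e^{-λx}."""
    return 1.0 - math.exp(-lam * x) * sum(
        (lam * x) ** k / math.factorial(k) for k in range(n))

# An Erlang(n, λ) variable is the sum of n iid exp(λ) variables.
samples = [sum(random.expovariate(lam) for _ in range(n))
           for _ in range(100_000)]
x = 1.2
empirical = sum(s <= x for s in samples) / len(samples)
print(abs(empirical - erlang_cdf(x)) < 0.01)
```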

Chi-Square Distribution
A special case of the Gamma distribution with $\alpha = n/2$ ($n$ an integer) and $\beta = 2$, so $G(\frac{n}{2}, 2) \equiv \chi^2(n)$: chi-square with $n$ degrees of freedom.
$f_X(x) = \begin{cases} \dfrac{x^{n/2-1}}{\Gamma(n/2)\,2^{n/2}} e^{-x/2}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
For $n = 2$ this reduces to an exponential distribution.
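The $n = 2$ reduction can be verified directly: the density becomes $\frac{1}{2}e^{-x/2}$, an exponential with rate $\lambda = 1/2$. A minimal sketch:

```python
import math

def chi2_pdf(x, n):
    """f_X(x) = x^{n/2-1} e^{-x/2} / (Γ(n/2) 2^{n/2}) for x > 0."""
    return x ** (n / 2 - 1) * math.exp(-x / 2) / (math.gamma(n / 2) * 2 ** (n / 2))

# For n = 2 the density is (1/2) e^{-x/2}: exponential with rate λ = 1/2.
for x in (0.5, 1.0, 3.0):
    assert abs(chi2_pdf(x, 2) - 0.5 * math.exp(-x / 2)) < 1e-12
print("chi-square with n = 2 is exponential with rate 1/2")
```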

[Figure: Chi-square density curves for several degrees of freedom; in the figure's notation, $k \equiv n$.]

Rayleigh Distribution
$X$ is Rayleigh-distributed with parameter $\sigma^2$ if $X = \sqrt{X_1^2 + X_2^2}$, where $X_1$ and $X_2$ are statistically independent with $X_i \sim N(0, \sigma^2)$.
$f_X(x) = \begin{cases} \dfrac{x}{\sigma^2} e^{-x^2/(2\sigma^2)}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}, \quad F_X(x) = 1 - e^{-x^2/(2\sigma^2)}$
Application: used to model the attenuation of wireless signals subject to multi-path fading.
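The construction $X = \sqrt{X_1^2 + X_2^2}$ can be checked against the closed-form CDF by simulation; a sketch ($\sigma$ and the evaluation point are arbitrary illustrative choices):

```python
import math
import random

random.seed(3)
sigma = 1.5  # arbitrary illustrative parameter

# X = sqrt(X1^2 + X2^2) with X1, X2 ~ N(0, σ²) independent is Rayleigh(σ²).
samples = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
           for _ in range(100_000)]
x = 2.0
empirical = sum(s <= x for s in samples) / len(samples)
theoretical = 1.0 - math.exp(-x * x / (2 * sigma * sigma))
print(abs(empirical - theoretical) < 0.01)
```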

Nakagami-m Distribution
A generalization of the Rayleigh distribution through a parameter $m$:
$f_X(x) = \begin{cases} \dfrac{2}{\Gamma(m)} \left(\dfrac{m}{\Omega}\right)^m x^{2m-1} e^{-m x^2/\Omega}, & x > 0 \\ 0, & \text{otherwise} \end{cases}$
Putting $m = 1$ recovers the Rayleigh distribution.
Application: gives greater flexibility for modeling randomly fluctuating channels in wireless communication theory.

[Figure: Nakagami-m density curves for several parameter values; in the figure's notation, $\mu \equiv m$ and $\omega \equiv \Omega$.]

Uniform Random Variable
A continuous random variable that takes values between $a$ and $b$ with equal probability over intervals of equal length: $X \sim U(a, b)$.
$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$
$F_X(x) = \begin{cases} 0, & x < a \\ \dfrac{x-a}{b-a}, & a \le x \le b \\ 1, & x \ge b \end{cases}$
The phase of a received sinusoidal carrier is usually modeled as a uniform random variable between 0 and $2\pi$. Quantization error is also typically modeled as uniform.
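The linear CDF on $[a, b]$ can be checked against samples from the standard library's `random.uniform`; a sketch with arbitrary illustrative endpoints:

```python
import random

random.seed(4)
a, b = 2.0, 5.0  # arbitrary illustrative endpoints

# Empirical CDF of U(a, b) samples vs F_X(x) = (x - a) / (b - a).
samples = [random.uniform(a, b) for _ in range(100_000)]
x = 3.0
empirical = sum(s <= x for s in samples) / len(samples)
theoretical = (x - a) / (b - a)
print(abs(empirical - theoretical) < 0.01)
```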

Discrete Random Variables
- Bernoulli
- Binomial
- Poisson
- Geometric
- Negative Binomial
- Discrete Uniform

Bernoulli Random Variable
The simplest possible random experiment, with two possibilities:
- accept/failure
- male/female
- rain/no rain
One of the two possibilities is mapped to 1: $X(\text{failure}) = 0$, $X(\text{accept}) = 1$.
$P(\text{accept}) = P(X = 1) = p, \quad P(\text{failure}) = P(X = 0) = 1 - p = q$
A good model for a binary data source whose output is 1 or 0; it can also be used to model channel errors.

Binomial Random Variable
$Y$ is a discrete random variable that gives the number of 1's in a sequence of $n$ independent Bernoulli trials:
$Y = \sum_{i=1}^{n} X_i$, where the $X_i$, $i = 1, 2, \ldots, n$, are statistically independent and identically distributed (iid) Bernoulli RVs.
$P_n(k) = P(Y = k) = \binom{n}{k} p^k (1-p)^{n-k}$
$f_Y(y) = \displaystyle\sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k}\,\delta(y - k)$
$F_Y(y) = \displaystyle\sum_{k=0}^{\lfloor y \rfloor} \binom{n}{k} p^k (1-p)^{n-k}$
[Figure: staircase CDF $F_Y(y)$.]
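The sum-of-Bernoullis construction can be checked against the pmf by simulation; a sketch ($n$, $p$, and the evaluation point $k$ are arbitrary illustrative choices):

```python
import math
import random

random.seed(5)
n, p = 10, 0.3  # arbitrary illustrative parameters

def binom_pmf(k):
    """P(Y = k) = C(n, k) p^k (1-p)^{n-k}."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

# Y = number of 1's in n independent Bernoulli(p) trials.
trials = 200_000
counts = [0] * (n + 1)
for _ in range(trials):
    y = sum(random.random() < p for _ in range(n))
    counts[y] += 1

k = 3
print(abs(counts[k] / trials - binom_pmf(k)) < 0.01)
# The pmf sums to 1.
print(abs(sum(binom_pmf(j) for j in range(n + 1)) - 1.0) < 1e-12)
```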

Poisson Probability Mass Function
Assume:
1. The probability that an event occurs in a small time interval $\Delta t$ is $\lambda' \Delta t$ as $\Delta t \to 0$.
2. The numbers of events occurring in non-overlapping time intervals are independent.
Then the number of events in a time interval $T$ has the Poisson probability mass function
$f_X(x) = \displaystyle\sum_{k=0}^{\infty} P(X = k)\,\delta(x - k), \quad P(X = k) = \dfrac{\lambda^k}{k!} e^{-\lambda}, \quad k = 0, 1, 2, \ldots$
where $\lambda = \lambda' T$.
Applications: the number of phone calls arriving at a call center per minute; the number of times a web server is accessed per minute.
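Two basic sanity checks on the pmf (it sums to 1, and the mean equals $\lambda$) can be done numerically; a minimal sketch with an arbitrary illustrative $\lambda$:

```python
import math

lam = 4.0  # arbitrary illustrative rate λ = λ'T

def poisson_pmf(k):
    """P(X = k) = λ^k / k! * e^{-λ}."""
    return lam ** k / math.factorial(k) * math.exp(-lam)

# The pmf sums to 1 (truncating at k = 100 leaves a negligible tail).
total = sum(poisson_pmf(k) for k in range(100))
print(abs(total - 1.0) < 1e-12)

# The mean of a Poisson(λ) variable is λ.
mean = sum(k * poisson_pmf(k) for k in range(100))
print(abs(mean - lam) < 1e-9)
```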

Geometric Distribution
Examples:
- How many items must be produced to get one that passes quality control?
- The number of days until it rains.
A sequence of failures until the first success, i.e., a sequence of Bernoulli trials.
Possible values:
- If we count all trials: $1, 2, \ldots$
- If we count only the failures: $0, 1, 2, \ldots$

Derivation of the Probability Mass Function (counting all trials)
Let $X$ be the number of trials needed to get the first success in repeated Bernoulli trials. Consider the sequence FFFFS, which has probability $(1-p)^4 p$; a general sequence looks like FFF...FFS. The probability of having $k-1$ failures before the first success is
$P(k \text{ trials}) = P(X = k) = P(k-1 \text{ failures, then a success}) = (1-p)^{k-1} p, \quad k = 1, 2, \ldots$
For integer $x$, the cumulative distribution function is
$F_X(x) = P(X \le x) = \displaystyle\sum_{k=1}^{x} P(X = k) = 1 - (1-p)^x \implies P(X > x) = (1-p)^x$
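The geometric-series step can be checked by summing the pmf directly; a minimal sketch (the values of $p$ and $x$ are arbitrary illustrative choices):

```python
p, x = 0.3, 5  # arbitrary illustrative values; x is an integer

# P(X = k) = (1-p)^{k-1} p, so summing the pmf up to x gives the CDF,
# and the tail satisfies P(X > x) = (1-p)^x.
cdf = sum((1 - p) ** (k - 1) * p for k in range(1, x + 1))
tail = 1.0 - cdf
print(abs(tail - (1 - p) ** x) < 1e-12)
```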

The Memoryless Property
What happens to the distribution given that $n$ failures have already occurred? Say we have been waiting for an empty cab and have seen 7 occupied ones go by. Formally,
$P(X > n + x \mid X > n) = \dfrac{P(\{X > n+x\} \cap \{X > n\})}{P(X > n)} = \dfrac{P(X > n+x)}{P(X > n)} = \dfrac{(1-p)^{n+x}}{(1-p)^n} = (1-p)^x = P(X > x)$
That is, the probability of exceeding $n + x$ having reached $n$ is the same as the probability of exceeding $x$ starting from the beginning; in other words, there is no aging. Given that the first $n$ trials had no success, the conditional probability that the first success appears after an additional $x$ trials depends only on $x$ and not on $n$ (not on the past).

Negative Binomial Distribution
Let $Y$ be the number of Bernoulli trials required to realize $r$ successes.
$P(Y = k) = P(r-1 \text{ successes in } k-1 \text{ trials, and a success at the } k\text{th trial}) = \binom{k-1}{r-1} p^{r-1} q^{k-r} \cdot p = \binom{k-1}{r-1} p^r q^{k-r}, \quad k = r, r+1, \ldots$
where $q = 1 - p$. If $n$ or fewer trials are needed for $r$ successes, then the number of successes in $n$ trials must be at least $r$:
$P(Y \le n) = P(X \ge r)$
where $Y \sim \mathrm{NB}(r, p)$ is a negative binomial RV and $X$ is a binomial RV with parameters $n$ and $p$.
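The identity $P(Y \le n) = P(X \ge r)$ can be verified by summing both pmfs; a minimal sketch (the values of $r$, $p$, $n$ are arbitrary illustrative choices):

```python
import math

r, p, n = 3, 0.4, 8  # arbitrary illustrative parameters
q = 1 - p

# P(Y <= n): at most n trials are needed for the r-th success.
nb = sum(math.comb(k - 1, r - 1) * p ** r * q ** (k - r)
         for k in range(r, n + 1))

# P(X >= r) for X ~ Binomial(n, p): at least r successes in n trials.
binom = sum(math.comb(n, k) * p ** k * q ** (n - k)
            for k in range(r, n + 1))

print(abs(nb - binom) < 1e-12)  # P(Y <= n) = P(X >= r)
```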

Let $Z = Y - r$ be the number of failures preceding the $r$th success. Then
$P(Z = k) = P(Y = k + r) = \binom{r+k-1}{r-1} p^r q^k = \binom{r+k-1}{k} p^r q^k, \quad k = 0, 1, 2, \ldots$

Uniform Probability Mass Function
$p_i = P(X = x_i) = \dfrac{1}{n}, \quad i = 1, 2, \ldots, n$
$f_X(x) = \displaystyle\sum_{i=1}^{n} \dfrac{1}{n}\,\delta(x - x_i)$
[Figure: pmf consisting of $n$ equal spikes of height $1/n$ at $x_1, x_2, x_3, \ldots, x_n$.]