HKN ECE 313 Exam 2 Review Session


HKN ECE 313 Exam 2 Review Session
Presenters: Corey Snyder, Ahnaf Masroor, Kanad Sarkar

Exam 2 Topics
- Continuous-type random variables (mean and variance of CRVs)
- Uniform distribution
- Exponential distribution
- Poisson process
- Linear scaling of pdfs
- Gaussian distribution
- ML parameter estimation for continuous random variables
- Functions of a random variable
- Failure rate functions
- Binary hypothesis testing
- Joint CDFs, PMFs, and PDFs
- Independence of random variables
- Distributions of sums of random variables*

Continuous-type Random Variables
Cumulative distribution functions (CDFs):
- Must be non-decreasing
- F_X(−∞) = 0, F_X(+∞) = 1
- Must be right-continuous
- F_X(c) = ∫_{−∞}^{c} f_X(u) du
- P{X = a} = 0
- P{a < X ≤ b} = F_X(b) − F_X(a) = ∫_{a}^{b} f_X(u) du
- E[X] = μ_X = ∫_{−∞}^{+∞} u f_X(u) du
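These properties can be sanity-checked numerically; a minimal Python sketch, using an Exponential(2) CDF purely as an illustrative choice:

```python
import math

# CDF of an illustrative Exponential(lam = 2) random variable
def F(c, lam=2.0):
    return 1.0 - math.exp(-lam * c) if c >= 0 else 0.0

# F is 0 at -infinity, 1 at +infinity, and non-decreasing
assert F(-100) == 0.0
assert abs(F(100) - 1.0) < 1e-12
assert F(0.5) <= F(1.5)

# P{a < X <= b} = F(b) - F(a)
a, b = 0.5, 1.5
p = F(b) - F(a)   # = e^{-1} - e^{-3}
assert abs(p - (math.exp(-1) - math.exp(-3))) < 1e-12
```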

Uniform Distribution
Unif(a, b): all values between a and b are equally likely.
- pdf: f(u) = 1/(b − a) for a ≤ u ≤ b, 0 otherwise
- mean: (a + b)/2
- variance: (b − a)²/12
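A quick simulation sketch of the mean and variance formulas (the endpoints a = 2, b = 5 are arbitrary illustrative values):

```python
import random

# Unif(a, b) with illustrative endpoints a = 2, b = 5
a, b = 2.0, 5.0
random.seed(0)
samples = [random.uniform(a, b) for _ in range(200000)]
m = sum(samples) / len(samples)
v = sum((x - m) ** 2 for x in samples) / len(samples)
assert abs(m - (a + b) / 2) < 0.02         # mean (a+b)/2 = 3.5
assert abs(v - (b - a) ** 2 / 12) < 0.05   # variance (b-a)^2/12 = 0.75
```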

Exponential Distribution
Exponential(λ): the limit of scaled geometric random variables.
- pdf: f(t) = λe^{−λt} for t ≥ 0
- mean: 1/λ
- variance: 1/λ²
- Memoryless property: P{T ≥ s + t | T ≥ s} = P{T ≥ t} for s, t ≥ 0
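The memoryless property follows directly from the survival function P{T ≥ x} = e^{−λx}; a small sketch with illustrative values of λ, s, and t:

```python
import math

# Memoryless check with illustrative lam = 1.5, s = 0.7, t = 1.2
lam, s, t = 1.5, 0.7, 1.2
surv = lambda x: math.exp(-lam * x)   # P{T >= x} for T ~ Exponential(lam)
lhs = surv(s + t) / surv(s)           # P{T >= s + t | T >= s}
rhs = surv(t)                         # P{T >= t}
assert abs(lhs - rhs) < 1e-12
```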

Poisson Process
A Poisson process models the number of counts in a time interval. Just as the exponential distribution is a limit of scaled geometric distributions, the Poisson process is the limit of the Bernoulli process. Bonus: the interarrival times of a Poisson process are i.i.d. exponential random variables.
If we have rate λ and time duration t, the number of counts N_t ~ Pois(λt):
- pmf: p_{N_t}(k) = e^{−λt}(λt)^k / k!
- mean: λt
- variance: λt
Furthermore, N_t − N_s ~ Pois(λ(t − s)) for t > s, and counts over disjoint intervals, e.g. [0, 2] and [2, 3], are independent. This property is both important and remarkable; in fact, we frequently shape our analysis of Poisson processes around it. (More on this later!)
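The bonus fact above suggests a way to simulate the process: sum i.i.d. Exponential(λ) interarrival times and count arrivals before time t. A sketch with illustrative λ = 2 and t = 3:

```python
import math, random

# pmf of Pois(mu) at k
def pois_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

# Simulate the process via its i.i.d. Exponential(lam) interarrival times
lam, t = 2.0, 3.0   # illustrative rate and duration
random.seed(1)

def count_by(t):
    n, clock = 0, 0.0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return n
        n += 1

counts = [count_by(t) for _ in range(50000)]
mean_count = sum(counts) / len(counts)
assert abs(mean_count - lam * t) < 0.1          # E[N_t] = lam*t = 6
p5 = counts.count(5) / len(counts)
assert abs(p5 - pois_pmf(5, lam * t)) < 0.01    # empirical P{N_3 = 5} matches the pmf
```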

Linear Scaling of pdfs
If Y = aX + b:
- E[Y] = aE[X] + b
- Var(Y) = a² Var(X)
- f_Y(v) = f_X((v − b)/a) · 1/|a|
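A simulation sketch of the scaling rules; a = −3 is chosen deliberately negative as a reminder that the pdf formula needs 1/|a|, not 1/a:

```python
import random

# Y = aX + b with X ~ N(0, 1); a = -3, b = 2 are illustrative (note a < 0)
a, b = -3.0, 2.0
random.seed(2)
xs = [random.gauss(0, 1) for _ in range(200000)]
ys = [a * x + b for x in xs]
my = sum(ys) / len(ys)
vy = sum((y - my) ** 2 for y in ys) / len(ys)
assert abs(my - b) < 0.05       # E[Y] = a*E[X] + b = 2
assert abs(vy - a * a) < 0.15   # Var(Y) = a^2 * Var(X) = 9
```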

Gaussian Distribution
Gaussian (or normal) distribution: N(μ, σ²). The standard Gaussian X̂ has μ = 0, σ = 1.
- pdf: f(u) = (1/√(2πσ²)) e^{−(u − μ)²/(2σ²)}
- mean: μ
- variance: σ²
If we standardize our Gaussian with X̂ = (X − μ)/σ:
- Φ(c) = ∫_{−∞}^{c} f(u) du
- Q(c) = 1 − Φ(c) = ∫_{c}^{∞} f(u) du
- Φ(−c) = Q(c)
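Φ has no closed form, but it can be computed from the error function; a minimal sketch, with the standardization step shown for illustrative parameters μ = −1, σ = 4:

```python
import math

def Phi(c):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(c / math.sqrt(2)))

def Q(c):
    return 1 - Phi(c)

# Standardizing X ~ N(mu = -1, sigma^2 = 16): P{X <= 3} = Phi((3 - mu)/sigma) = Phi(1)
mu, sigma = -1.0, 4.0
p = Phi((3 - mu) / sigma)
assert abs(p - Phi(1.0)) < 1e-12
assert abs(Phi(-1.3) - Q(1.3)) < 1e-12   # Phi(-c) = Q(c)
```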

ML Parameter Estimation
Suppose we have a random variable whose distribution/pdf depends on a parameter θ. From trials of the random variable, we can estimate θ by finding the value that maximizes the likelihood of the observed event, θ_ML. There are a few ways to find θ_ML:
- Take the derivative of the likelihood (or log-likelihood) with respect to θ and set it equal to zero (maximization)
- Observe the intervals where the likelihood increases and decreases, and find the maximum between these intervals
- Intuition!
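As a concrete instance of the derivative-based approach (an illustrative choice, not the only distribution this works for): for n Exponential(λ) samples, setting d/dλ [n ln λ − λ Σtᵢ] = 0 gives λ_ML = n / Σtᵢ.

```python
import random

# ML estimate of lam from Exponential(lam) samples: lam_ML = n / sum(t_i)
random.seed(3)
true_lam = 2.5   # illustrative ground truth
data = [random.expovariate(true_lam) for _ in range(100000)]
lam_ml = len(data) / sum(data)
assert abs(lam_ml - true_lam) < 0.1   # the estimate recovers the true rate
```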

Functions of Random Variables
Suppose Y = g(X) and we want to describe the distribution of Y.
Step 1: Identify the support of X. Sketch the pdf of X and the function g. Identify the support of Y. Determine whether Y is a continuous or discrete RV. Take a deep breath! You've done some important work here.
Step 2 (for a CRV): Use the definition of the CDF to find the CDF of Y: F_Y(c) = P{Y ≤ c} = P{g(X) ≤ c}
Step 2 (for a DRV): Find the pmf of Y directly from the definition: p_Y(v) = P{Y = v} = P{g(X) = v}
Step 3 (for a CRV): Differentiate the CDF of Y to find the pdf of Y.
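A worked sketch of these steps for the illustrative choice Y = X² with X ~ Unif(0, 1), with a simulation confirming the derived CDF:

```python
import math, random

# Y = g(X) = X^2 with X ~ Unif(0, 1); support of Y is [0, 1], Y is a CRV
# Step 2: F_Y(c) = P{X^2 <= c} = P{X <= sqrt(c)} = sqrt(c) for 0 <= c <= 1
# Step 3: f_Y(c) = d/dc sqrt(c) = 1 / (2*sqrt(c))
random.seed(4)
ys = [random.random() ** 2 for _ in range(200000)]
c = 0.25
emp = sum(1 for y in ys if y <= c) / len(ys)
assert abs(emp - math.sqrt(c)) < 0.01   # F_Y(0.25) = 0.5
```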

Generating a RV with a Specified Distribution
We can generate any distribution by applying a function to a uniform random variable. This function should be the inverse of the CDF of the desired distribution. Ex: if we want an exponential distribution, set F_X(c) = 1 − e^{−c} = u and solve for c = F_X^{−1}(u).
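A sketch of this inverse-CDF method for Exponential(λ), where solving 1 − e^{−λc} = u gives F⁻¹(u) = −ln(1 − u)/λ:

```python
import math, random

# For Exponential(lam): F(c) = 1 - e^{-lam*c}, so F^{-1}(u) = -ln(1 - u)/lam
lam = 1.0
random.seed(5)
xs = [-math.log(1 - random.random()) / lam for _ in range(200000)]
m = sum(xs) / len(xs)
assert abs(m - 1 / lam) < 0.02   # Exponential(1) has mean 1
```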

Failure Rate Functions
We can assess the probability of a failure in a system through a failure rate function h(t):
F(t) = 1 − e^{−∫_0^t h(s) ds}
Two popular failure rate functions:
- Consistent lifetime
- "Bathtub"
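A small numeric sketch of the CDF formula, using the illustrative (hypothetical) increasing failure rate h(s) = 2s, for which ∫₀ᵗ h(s) ds = t²:

```python
import math

# Illustrative increasing failure rate h(s) = 2s, so the integral is t^2
# and F(t) = 1 - e^{-t^2}
t = 1.5
n = 200000
ds = t / n
integral = sum(2 * (i + 0.5) * ds * ds for i in range(n))   # midpoint rule for ∫_0^t 2s ds
F_t = 1 - math.exp(-integral)
assert abs(F_t - (1 - math.exp(-t ** 2))) < 1e-6
```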

Binary Hypothesis Testing
Similar to BHT with discrete RVs.
Maximum likelihood (ML) rule:
- Λ(k) = f₁(k)/f₀(k)
- If Λ(k) > 1, declare H₁ is true; if Λ(k) < 1, declare H₀ is true.
Maximum a posteriori (MAP) rule:
- Prior probabilities: π₀ = P(H₀), π₁ = P(H₁)
- Declare H₁ true if π₁f₁(k) > π₀f₀(k); equivalently Λ(k) = f₁(k)/f₀(k) > τ, where τ = π₀/π₁.
Probabilities of false alarm and miss:
- p_false alarm = P(say H₁ | H₀ is true)
- p_miss = P(say H₀ | H₁ is true)
- p_e = π₁ p_miss + π₀ p_false alarm
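A sketch of the ML and MAP rules for a hypothetical pair of models, f₀ = Exponential(1) pdf and f₁ = Exponential(2) pdf (the priors π₀ = 0.6, π₁ = 0.4 are also illustrative):

```python
import math

# Hypothetical models: f0 = Exponential(1) pdf, f1 = Exponential(2) pdf
f0 = lambda k: math.exp(-k)
f1 = lambda k: 2 * math.exp(-2 * k)
Lambda = lambda k: f1(k) / f0(k)   # likelihood ratio, here 2*e^{-k}

# ML rule: declare H1 iff Lambda(k) > 1, i.e. k < ln(2)
assert Lambda(0.5) > 1 and Lambda(1.0) < 1

# MAP rule with priors pi0 = 0.6, pi1 = 0.4: threshold tau = pi0/pi1
pi0, pi1 = 0.6, 0.4
tau = pi0 / pi1   # = 1.5; declare H1 iff Lambda(k) > tau, i.e. k < ln(2/1.5)
assert Lambda(0.2) > tau and Lambda(0.5) < tau
```

Note how the MAP threshold τ > 1 here shrinks the region where H₁ is declared, reflecting the larger prior on H₀.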

Joint CDFs, PMFs, and PDFs
Discrete random variables:
- Joint CDF: F_{X,Y}(u₀, v₀) = P{X ≤ u₀, Y ≤ v₀}
- Joint PMF: p_{X,Y}(u₀, v₀) = P{X = u₀, Y = v₀}
- Marginal PMFs: p_X(u) = Σ_j p_{X,Y}(u, v_j), p_Y(v) = Σ_i p_{X,Y}(u_i, v)
- Conditional PMFs: p_{Y|X}(v | u₀) = P{Y = v | X = u₀} = p_{X,Y}(u₀, v) / p_X(u₀)
Continuous random variables:
- Joint CDF: F_{X,Y}(u₀, v₀) = P{X ≤ u₀, Y ≤ v₀}
- Joint PDF: F_{X,Y}(u₀, v₀) = ∫_{−∞}^{u₀} ∫_{−∞}^{v₀} f_{X,Y}(u, v) dv du
- Marginal PDFs: f_X(u) = ∫_{−∞}^{+∞} f_{X,Y}(u, v) dv, f_Y(v) = ∫_{−∞}^{+∞} f_{X,Y}(u, v) du
- Conditional PDFs: f_{Y|X}(v | u₀) = f_{X,Y}(u₀, v) / f_X(u₀)
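A sketch of computing a marginal pdf by integrating out one variable, for the standard illustrative joint pdf f_{X,Y}(u, v) = u + v on the unit square:

```python
# Illustrative joint pdf f_{X,Y}(u, v) = u + v on the unit square
f = lambda u, v: u + v

# Marginal f_X(u) = ∫_0^1 (u + v) dv = u + 1/2, computed by the midpoint rule
def marginal_X(u, n=100000):
    dv = 1.0 / n
    return sum(f(u, (j + 0.5) * dv) * dv for j in range(n))

assert abs(marginal_X(0.3) - 0.8) < 1e-6   # matches u + 1/2 at u = 0.3
```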

Independence of Joint Distributions
We can check independence in a joint distribution in a couple of ways:
- The joint pdf factors: f_{X,Y}(u, v) = f_X(u) f_Y(v)
- The support of f_{X,Y}(u, v) is a product set. A product set has the swap property: if (a, b) and (c, d) ∈ support(f_{X,Y}), then (a, d) and (c, b) are also ∈ support(f_{X,Y}).
Note: a product-set support is necessary but not sufficient for independence. If the support is not a product set, X and Y are dependent; if it is a product set, you must still check that the joint pdf factors.
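A sketch of both checks on the illustrative joint pdf f_{X,Y}(u, v) = 4uv on the unit square, which passes both and hence describes independent X and Y:

```python
# f_{X,Y}(u, v) = 4uv on the unit square: the support is a product set and the
# pdf factors as f_X(u)*f_Y(v) = (2u)(2v), so X and Y are independent.
# (By contrast, a triangular support like x + y <= 1 fails the swap property,
# so any pdf on it describes dependent variables.)
fXY = lambda u, v: 4 * u * v
fX = lambda u: 2 * u
fY = lambda v: 2 * v
for u, v in [(0.2, 0.7), (0.5, 0.5), (0.9, 0.1)]:
    assert abs(fXY(u, v) - fX(u) * fY(v)) < 1e-12
```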

Distribution of Sums of Random Variables
Suppose we want to find the distribution of the sum of two independent random variables, Z = X + Y. The pdf or pmf of Z is the convolution of the two pdfs/pmfs. So...

What Is Convolution? A beautiful, extraordinary linear operator that describes natural phenomena in a fundamental and concise manner.

What Is Convolution? A beautiful, extraordinary linear operator that describes natural phenomena in a fundamental and concise manner. But for real, given Z = X + Y:
Discrete (pmfs): p_Z(u) = p_X(u) ∗ p_Y(u) = Σ_{k=−∞}^{∞} p_X(k) p_Y(u − k) = Σ_{k=−∞}^{∞} p_X(u − k) p_Y(k)
Continuous (pdfs): f_Z(u) = f_X(u) ∗ f_Y(u) = ∫_{−∞}^{∞} f_X(τ) f_Y(u − τ) dτ = ∫_{−∞}^{∞} f_X(u − τ) f_Y(τ) dτ
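A sketch of the discrete convolution for the classic example of two independent fair dice:

```python
# Z = X + Y for two independent fair dice: p_Z = p_X * p_Y (discrete convolution)
p = {k: 1 / 6 for k in range(1, 7)}   # pmf of one fair die

p_z = {}
for i in p:
    for j in p:
        p_z[i + j] = p_z.get(i + j, 0) + p[i] * p[j]

assert abs(p_z[7] - 6 / 36) < 1e-12         # the convolution peaks at Z = 7
assert abs(sum(p_z.values()) - 1) < 1e-12   # p_Z is a valid pmf
```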

Fa15 Problem 2
Let N_t be a Poisson process with rate λ > 0.
a. Obtain P{N_3 = 5}
b. Obtain P{N_7 − N_4 = 5} and E[N_7 − N_4]
c. Obtain P{N_7 − N_4 = 5 | N_6 − N_4 = 2}
d. Obtain P{N_6 − N_4 = 2 | N_7 − N_4 = 5}

FA14 Problem 3

Fa15 Problem 4
Let the joint pdf for the pair (X, Y) be:
f_{X,Y}(x, y) = cxy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, x + y ≤ 1; 0 otherwise
a. Compute the marginal f_X(x). You can leave it in terms of c.
b. Obtain the value of the constant c for f_{X,Y} to be a valid joint pdf.
c. Obtain P{X + Y < 1/2}
d. Are X and Y independent? Explain why or why not.

Su16 Problem 3
Let X be Gaussian with mean −1 and variance 16.
a. Express P{X³ ≤ −8} in terms of the Φ function
b. Let Y = (1/2)X + 3/2. Sketch the pdf of Y, f_Y(v), clearly marking important points.
c. Express P{Y ≥ 1/2} in terms of the Φ function

SP15 Problem 4

Fa15 Problem 3
Let X be a continuous-type random variable taking values in [0, 1]. Under hypothesis H₀ the pdf of X is f₀; under hypothesis H₁ the pdf of X is f₁. Both pdfs are plotted below. The priors are known to be π₀ = 0.6 and π₁ = 0.4.
a. Find the value of c.
b. Specify the maximum a posteriori (MAP) decision rule for testing H₀ vs. H₁.
c. Find the error probabilities p_false alarm, p_missed detection, and the average probability of error p_e for the MAP rule.

FA12 problem 6