Discrete Random Variables and Probability Distributions

Random Variables
Random Variable (RV): A numeric outcome that results from an experiment.
For each element of an experiment's sample space, the random variable takes on exactly one value.
Discrete Random Variable: An RV that can take on only a finite or countably infinite set of outcomes.
Continuous Random Variable: An RV that can take on any value along a continuum (although it may be reported "discretely").
Random variables are denoted by upper-case letters (Y); individual outcomes of an RV are denoted by lower-case letters (y).

Probability Distributions
Probability Distribution: A table, graph, or formula that describes the values a random variable can take on and the corresponding probability (discrete RV) or density (continuous RV).
Discrete Probability Distribution: Assigns probabilities (masses) to the individual outcomes.
Continuous Probability Distribution: Assigns a density at individual points; probabilities of ranges are obtained by integrating the density function.
Discrete probabilities are denoted by p(y) = P(Y = y); continuous densities are denoted by f(y).
Cumulative Distribution Function: F(y) = P(Y ≤ y)

Discrete Probability Distributions

Example – Rolling 2 Dice (Red/Green)
Y = sum of the up faces of the two dice. The table gives the value of y for every element of S.

Red\Green    1    2    3    4    5    6
    1        2    3    4    5    6    7
    2        3    4    5    6    7    8
    3        4    5    6    7    8    9
    4        5    6    7    8    9   10
    5        6    7    8    9   10   11
    6        7    8    9   10   11   12

Rolling 2 Dice – Probability Mass Function & CDF

  y    p(y)    F(y)
  2    1/36    1/36
  3    2/36    3/36
  4    3/36    6/36
  5    4/36   10/36
  6    5/36   15/36
  7    6/36   21/36
  8    5/36   26/36
  9    4/36   30/36
 10    3/36   33/36
 11    2/36   35/36
 12    1/36   36/36
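As a quick check, here is a minimal Python sketch (illustrative only, not from the original slides) that reproduces the table by enumerating the 36 equally likely outcomes:

    from fractions import Fraction
    from itertools import product

    # Tally the sum of the up faces over the 36 equally likely (red, green) outcomes.
    counts = {}
    for red, green in product(range(1, 7), repeat=2):
        counts[red + green] = counts.get(red + green, 0) + 1

    cumulative = 0
    for y in sorted(counts):
        cumulative += counts[y]
        p = Fraction(counts[y], 36)   # p(y) = P(Y = y)
        F = Fraction(cumulative, 36)  # F(y) = P(Y <= y)
        print(y, p, F)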

Rolling 2 Dice – Probability Mass Function

Rolling 2 Dice – Cumulative Distribution Function

Expected Values of Discrete RV's
Mean (aka Expected Value) – the long-run average value an RV (or function of an RV) will take on.
Variance – the average squared deviation between a realization of an RV (or function of an RV) and its mean.
Standard Deviation – the positive square root of the variance (in the same units as the data).
Notation:  Mean: E(Y) = μ   Variance: V(Y) = σ²   Standard Deviation: σ

Expected Values of Discrete RV’s
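For a discrete RV with probability function p(y), the standard definitions are: μ = E(Y) = Σ y·p(y); σ² = V(Y) = E[(Y − μ)²] = E(Y²) − μ²; σ = √V(Y). More generally, the expectation of a function g(Y) is E[g(Y)] = Σ g(y)·p(y).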

Expected Values of Linear Functions of Discrete RV’s
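For constants a and b, the standard linear-function results are E(aY + b) = a·E(Y) + b and V(aY + b) = a²·V(Y), so the standard deviation of aY + b is |a|σ.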

Example – Rolling 2 Dice

  y    p(y)     y·p(y)    y²·p(y)
  2    1/36       2/36       4/36
  3    2/36       6/36      18/36
  4    3/36      12/36      48/36
  5    4/36      20/36     100/36
  6    5/36      30/36     180/36
  7    6/36      42/36     294/36
  8    5/36      40/36     320/36
  9    4/36      36/36     324/36
 10    3/36      30/36     300/36
 11    2/36      22/36     242/36
 12    1/36      12/36     144/36
Sum  36/36=1.00  252/36=7.00  1974/36=54.833
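From the column sums, E(Y) = 7 and E(Y²) = 54.833, so V(Y) = 54.833 − 7² ≈ 5.833 and σ ≈ 2.415. A minimal Python sketch of the same computation (illustrative only):

    from fractions import Fraction

    # p(y) for the sum of two fair dice: p(y) = (6 - |y - 7|)/36, y = 2, ..., 12.
    pmf = {y: Fraction(6 - abs(y - 7), 36) for y in range(2, 13)}

    mean = sum(y * p for y, p in pmf.items())               # E(Y) = 7
    second_moment = sum(y**2 * p for y, p in pmf.items())   # E(Y^2) = 1974/36
    variance = second_moment - mean**2                      # V(Y) = 35/6, about 5.833
    print(mean, second_moment, float(variance))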

Tchebysheff's Theorem/Empirical Rule
Tchebysheff: Suppose Y is any random variable with mean μ and standard deviation σ. Then:
P(μ − kσ ≤ Y ≤ μ + kσ) ≥ 1 − (1/k²) for k ≥ 1
k = 1: P(μ − σ ≤ Y ≤ μ + σ) ≥ 1 − (1/1²) = 0 (trivial result)
k = 2: P(μ − 2σ ≤ Y ≤ μ + 2σ) ≥ 1 − (1/2²) = 3/4
k = 3: P(μ − 3σ ≤ Y ≤ μ + 3σ) ≥ 1 − (1/3²) = 8/9
Note that this is a very conservative bound, but it works for any distribution.
Empirical Rule (mound-shaped distributions):
k = 1: P(μ − σ ≤ Y ≤ μ + σ) ≈ 0.68
k = 2: P(μ − 2σ ≤ Y ≤ μ + 2σ) ≈ 0.95
k = 3: P(μ − 3σ ≤ Y ≤ μ + 3σ) ≈ 1
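A quick numeric check on the two-dice example (illustrative sketch; μ = 7, σ² = 35/6):

    from fractions import Fraction
    from math import sqrt

    pmf = {y: Fraction(6 - abs(y - 7), 36) for y in range(2, 13)}
    mu = sum(y * p for y, p in pmf.items())
    sigma = sqrt(float(sum((y - mu) ** 2 * p for y, p in pmf.items())))

    k = 2
    within = sum(p for y, p in pmf.items() if mu - k * sigma <= y <= mu + k * sigma)
    # About 0.944 >= 0.75 (Tchebysheff bound), and close to the Empirical Rule's 0.95.
    print(float(within), ">=", 1 - 1 / k**2)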

Proof of Tchebysheff’s Theorem
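A sketch of the standard argument: σ² = E[(Y − μ)²] = Σ (y − μ)² p(y) ≥ Σ over {y : |y − μ| ≥ kσ} of (y − μ)² p(y) ≥ k²σ² · P(|Y − μ| ≥ kσ). Dividing both sides by k²σ² gives P(|Y − μ| ≥ kσ) ≤ 1/k², which is equivalent to P(μ − kσ ≤ Y ≤ μ + kσ) ≥ 1 − 1/k².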

Moment Generating Functions (I)
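The standard definition: M(t) = E(e^(tY)), which for a discrete RV is Σ e^(ty) p(y), provided the sum is finite for t in a neighborhood of 0. Differentiating k times and evaluating at t = 0 gives the k-th non-central moment: M^(k)(0) = E(Y^k).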

Moment Generating Functions (II)
M(t) is called the moment-generating function for Y and can be used to derive any non-central moment of the random variable (assuming it exists in a neighborhood around t = 0). It is also useful in determining the distributions of functions of random variables.
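A minimal sketch using the sympy library (an assumption here, not part of the slides) that recovers E(Y) and V(Y) for the two-dice sum from its MGF:

    import sympy as sp

    t = sp.symbols('t')
    # MGF of the sum of two fair dice: M(t) = sum over y of p(y) * exp(t*y)
    pmf = {y: sp.Rational(6 - abs(y - 7), 36) for y in range(2, 13)}
    M = sum(p * sp.exp(t * y) for y, p in pmf.items())

    EY = sp.diff(M, t).subs(t, 0)       # E(Y)   = M'(0)  -> 7
    EY2 = sp.diff(M, t, 2).subs(t, 0)   # E(Y^2) = M''(0) -> 1974/36 = 329/6
    print(EY, EY2 - EY**2)              # 7, 35/6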

Probability Generating Functions
P(t) is the probability generating function for Y.
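The standard definition for a non-negative integer-valued RV: P(t) = E(t^Y) = Σ t^y p(y). The probability p(y) is the coefficient of t^y (equivalently P^(y)(0)/y!), and P'(1) = E(Y).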

Discrete Uniform Distribution
Suppose Y can take on any integer value between a and b inclusive, each equally likely (e.g., rolling a fair die, where a = 1 and b = 6). Then Y follows the discrete uniform distribution.
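Standard results: p(y) = 1/(b − a + 1) for y = a, a + 1, …, b; E(Y) = (a + b)/2; V(Y) = [(b − a + 1)² − 1]/12. For a fair die, E(Y) = 3.5 and V(Y) = 35/12.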

Bernoulli Distribution
An experiment consists of one trial. It can result in one of two outcomes: Success or Failure (or a characteristic being Present or Absent).
Probability of Success is p (0 < p < 1).
Y = 1 if Success (characteristic present), 0 if not.
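Standard results: p(y) = p^y (1 − p)^(1−y) for y = 0, 1; E(Y) = p; V(Y) = p(1 − p); M(t) = (1 − p) + pe^t.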

Binomial Experiment
An experiment consists of a series of n identical trials.
Each trial can end in one of two outcomes: Success or Failure.
Trials are independent (the outcome of one has no bearing on the outcomes of the others).
The probability of Success, p, is constant for all trials.
The random variable Y, the number of Successes in the n trials, is said to follow a Binomial distribution with parameters n and p.
Y can take on the values y = 0, 1, …, n.
Notation: Y ~ Bin(n, p)

Binomial Distribution
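The standard pmf is p(y) = C(n, y) p^y (1 − p)^(n−y) for y = 0, 1, …, n. A minimal Python sketch (the function name is illustrative, not from the slides):

    from math import comb

    def binom_pmf(y, n, p):
        """P(Y = y) for Y ~ Bin(n, p): C(n, y) * p**y * (1 - p)**(n - y)."""
        return comb(n, y) * p**y * (1 - p)**(n - y)

    # Example: 3 successes in 10 independent trials with success probability 0.4.
    print(binom_pmf(3, 10, 0.4))                          # about 0.215
    print(sum(binom_pmf(y, 10, 0.4) for y in range(11)))  # pmf sums to 1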

Binomial Distribution – Expected Value

Binomial Distribution – Variance and S.D.

Binomial Distribution – MGF & PGF
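Standard results for Y ~ Bin(n, p): E(Y) = np; V(Y) = np(1 − p); σ = √[np(1 − p)]; MGF M(t) = (1 − p + pe^t)^n; PGF P(t) = (1 − p + pt)^n.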

Geometric Distribution
Used to model the number of Bernoulli trials needed until the first Success occurs (P(S) = p).
First Success on Trial 1: S, so y = 1 and p(1) = p
First Success on Trial 2: FS, so y = 2 and p(2) = (1 − p)p
First Success on Trial k: F…FS, so y = k and p(k) = (1 − p)^(k−1) p

Geometric Distribution - Expectations
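Standard results for the geometric distribution (number of trials to the first Success): E(Y) = 1/p and V(Y) = (1 − p)/p².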

Geometric Distribution – MGF & PGF
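Standard results: M(t) = pe^t / [1 − (1 − p)e^t] for (1 − p)e^t < 1, and P(t) = pt / [1 − (1 − p)t].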

Negative Binomial Distribution
Used to model the number of trials needed until the r-th Success occurs (an extension of the Geometric distribution).
Based on there being r − 1 Successes in the first y − 1 trials, followed by a Success on trial y.
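This reasoning gives the standard pmf p(y) = C(y − 1, r − 1) p^r (1 − p)^(y−r) for y = r, r + 1, …, with E(Y) = r/p and V(Y) = r(1 − p)/p².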

Poisson Distribution
Distribution often used to model the number of incidences of some characteristic in time or space:
Arrivals of customers in a queue
Number of flaws in a roll of fabric
Number of typos per page of text
Distribution obtained as follows:
Break the "area" down into many small "pieces" (n pieces)
Each "piece" can have only 0 or 1 occurrences (p = P(1))
Let λ = np ≡ the average number of occurrences over the "area"
Y ≡ the number of occurrences in the "area", i.e., the sum of the 0s and 1s over the "pieces"
Y ~ Bin(n, p) with p = λ/n
Take the limit of the Binomial distribution as n → ∞ with p = λ/n

Poisson Distribution - Derivation
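A sketch of the standard limit: P(Y = y) = C(n, y)(λ/n)^y (1 − λ/n)^(n−y) = [n(n − 1)…(n − y + 1)/n^y] · (λ^y/y!) · (1 − λ/n)^n · (1 − λ/n)^(−y). As n → ∞, the first and last factors tend to 1 and (1 − λ/n)^n → e^(−λ), giving p(y) = λ^y e^(−λ)/y! for y = 0, 1, 2, …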

Poisson Distribution - Expectations
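Standard results for Y ~ Poisson(λ): E(Y) = V(Y) = λ.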

Poisson Distribution – MGF & PGF
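Standard results: M(t) = exp[λ(e^t − 1)] and P(t) = exp[λ(t − 1)].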

Hypergeometric Distribution
Finite-population generalization of the Binomial distribution.
Population: N elements, of which k are Successes (elements with the characteristic of interest).
Sample: n elements drawn without replacement.
Y = number of Successes in the sample (y = 0, 1, …, min(n, k)).
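Standard results: p(y) = C(k, y) C(N − k, n − y) / C(N, n); E(Y) = nk/N; V(Y) = n (k/N)(1 − k/N)(N − n)/(N − 1).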